I want to save some large JSON to Datastore, where len(json) >= 80000000 (80 MB), but I am getting: ServiceUnavailable: 503 413:Request Entity Too Large
I could potentially save it in Cloud Storage instead, but I assume I would then lose Datastore's indexing and faster querying. What's the best solution here?
from google.cloud import datastore

def save_serialized_data_to_db(json, name):
    datastore_client = datastore.Client()
    kind = 'SerializedData'
    serialized_data_key = datastore_client.key(kind, name)
    serialized_data = datastore.Entity(key=serialized_data_key)
    serialized_data['json'] = json
    datastore_client.put(serialized_data)  # raises: ServiceUnavailable: 503 413:Request Entity Too Large
    return serialized_data
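For context on why the call above fails: a single Datastore entity is capped at roughly 1 MiB, so an 80 MB string can never fit in one `put()` regardless of request settings. One common workaround is to split the serialized string into chunks below that limit and store each chunk as its own entity. This is only a sketch under assumptions: the kind name `SerializedDataChunk`, the `order` property, and the chunk size are all made up here, and the split is by character count (encode to bytes and split those if your JSON contains much non-ASCII text):

```python
CHUNK_SIZE = 900_000  # stay safely under Datastore's ~1 MiB entity limit

def split_into_chunks(serialized, chunk_size=CHUNK_SIZE):
    """Split a serialized JSON string into fixed-size pieces."""
    return [serialized[i:i + chunk_size]
            for i in range(0, len(serialized), chunk_size)]

def save_chunked(serialized, name):
    # Lazy import so the pure helper above works without the library installed.
    from google.cloud import datastore
    client = datastore.Client()
    entities = []
    for i, chunk in enumerate(split_into_chunks(serialized)):
        key = client.key('SerializedDataChunk', f'{name}-{i}')
        # Large string properties must be excluded from indexes
        # (indexed string properties are limited to 1500 bytes).
        entity = datastore.Entity(key=key, exclude_from_indexes=('json',))
        entity['json'] = chunk
        entity['order'] = i
        entities.append(entity)
    client.put_multi(entities)  # batched write (max 500 entities per call)
    return len(entities)
```

Note that 80 MB at 900 KB per chunk is about 90 entities, so it fits in a single `put_multi` batch; reads would have to fetch all chunks for a `name` and concatenate them in `order`.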
In my company we perform a special process. We are also looking at using MongoDB, but I don't have feedback for you on that yet.
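The Cloud Storage route raised in the question doesn't have to mean giving up Datastore queries entirely: a common hybrid is to upload the large payload to Cloud Storage (which has no per-object limit at this scale) and keep a small, indexable Datastore entity that points at it. A sketch, assuming the `google-cloud-storage` and `google-cloud-datastore` client libraries; the kind and property names (`gcs_uri`, `size_bytes`) are invented for illustration:

```python
import json

def gcs_uri(bucket_name, blob_name):
    """Build the gs:// URI for a stored blob."""
    return f"gs://{bucket_name}/{blob_name}"

def save_json_blob_with_index(data, name, bucket_name):
    # Lazy import so the pure helper above works without the libraries installed.
    from google.cloud import datastore, storage

    serialized = json.dumps(data)

    # 1. Upload the large JSON to Cloud Storage.
    storage_client = storage.Client()
    blob = storage_client.bucket(bucket_name).blob(f'{name}.json')
    blob.upload_from_string(serialized, content_type='application/json')

    # 2. Keep a tiny Datastore entity with queryable metadata pointing at it.
    ds_client = datastore.Client()
    entity = datastore.Entity(key=ds_client.key('SerializedData', name))
    entity['gcs_uri'] = gcs_uri(bucket_name, f'{name}.json')
    entity['size_bytes'] = len(serialized.encode('utf-8'))
    ds_client.put(entity)
    return entity['gcs_uri']
```

You keep Datastore's indexing and fast querying over the small metadata fields, and only pay a Cloud Storage fetch when you actually need the full document.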