Tags: python, google-app-engine, csv, blobstore

GAE sending CSV file through POST gives error 500


I'm trying to make an endpoint for my GAE application where I can programmatically send a POST request with a CSV file (as a form) from a client, and the server will receive it and store it in the Datastore. However, when I send a large file it times out. It seems that the maximum URL fetch timeout is 59.9 seconds, and that's where it fails.

import csv

import webapp2
from google.appengine.ext import ndb


class CostTest(ndb.Model):
    pickUp = ndb.StringProperty()
    amount = ndb.StringProperty()


class CsvFileLoader(webapp2.RequestHandler):
    def post(self):
        self.response.write("part 1")
        print self.request.get('type')

        check_values = self.request.POST.getall('file')
        array = list(csv.reader(check_values))
        for c in array:
            pickup, amount = c
            entry = CostTest(pickUp=pickup,
                             amount=amount)
            entry.put()  # one synchronous datastore write per row

            # print c
        self.response.write("part 2")
        self.response.write(self.request.get('file'))


app = webapp2.WSGIApplication([
    ('/csv/order', CsvFileLoader),
    ('/csv/kiosk', CsvFileLoader),
], debug=True)

The file I'm using to test has ~4,600 rows, but the handler can't load them all before timing out. I tried using the Blobstore, but I can't figure out how to get a file into it from a POST request (if that's even possible). It seems like the only way to write to the Blobstore is through a user-submitted form!

Thanks in advance for the help.


Solution

  • The problem isn't that the file can't be "loaded"; it's that you're doing ~4,600 synchronous puts, one datastore round trip per row. If you build the entities in memory and write them in batches with ndb.put_multi, the request should finish well within the deadline. As for the Blobstore, a Python script can replicate a submitted form by POSTing the file to an upload URL, so what problem are you actually hitting there? Rough sketches of both ideas follow.
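A minimal sketch of the batching idea, reusing the CostTest model from the question; the store_rows helper and the batch size of 500 are illustrative choices, not part of the original code:

import csv

from google.appengine.ext import ndb


class CostTest(ndb.Model):
    pickUp = ndb.StringProperty()
    amount = ndb.StringProperty()


def store_rows(csv_rows, batch_size=500):
    """Write parsed CSV rows to the datastore in batches instead of one put() per row."""
    batch = []
    for pickup, amount in csv_rows:
        batch.append(CostTest(pickUp=pickup, amount=amount))
        if len(batch) >= batch_size:
            ndb.put_multi(batch)  # one RPC for the whole batch
            batch = []
    if batch:
        ndb.put_multi(batch)  # flush the remainder

In the handler you would call something like store_rows(csv.reader(uploaded_file)) instead of the per-row entry.put() loop.

For the Blobstore side, a script can do what a browser form does: fetch an upload URL from your app and POST the CSV to it as multipart/form-data. A rough sketch using the third-party requests library; the /csv/upload_url endpoint (which would return blobstore.create_upload_url('/csv/upload') on the server) and the host name are made up for illustration:

import requests  # third-party HTTP client for the uploading script

# Ask the app for a fresh Blobstore upload URL (an endpoint you expose yourself).
upload_url = requests.get('https://your-app.appspot.com/csv/upload_url').text

# POST the file exactly as a browser-submitted form would.
with open('costs.csv', 'rb') as f:
    resp = requests.post(upload_url, files={'file': f})
print resp.status_code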