
Google App Maker saving records to DB is taking 240ms per record


We have a Google App Maker app using Google Cloud SQL tables. Our location is Prague, Central Europe, and saving one record takes an incredible 240 ms at minimum.

It looks like Google App Maker has a proxy close to my location (latency ~20 ms). However, our instance is set up in us-central1 (which Google recommends for the best Google App Maker performance) and shows a ~120 ms ping.

We have tried many things, including moving the instance to Frankfurt, Germany (~25 ms ping). In that case it took even longer (500 ms per record). Our queries seemed to be round-tripping this complicated way: our location (proxy) → US (Google App Maker master) → Frankfurt (SQL instance)

We are quite desperate, as we cannot afford to migrate away from Google App Maker at the moment. And I strongly believe there is a solution, either now or in the future.

My resources:

  • Instance: db-f1-micro (1 vCPU, 614.4 MB memory, 10 GB storage)
  • Database version: MySQL 2nd Gen 5.7
  • Instance zone: us-central1-a
  • Connectivity: Public IP
  • My location: Prague, Czech Republic

function serverSaveData() {
  // each entry holds the values for one record's two fields
  var recordData = [];
  recordData[0] = ['NewField_String', 'NewField1_String'];
  recordData[1] = ['NewField_String', 'NewField1_String'];

  for (var i = 0; i < recordData.length; i++) {
    var newProduct = app.models.testDB.newRecord();
    newProduct.NewField = recordData[i][0];
    newProduct.NewField1 = recordData[i][1];
    var start = Date.now();
    app.saveRecords([newProduct]);
    console.log('Product saved in ' + (Date.now() - start) + ' ms');
  }
}

We need the speed to be at most ~25 ms per record (one tenth of the current speed). We have 10,000 products, and the import should not take more than a few minutes.
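For scale, assuming the measured 240 ms per record holds throughout, the back-of-the-envelope arithmetic looks like this:

```javascript
// Rough import-time estimates (assumes a constant per-record cost).
var records = 10000;

var currentMs = 240;                                  // measured per-record time
var currentMinutes = records * currentMs / 1000 / 60; // 40 minutes total

var targetMs = 25;                                    // desired per-record time
var targetMinutes = records * targetMs / 1000 / 60;   // ~4.2 minutes total

console.log(currentMinutes + ' min now vs ~' + targetMinutes.toFixed(1) + ' min at target');
```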

Does anyone see a solution, or is there a way to save records to the database in bulk? Thank you so much!


Solution

  • App Maker has its ups and downs, and you just need to find workarounds for the downs. I recently dealt with importing a huge database of over half a million records, and it was quite a challenge.

    In your case, I'd recommend saving items in batches to speed up the process. Batches of 2,000 records should be enough.

    You can do something like this:

    function saveData() {
    
        // get the data
        var data = getData();
    
        var totalRecords = data.length;
        var batchLimit = 2000;
        var totalRounds = Math.ceil(totalRecords / batchLimit);
        var round = 1;
        var roundLimit = batchLimit;
        var currentItem = 0;
    
        do {
            var recordsToSave = [];
            // cap the index at totalRecords so the last, partial batch
            // doesn't read past the end of the data
            for (var i = currentItem; i < Math.min(roundLimit, totalRecords); i++) {
                var recordData = data[i];
                var newProduct = app.models.testDB.newRecord();
                newProduct.newField = recordData.newFieldData;
                newProduct.newField2 = recordData.newField2Data;
                recordsToSave.push(newProduct);
            }
            if (recordsToSave.length) {
                app.saveRecords(recordsToSave);
            }
            currentItem += batchLimit;
            roundLimit += batchLimit;
            round++;
    
        } while (round <= totalRounds);
    
    }
    
    

    I used a solution similar to the one above to complete an import of 680,000+ records into App Maker. It might need a bit more tweaking, but you get the idea. As for the connection latency, sorry, I can't help there. Hopefully a Google App Maker engineer can pitch in, but as far as I can see, this is your best shot.
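For what it's worth, the same batching idea can be sketched more compactly with Array.prototype.slice. The `app` object below is a local stub standing in for App Maker's real server API, just so the batching logic can be run and checked outside App Maker:

```javascript
// Local stub of the App Maker server API (assumption: real code would use
// App Maker's global `app` object instead of this stub).
var savedBatchSizes = [];
var app = {
  models: { testDB: { newRecord: function() { return {}; } } },
  saveRecords: function(records) { savedBatchSizes.push(records.length); }
};

// Slice-based batching: walk the data in steps of batchLimit and save
// each chunk with a single app.saveRecords call.
function saveDataSliced(data) {
  var batchLimit = 2000;
  for (var start = 0; start < data.length; start += batchLimit) {
    var recordsToSave = data.slice(start, start + batchLimit).map(function(recordData) {
      var newProduct = app.models.testDB.newRecord();
      newProduct.newField = recordData.newFieldData;
      newProduct.newField2 = recordData.newField2Data;
      return newProduct;
    });
    app.saveRecords(recordsToSave);
  }
}

// 4,500 fake rows should produce batches of 2000, 2000 and 500.
var rows = [];
for (var i = 0; i < 4500; i++) {
  rows.push({ newFieldData: 'a' + i, newField2Data: 'b' + i });
}
saveDataSliced(rows);
console.log(savedBatchSizes); // [ 2000, 2000, 500 ]
```

Because slice clamps to the end of the array, the last partial batch is handled without any extra bookkeeping.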