Tags: google-bigquery, mqtt, monitoring, iot, sensors

How to deal with BigQuery error 403: Quota exceeded?


We have a Python script that subscribes to data published by an IoT device via MQTT and appends the data to a BigQuery table every minute. That means we make 1,440 append jobs per day against the table, just below the 1,500 daily limit set by GCP. This approach will surely fail once we add more devices. Does anyone have experience monitoring sensors in real time and storing the data in the cloud? What tools do you use? Thanks


Solution

  • The maximum number of table operations per day for standard tables is 1,500. This limit covers the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table, as well as DML INSERT, UPDATE, DELETE, and MERGE statements that write data to the table. So, once you reach 1,500 table operations per table per day, your BigQuery jobs fail with the "Quota exceeded" error. In that situation, the recommendations are to:

    1. Batch the jobs together, merging many small appends into a single load job, so that you stay under the 1,500 table-operation limit. Refer to this documentation for more information about batch jobs.

    2. Use the Streaming API if you want real-time updates of the data. Refer to this documentation for more information about the Streaming API.

    Given your requirement for real-time updates, I would recommend using the Streaming API to stream the data into BigQuery.
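
    The batching approach (option 1) can be sketched as follows using the `google-cloud-bigquery` client library: buffer the per-minute readings in memory, then flush them all at once as a single newline-delimited JSON load job (e.g. once per hour), turning up to 60 table operations into one. The table ID and row schema here are hypothetical placeholders.

    ```python
    import io
    import json

    def rows_to_ndjson(rows):
        """Serialize buffered row dicts to newline-delimited JSON for a load job."""
        return "\n".join(json.dumps(row) for row in rows)

    def flush_to_bigquery(rows, table_id):
        """Append all buffered rows to BigQuery in a single load job."""
        # Deferred import so the serialization helper is usable on its own.
        from google.cloud import bigquery

        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        data = io.BytesIO(rows_to_ndjson(rows).encode("utf-8"))
        job = client.load_table_from_file(data, table_id, job_config=job_config)
        job.result()  # block until the load job completes

    # Hypothetical usage: flush an hourly buffer in one table operation.
    # flush_to_bigquery(buffered_rows, "my-project.iot.sensor_readings")
    ```

    Note that load jobs count against the table-operation quota, so the win comes entirely from flushing less often, not from the load mechanism itself.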
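
    The streaming approach (option 2) can be sketched like this: streaming inserts via `insert_rows_json` do not count as load jobs, so they are not subject to the 1,500 table-operation limit (they have their own, much higher, streaming quotas). The MQTT payload format, device ID, and table ID below are assumptions for illustration.

    ```python
    import json

    def mqtt_payload_to_row(payload: bytes, device_id: str) -> dict:
        """Convert a raw MQTT JSON payload into a BigQuery row dict (hypothetical schema)."""
        reading = json.loads(payload)
        return {"device_id": device_id, **reading}

    def stream_rows(table_id, rows):
        """Stream rows into BigQuery using the streaming insert API."""
        # Deferred import so the payload helper is usable on its own.
        from google.cloud import bigquery

        client = bigquery.Client()
        errors = client.insert_rows_json(table_id, rows)
        if errors:
            raise RuntimeError(f"Streaming insert failed: {errors}")

    # Hypothetical usage inside an MQTT on_message callback:
    # row = mqtt_payload_to_row(msg.payload, device_id="sensor-1")
    # stream_rows("my-project.iot.sensor_readings", [row])
    ```

    One caveat worth knowing: streamed rows land in a streaming buffer first, so they are queryable almost immediately but cannot be modified by DML until the buffer flushes, and streaming inserts are billed separately from load jobs (which are free).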