Tags: php, sql, memory, google-bigquery, large-data

Google BigQuery + PHP -> How to fetch a large data set without running out of memory


I am trying to run a query in BigQuery/PHP (using the Google PHP SDK) that returns a large dataset (anywhere from 100,000 to 10,000,000 rows).

$bigqueryService = new Google_BigqueryService($client);

$query = new Google_QueryRequest();
$query->setQuery(...);

$jobs = $bigqueryService->jobs;
$response = $jobs->query($project_id, $query); 
// query() is a synchronous function that returns a full dataset

The next step is to allow the user to download the result as a CSV file.

The code above will fail when the dataset becomes too large (memory limit). What are my options for performing this operation with lower memory usage?

(One option I considered is to save the results to another BigQuery table and then do partial fetches with LIMIT and OFFSET, but I figured a better solution might be available.)

Thanks for the help


Solution

  • The suggestion to export is a good one; I just wanted to mention that there is another way.

    The query API you are calling (jobs.query()) does not return the full dataset; it just returns a page of data, which is the first 2 MB of the results. You can set the maxResults flag (described here) to limit this to a certain number of rows.

    If you get back fewer rows than are in the table, you will get a pageToken field in the response. You can then fetch the remainder with the jobs.getQueryResults() API by providing the job ID (also in the query response) and the page token. This will continue to return new rows and a new page token until you get to the end of your table.

    The example here shows code (in Java and Python) to run a query and fetch the results page by page; a rough PHP equivalent is sketched at the end of this answer.

    There is also an option in the API to convert directly to CSV by specifying alt='csv' in the URL query string, but I'm not sure how to do this in PHP.
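
    A minimal PHP sketch of that page-by-page loop, assuming the same legacy google-api-php-client classes used in the question and that the client returns responses as nested arrays (the library's old default; if yours returns objects, use the generated getters such as getRows() and getPageToken() instead). It also assumes the query completes within the default timeout, so rows are present in the first response. Each page is written straight out with fputcsv(), so only one page of results is held in memory at a time; setMaxResults() and the 10000 page size are illustrative choices, not required values.

    $bigqueryService = new Google_BigqueryService($client);

    $query = new Google_QueryRequest();
    $query->setQuery('SELECT ...');   // same query as before
    $query->setMaxResults(10000);     // rows per page

    $jobs = $bigqueryService->jobs;
    $response = $jobs->query($project_id, $query);   // returns only the first page
    $jobId = $response['jobReference']['jobId'];     // needed for getQueryResults()

    $out = fopen('php://output', 'w');               // stream the CSV to the client

    do {
        // Each row comes back as array('f' => array(array('v' => value), ...))
        $rows = isset($response['rows']) ? $response['rows'] : array();
        foreach ($rows as $row) {
            $fields = array();
            foreach ($row['f'] as $cell) {
                $fields[] = $cell['v'];
            }
            fputcsv($out, $fields);
        }

        // A pageToken in the response means there are more rows for this job.
        $pageToken = isset($response['pageToken']) ? $response['pageToken'] : null;
        if ($pageToken !== null) {
            $response = $jobs->getQueryResults($project_id, $jobId, array(
                'maxResults' => 10000,
                'pageToken'  => $pageToken,
            ));
        }
    } while ($pageToken !== null);

    fclose($out);

    If this runs behind a download endpoint, send a Content-Type: text/csv header before writing and flush the output buffer periodically so the download starts while later pages are still being fetched.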