Tags: python, google-bigquery, google-cloud-storage, google-api-python-client

maxBadRecords in Google BigQuery


I got the following error when I tried to load a .csv.gz file from Google Cloud Storage into a BigQuery table using the Google API Python Client:

    {u'state': u'DONE',
     u'errors': [{u'reason': u'invalid', u'message': u'Too many errors encountered. Limit is: {1}.'},
                 {u'reason': u'invalid', u'message': u'Too many values in row starting at position:64490 in file:/gzip/subrange/gs://my-bucket/myfile.csv.gz'}],
     u'errorResult': {u'reason': u'invalid', u'message': u'Too many errors encountered. Limit is: {1}.'}}

My problem is that I specified in the API request that I wanted to allow 100 bad records, using the maxBadRecords parameter, as follows:

    MAX_BAD_RECORDS = 100
    body_query = {
        'jobReference': {
            'projectId': self.project_id,
            'jobId': self._job_id
        },
        'configuration': {
            'load': {
                'destinationTable': {
                    'projectId': self.destination_table.project_id,
                    'datasetId': self.destination_table.dataset.id,
                    'tableId': self.destination_table.id,
                },
                'fieldDelimiter': self.delimiter,
                'maxBadRecords': MAX_BAD_RECORDS,  # tolerate up to 100 bad rows before failing the load
                'schema': {
                    'fields': self.schema
                },
                'sourceUris': self.source_uris,
                'skipLeadingRows': self.skip_leading_rows_number,
                'writeDisposition': self.write_disposition,
                'createDisposition': self.create_disposition,
            }
        }
    }
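
For reference, a minimal sketch of how a body like this is submitted and polled with the discovery-based client (the project id and credential setup below are placeholders, not my actual code):

    import time

    from googleapiclient.discovery import build

    # Build the BigQuery v2 service; credentials are assumed to come from the
    # environment (placeholder setup).
    service = build('bigquery', 'v2')

    # Submit the load job with the body shown above.
    insert_response = service.jobs().insert(
        projectId='my-project',  # placeholder project id
        body=body_query,
    ).execute()
    job_id = insert_response['jobReference']['jobId']

    # Poll until the job reaches the DONE state, then inspect its status.
    while True:
        job = service.jobs().get(projectId='my-project', jobId=job_id).execute()
        if job['status']['state'] == 'DONE':
            break
        time.sleep(5)
    print(job['status'])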

I think the Google BigQuery Python API has a bug: it does not seem to take into account that my MAX_BAD_RECORDS is set to 100.

Can someone help me?


Solution

  • I think BQ did honor your MAX_BAD_RECORDS, otherwise you wouldn't see the "Too many errors" message at all. The "{1}" is probably a placeholder that should have been replaced by the real limit but somehow wasn't. In other words, the load most likely hit more than 100 bad rows, so the job still failed even with maxBadRecords set to 100.
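
If it helps, a quick way to confirm this is to fetch the finished job and look at the row-level errors BigQuery recorded; the "Too many values in row" messages usually mean those rows contain more columns than your schema (for example because of unquoted delimiters), so there may simply be more than 100 such rows. A minimal sketch with the same discovery-based client as in the question (the project id and job id are placeholders):

    from googleapiclient.discovery import build

    service = build('bigquery', 'v2')  # assumes default credentials from the environment

    # Placeholders: use the projectId and jobId of your failed load job.
    job = service.jobs().get(projectId='my-project', jobId='my-failed-job-id').execute()

    # BigQuery only keeps a sample of the row-level errors here, but the
    # reasons and messages show what is wrong with the offending rows.
    for error in job['status'].get('errors', []):
        print(error.get('reason'), '-', error.get('message'))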