If a valid BigQuery query returns 0 rows, to_dataframe() crashes. (By the way, I am running this on Google Cloud Datalab.)
For example:
import gcp.bigquery as bq

q = bq.Query('SELECT * FROM [isb-cgc:tcga_201510_alpha.Somatic_Mutation_calls] WHERE ( Protein_Change="V600E" ) LIMIT 10')
r = q.results()
r.to_dataframe()
produces:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-17-de55245104c0> in <module>()
----> 1 r.to_dataframe()
/usr/local/lib/python2.7/dist-packages/gcp/bigquery/_table.pyc in to_dataframe(self, start_row, max_rows)
628 # Need to reorder the dataframe to preserve column ordering
629 ordered_fields = [field.name for field in self.schema]
--> 630 return df[ordered_fields]
631
632 def to_file(self, destination, format='csv', csv_delimiter=',', csv_header=True):
TypeError: 'NoneType' object has no attribute '__getitem__'
Is this a known bug?
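In the meantime, catching the TypeError works around the crash. This is a minimal sketch, not a proper fix: it rebuilds an empty DataFrame from r.schema, which the traceback shows is available on the results object (the gcp.bigquery import is assumed to be the standard Datalab one):

import pandas as pd
import gcp.bigquery as bq

q = bq.Query('SELECT * FROM [isb-cgc:tcga_201510_alpha.Somatic_Mutation_calls] WHERE ( Protein_Change="V600E" ) LIMIT 10')
r = q.results()
try:
    df = r.to_dataframe()
except TypeError:
    # 0-row results currently raise TypeError; fall back to an empty
    # DataFrame that keeps the schema's column names and ordering.
    df = pd.DataFrame(columns=[field.name for field in r.schema])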
Certainly not a known bug. Please do log a bug as mentioned by Felipe.
Contributions, both bug reports and of course fixes, are welcome! :)
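For anyone picking this up: the failing frame in _table.py suggests a one-line guard. This is a sketch against the lines shown in the traceback (the rest of to_dataframe() and the pandas import as pd are assumed), not a tested patch:

# In gcp/bigquery/_table.py, inside to_dataframe(): df is never populated
# when the query returns no rows, so return an empty DataFrame that still
# carries the schema's column ordering instead of indexing into None.
ordered_fields = [field.name for field in self.schema]
if df is None:
    return pd.DataFrame(columns=ordered_fields)
# Need to reorder the dataframe to preserve column ordering
return df[ordered_fields]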