Is there a recommended way of exporting Firebase events to Google Cloud Storage (for example in Parquet format)? If I export my data to BigQuery, what is the best way to have that data consistently pushed to Cloud Storage?
The reason is that I have Dataproc jobs that process Parquet files in Cloud Storage, and I want my Firebase data to be accessible in the same way.
Exporting data from BigQuery directly as a Parquet file is not currently supported.
BigQuery currently supports three export formats: CSV, JSON (newline-delimited), and Avro.
One option is to transform the data into Parquet files using Apache Beam on Google Cloud Dataflow: read the rows from BigQuery, then use ParquetIO to write them to Cloud Storage.
Exporting Data (BigQuery)
https://cloud.google.com/bigquery/docs/exporting-data#export_formats_and_compression_types
ParquetIO (Apache Beam) https://beam.apache.org/releases/javadoc/2.5.0/org/apache/beam/sdk/io/parquet/ParquetIO.html