Tags: arrays, google-bigquery, record

Is there a maximum array size in BigQuery?


I'm aggregating strings into an array and haven't been able to find any documentation on recommended maximum sizes for arrays. My observation is that with approximately 10k elements in one array field in one row, the query runs without issue, but the BigQuery UI freezes and becomes unresponsive.

Is this purely a UI issue, or are there recommendations on the order of magnitude one should not exceed with BigQuery arrays? I will have hundreds of rows, so if a single one causes issues I need to find a different approach.

Does table size grow with additional elements in an array the same way it would with additional rows in the table?

Code sample:

select
  reporting_date,
  array(
    select struct(id, status)
    from tmp_table
  ) as test_field
from my_table
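
A minimal, self-contained sketch (the 10,000-element count and 'status_' labels are illustrative, not from my tables) that reproduces the one-row, large-array shape:

select
  array(
    select concat('status_', cast(n as string))
    from unnest(generate_array(1, 10000)) as n
  ) as big_array

The query itself completes quickly; per my observation, the slowdown shows up when the UI tries to render the single row.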

Solution

    1. This is definitely a UI issue.

    I have never hit an array size limit, and there is no documented limit in the BigQuery quotas: https://cloud.google.com/bigquery/quotas

    2. Yes, the table size will grow accordingly. For arrays specifically, the documentation says:

    The sum of the size of its elements. For example, an array defined as ARRAY<INT64> that contains 4 entries is calculated as 32 logical bytes (4 entries x 8 logical bytes).

    https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#data_type_sizes
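
    Following that formula, you can get a rough per-row estimate directly in SQL. A hedged sketch, assuming test_field has been materialized into my_table as ARRAY<STRUCT<id INT64, status STRING>> (per the same docs page, an INT64 costs 8 logical bytes and a STRING costs 2 logical bytes plus its UTF-8 size):

    select
      reporting_date,
      -- 8 logical bytes per int64 id, plus 2 bytes + utf-8 length per string status
      8 * array_length(test_field)
        + ifnull((select sum(2 + byte_length(e.status)) from unnest(test_field) as e), 0)
        as approx_array_logical_bytes
    from my_table

    At 10k elements with short statuses that is on the order of a few hundred kilobytes per row, and it accrues to table size roughly the same way the equivalent number of ordinary id/status rows would.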