I'm aggregating strings into an array and haven't been able to find any documentation on recommended maximum sizes for arrays. My observation is that with approximately 10k elements in one array field in one row, the query runs without issue, but the BigQuery UI freezes and becomes unresponsive.
Is this purely a UI issue, or are there recommendations on the order of magnitude one should not exceed with BigQuery arrays? I will have hundreds of rows, so if a single one causes problems I need to find a different approach.
Does table size grow with additional elements in an array the same way it would with additional rows in the table?
Code sample:
select
  reporting_date
  -- builds one array of (id, status) structs per output row
  , array(
      select struct(id, status)
      from tmp_table
    ) as test_field
from my_table
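To check whether any single row's array is the outlier, one option is to count elements per row with array_length (a minimal sketch reusing the query above; note that in this simplified sample the subquery is uncorrelated, so every row reports the same count, but against the real correlated query it would show which rows get large):

select
  reporting_date
  -- number of elements in each row's array
  , array_length(array(
      select struct(id, status)
      from tmp_table
    )) as n_elements
from my_table
order by n_elements desc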
I have never hit an array size limit, and none is documented among the BigQuery quotas: https://cloud.google.com/bigquery/quotas
Storage, though, does scale with array elements the same way it scales with rows. Per the data type sizes documentation, an array's size is the sum of the size of its elements. For example, an array defined as ARRAY<INT64> that contains 4 entries is calculated as 32 logical bytes (4 entries x 8 logical bytes).
https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#data_type_sizes
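Applying that to your case as a rough estimate (assuming id is an INT64, status a STRING, and ~10-character status values, none of which your sample confirms): an INT64 is 8 logical bytes and a STRING is 2 bytes plus its UTF-8 length, so 10k STRUCT(id, status) elements work out to roughly 10,000 x (8 + 2 + 10) = 200,000 logical bytes, i.e. about 200 KB per row. You can also measure the real values directly against the source rows:

-- Sums the documented logical sizes per element:
-- 8 bytes for the INT64 id, 2 bytes + UTF-8 length for the STRING status.
select sum(8 + 2 + byte_length(status)) as est_array_logical_bytes
from tmp_table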