Context: I want to automate and simulate an export of my prod Firestore environment so I can run tests locally and generate PR previews from that automation.
Problems:
I have a bunch of collections with millions of records that I do not want to export or emulate (doing so would require fixing other things, like memory utilization, and I don't see a Docker environment with the memory needed to run this DB as a viable option).
The docs indicate you have two options to export/import,
but there is no blacklist option, so I have to specify all of my collection groups explicitly, excluding the ones I don't want to export/import.
The console UI gives me the list that I want
Since I want to automate this, what can I use to obtain that list? (Any tool could work for me, because I am currently using almost all Google products.) I have been reading a lot, and the most viable solution I have found is a recursive loop that iterates over all my collections, downloads all the document IDs, and then iterates over each document's sub-collections. But I don't think Google does that in the console, so there must be some place where Google stores that info. I hope I am not falling into the XY problem.
For now I will just hard-code the list I pasted above in my script, something like the sketch below...
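To make that concrete, here is a minimal sketch of the hard-coded version, assuming the Node.js `@google-cloud/firestore` v1 admin client; the project ID, bucket, and collection names are placeholders, not my real ones:

```typescript
// A minimal sketch of the hard-coded approach, assuming the Node.js
// @google-cloud/firestore v1 admin client. The project ID, bucket, and
// collection IDs below are placeholders.
import { v1 } from "@google-cloud/firestore";

const client = new v1.FirestoreAdminClient();

// Collection groups I actually want in the export (hypothetical names).
const COLLECTION_IDS = ["users", "orders", "settings"];

async function exportSelectedCollections(): Promise<void> {
  const databaseName = client.databasePath("my-project-id", "(default)");

  const [operation] = await client.exportDocuments({
    name: databaseName,
    outputUriPrefix: "gs://my-export-bucket/exports",
    // Only the listed collection groups are exported; everything else is skipped.
    collectionIds: COLLECTION_IDS,
  });

  console.log(`Export operation started: ${operation.name}`);
}

exportSelectedCollections().catch(console.error);
```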
Thanks!
I have tried using the listCollections() methods on the Firestore and DocumentReference objects, but this requires a recursive iteration over the whole database; roughly the sketch below.
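For reference, this is approximately what that attempt looks like (a sketch, assuming the firebase-admin Node.js SDK with default credentials already configured):

```typescript
// A rough sketch of the recursive traversal, assuming the firebase-admin
// Node.js SDK with default credentials already configured.
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

// Recursively collect every collection ID reachable from the given parent
// (either the root database or a document reference).
async function collectCollectionIds(
  parent: admin.firestore.Firestore | admin.firestore.DocumentReference,
  found: Set<string>
): Promise<Set<string>> {
  const collections = await parent.listCollections();
  for (const collection of collections) {
    found.add(collection.id);
    // listDocuments() also returns references to "missing" documents that
    // only exist as parents of sub-collections, so nothing is skipped.
    const docs = await collection.listDocuments();
    for (const doc of docs) {
      await collectCollectionIds(doc, found);
    }
  }
  return found;
}

collectCollectionIds(db, new Set())
  .then((ids) => console.log([...ids].sort()))
  .catch(console.error);
```

On a database with millions of documents this walk is exactly the slow, memory-heavy operation I am trying to avoid, which is why I am looking for wherever the console gets its list.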
> there must be some place where Google stores that info
If so, it's not readily available to developers.
> I have tried using the listCollections() methods on the Firestore and DocumentReference objects, but this requires a recursive iteration over the whole database
To be frank: this is your only option.
Consider instead maintaining a list elsewhere and keeping it up to date when your schema changes.
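For example, one way to do that (a sketch, assuming the firebase-admin Node.js SDK; the `meta/export-config` document path and the `collectionIds` field are hypothetical names, not an existing convention) is to keep the allow-list in a single well-known document and read it before kicking off the export:

```typescript
// A sketch of keeping the export allow-list in one well-known Firestore
// document instead of discovering it at runtime.
// The "meta/export-config" path and "collectionIds" field are hypothetical.
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

// Read the maintained list of collection IDs to export.
async function getExportableCollectionIds(): Promise<string[]> {
  const snapshot = await db.doc("meta/export-config").get();
  const ids = snapshot.get("collectionIds");
  if (!Array.isArray(ids) || ids.length === 0) {
    throw new Error("meta/export-config has no collectionIds configured");
  }
  return ids;
}

// Call this from whatever process changes your schema, so the list stays current.
async function setExportableCollectionIds(ids: string[]): Promise<void> {
  await db.doc("meta/export-config").set({ collectionIds: ids });
}

getExportableCollectionIds()
  .then((ids) => console.log("Will export:", ids))
  .catch(console.error);
```

The same idea works with a JSON file checked into source control; the point is that the list is maintained deliberately rather than rediscovered by crawling the database on every export.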