google-cloud-platform, google-cloud-bigtable

Ways to copy a BigTable table without affecting the read latency


I am trying to copy a BigTable table from one instance to another, but it seems there is no direct way to do it.

I am exploring a Dataflow job that exports the table to GCS and then imports it into BigTable, but I am afraid the export process might affect the read latency of the source table. Is there any way to copy the table without affecting the performance of the source table? The source table holds production data and receives high traffic.
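For context, this is roughly what launching Google's public Bigtable-to-Avro export template from Python could look like. The template path and parameter names are assumptions based on the public Dataflow templates, and all project, instance, table, and bucket names are placeholders, so verify them against the current documentation before running anything:

```python
# Hypothetical sketch: launch the public "Bigtable to GCS Avro" Dataflow template.
# Template path and parameter names are assumptions; all IDs and bucket names
# below are placeholders, not values from this question.
from googleapiclient.discovery import build

dataflow = build("dataflow", "v1b3")

launch = dataflow.projects().locations().templates().launch(
    projectId="my-project",
    location="us-central1",
    gcsPath="gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_Avro",
    body={
        "jobName": "export-prod-table",
        "parameters": {
            "bigtableProjectId": "my-project",
            "bigtableInstanceId": "source-instance",
            "bigtableTableId": "prod-table",
            "outputDirectory": "gs://my-bucket/bigtable-export/",
            "filenamePrefix": "prod-table-",
        },
    },
)
response = launch.execute()
print(response["job"]["id"])  # ID of the launched export job
```

Note that the export job still issues scans against the source table, which is exactly the read-latency concern raised above.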


Solution

  • You can try the Backups feature: create a backup of the table in the source instance and restore it to a new table in the destination instance. This does not affect the performance of the original table. A minimal sketch of this flow is shown below.
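Here is a minimal sketch of the backup-and-restore flow using the google-cloud-bigtable Python client's admin helpers (Table.backup, Backup.create, Backup.restore). The project, instance, cluster, table, and backup IDs are placeholders, and you should confirm that your client version supports restoring a backup into a different instance:

```python
# Minimal sketch, assuming the google-cloud-bigtable Python client (admin API).
# All IDs below are placeholders.
import datetime

from google.cloud import bigtable

client = bigtable.Client(project="my-project", admin=True)

source_instance = client.instance("source-instance")
source_table = source_instance.table("prod-table")

# Create a backup in one of the source instance's clusters.
expire_time = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(days=7)
backup = source_table.backup(
    "prod-table-backup",
    cluster_id="source-cluster",
    expire_time=expire_time,
)
backup.create().result(timeout=600)  # wait for the backup operation to finish

# Restore the backup into a new table in the destination instance.
restore_op = backup.restore("prod-table-copy", instance_id="dest-instance")
restore_op.result(timeout=600)
```

Because the backup is taken by the Bigtable service itself rather than by scanning the table through the data API, it avoids the read load that a Dataflow export would place on the production table.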