I need to find a tool or a technique that will generate insert statements from a Spanner database so I can insert them into another Spanner database. I need to selectively choose which insert statements or rows to migrate, so the Spanner export/import tool is not an option. The destination database will already exist and will have existing data in it. The amount of data is small - roughly 15 tables with 10 to 20 rows in each table. Any suggestions would be greatly appreciated.
You can use the Cloud Spanner Dataflow Connector to write a pipeline/data loader that moves data in and out of Spanner, and you can supply a custom SQL query to the Dataflow reader so that only the subset of data you want to export is read.
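As a rough sketch, a Beam pipeline along these lines could read a filtered subset from the source database and write it to the target. The instance and database IDs, the Singers table, and its columns are placeholders you would replace with your own schema (and you could use newInsertOrUpdateBuilder if some keys may already exist in the target):

```java
import com.google.cloud.spanner.Mutation;
import com.google.cloud.spanner.Struct;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class CopySelectedRows {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read only the rows matched by the custom query from the source database...
    p.apply("ReadSubset", SpannerIO.read()
            .withInstanceId("source-instance")
            .withDatabaseId("source-db")
            .withQuery("SELECT SingerId, FirstName, LastName FROM Singers WHERE SingerId < 100"))
        // ...turn each result row into an insert mutation for the target table...
        .apply("ToMutations", MapElements
            .into(TypeDescriptor.of(Mutation.class))
            .via((Struct row) -> Mutation.newInsertBuilder("Singers")
                .set("SingerId").to(row.getLong("SingerId"))
                .set("FirstName").to(row.getString("FirstName"))
                .set("LastName").to(row.getString("LastName"))
                .build()))
        // ...and write the mutations to the target database.
        .apply("WriteToTarget", SpannerIO.write()
            .withInstanceId("target-instance")
            .withDatabaseId("target-db"));

    p.run().waitUntilFinish();
  }
}
```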
Depending on how wide your tables are, and given that you are dealing with a relatively small amount of data, a simpler way to do this could be the gcloud spanner databases execute-sql command-line utility. For each of your tables, you could use it to run a SQL query that selects the rows you want to export and write the result to a file in CSV format using the --format=csv argument.
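For example, something along these lines should work for one table (the instance, database, table, and file names here are placeholders):

```
gcloud spanner databases execute-sql source-db \
    --instance=source-instance \
    --sql="SELECT SingerId, FirstName, LastName FROM Singers WHERE SingerId < 100" \
    --format=csv > Singers.csv
```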
Then you could write a small wrapper around the Cloud Spanner insert APIs that reads the data back from the CSV files and sends insert mutations to the target database.
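A minimal sketch of such a wrapper using the Java client library, again with placeholder project/instance/database/table names, and assuming the CSV has a header row and no quoted commas in the values:

```java
import com.google.cloud.spanner.DatabaseClient;
import com.google.cloud.spanner.DatabaseId;
import com.google.cloud.spanner.Mutation;
import com.google.cloud.spanner.Spanner;
import com.google.cloud.spanner.SpannerOptions;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class CsvToSpanner {
  public static void main(String[] args) throws Exception {
    Spanner spanner = SpannerOptions.newBuilder().build().getService();
    try {
      DatabaseClient db = spanner.getDatabaseClient(
          DatabaseId.of("my-project", "target-instance", "target-db"));

      // Build one insert mutation per CSV row, skipping the header line.
      // Use Mutation.newInsertOrUpdateBuilder instead if some rows may
      // already exist in the target database.
      List<String> lines = Files.readAllLines(Paths.get("Singers.csv"));
      List<Mutation> mutations = new ArrayList<>();
      for (String line : lines.subList(1, lines.size())) {
        String[] cols = line.split(","); // naive split; fine while values contain no commas
        mutations.add(Mutation.newInsertBuilder("Singers")
            .set("SingerId").to(Long.parseLong(cols[0]))
            .set("FirstName").to(cols[1])
            .set("LastName").to(cols[2])
            .build());
      }

      // Applies all mutations atomically in a single transaction.
      db.write(mutations);
    } finally {
      spanner.close();
    }
  }
}
```

Given the data volumes you describe (15 tables, 10 to 20 rows each), one write call per table like this should be well within Spanner's per-transaction mutation limits.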