We are using Cloud Spanner as a staging database for a large data set, and we want to truncate a table on a scheduled basis. The table holds more than 25 million rows at any given time, and we want to empty it without dropping and recreating it. What options does GCP offer for this?
Currently we are running the gcloud command below manually with the --enable-partitioned-dml option, and it times out after deleting fewer than 1 million rows. Please suggest a better way of doing this.
gcloud spanner databases execute-sql db-name --sql="delete from tableName where SpannerCommitTS
TRUNCATE is still not supported in Cloud Spanner. You could try the Mutation.delete method with a KeyRange or KeySet (depending on your data) to avoid the 20k-mutation limit.
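As a minimal sketch of that approach in Java with the Spanner client library: deleting with KeySet.all() removes every row in the table in a single commit. The project, instance, database and table names here (my-project, my-instance, my-db, StagingTable) are placeholders for your own values.

import com.google.cloud.spanner.DatabaseClient;
import com.google.cloud.spanner.DatabaseId;
import com.google.cloud.spanner.KeySet;
import com.google.cloud.spanner.Mutation;
import com.google.cloud.spanner.Spanner;
import com.google.cloud.spanner.SpannerOptions;

import java.util.Collections;

public class TruncateStagingTable {
  public static void main(String[] args) {
    SpannerOptions options =
        SpannerOptions.newBuilder().setProjectId("my-project").build();
    Spanner spanner = options.getService();
    try {
      DatabaseClient client =
          spanner.getDatabaseClient(DatabaseId.of("my-project", "my-instance", "my-db"));
      // Mutation.delete over KeySet.all() deletes every row in the table.
      client.write(Collections.singletonList(
          Mutation.delete("StagingTable", KeySet.all())));
    } finally {
      spanner.close();
    }
  }
}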
To avoid manual work, you could write a Dataflow job that reads from your table using SpannerIO and then deletes the rows using SpannerIO.write with delete mutations, or perhaps a custom PTransform (built around Mutation.delete) that produces the deletes according to your requirements. A rough sketch of such a pipeline follows.
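This Apache Beam sketch reads only the primary-key column, maps each key to a delete mutation, and lets SpannerIO.write batch and commit the deletes. The instance and database IDs, the table name StagingTable, and the single key column Id are assumptions for illustration; adapt them to your schema.

import com.google.cloud.spanner.Key;
import com.google.cloud.spanner.Mutation;
import com.google.cloud.spanner.Struct;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptor;

public class DeleteStagingRows {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Read only the primary-key column so the pipeline stays lightweight.
    PCollection<Struct> keys =
        p.apply("ReadKeys",
            SpannerIO.read()
                .withInstanceId("my-instance")
                .withDatabaseId("my-db")
                .withQuery("SELECT Id FROM StagingTable"));

    // Turn each key into a delete mutation for that row.
    PCollection<Mutation> deletes =
        keys.apply("ToDeleteMutations",
            MapElements.into(TypeDescriptor.of(Mutation.class))
                .via(row -> Mutation.delete("StagingTable", Key.of(row.getLong("Id")))));

    // SpannerIO.write batches the mutations and commits them for us.
    deletes.apply("DeleteRows",
        SpannerIO.write()
            .withInstanceId("my-instance")
            .withDatabaseId("my-db"));

    p.run().waitUntilFinish();
  }
}

Run it with the Dataflow runner (--runner=DataflowRunner plus your project/region options) and schedule it with whatever scheduler you already use, e.g. Cloud Scheduler triggering a Dataflow template.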