There is a ClickHouse database with billions of rows of data; a subset of about 200 million rows needs to be transferred to another physical server.
My first thought was to write a Python script that executes the necessary SELECT query, saves the result to CSV, and then transfers the file to the other server via FTP or some other means. Something tells me this is a bad idea.
Use the remoteSecure table function: https://clickhouse.com/docs/en/sql-reference/table-functions/remote
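Run on the destination server, a single INSERT ... SELECT pulls the rows directly over the network, with no intermediate CSV file and no FTP step. A minimal sketch, assuming the target table already exists with a matching schema (the host, database, table, credentials, and WHERE filter below are all placeholders for your setup):

```sql
-- Run on the DESTINATION server.
-- Pulls rows from the source over ClickHouse's secure native port (9440 by default).
INSERT INTO target_db.events
SELECT *
FROM remoteSecure('source-host.example.com:9440', 'source_db', 'events', 'user', 'password')
WHERE event_date >= '2024-01-01';  -- hypothetical filter selecting the ~200M-row subset
```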
There are some examples here: https://clickhouse.com/docs/en/cloud/migration/clickhouse-to-cloud#migration-of-tables-from-one-system-to-another
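The migration guide also shows the reverse direction: pushing rows from the source server with INSERT INTO FUNCTION remoteSecure(...). A sketch along the same lines (again, all names and credentials are placeholders):

```sql
-- Run on the SOURCE server: push rows to the destination.
INSERT INTO FUNCTION
    remoteSecure('destination-host.example.com:9440', 'target_db.events', 'user', 'password')
SELECT *
FROM source_db.events
WHERE event_date >= '2024-01-01';
```

For 200 million rows you may want to split the transfer into batches (e.g. by date range or by ranges of the primary key) so a network interruption doesn't force a restart from scratch.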