I want to transfer messages from AWS MSK to GCP Pub/Sub using this connector.
I uploaded the connector (pubsub-group-kafka-connector-1.2.0.jar) to an S3 bucket, created a custom plugin, and want to create an MSK connector with this configuration:
connector.class=com.google.pubsublite.kafka.sink.PubSubLiteSinkConnector
gcp.credentials.json="service account json converted to string"
pubsublite.project=*********
tasks.max=10
topics=my-kafka-topic
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
pubsublite.topic=moove-test
key.converter=org.apache.kafka.connect.converters.ByteArrayConverter
pubsublite.location=us-west1
I think the problem is with the gcp.credentials.json argument, but it succeeds when I convert the value to JSON. Any suggestions?
That error says that gcp.credentials.json cannot be parsed. Just remove the quote marks (").
To summarize the approach: if you have a formatted (pretty-printed) JSON file, flatten the value into a single line and pass it as is. Here is an example:
gcp.credentials.json={"type":"service_account","project_id":"","private_key_id":"","private_key":"-----BEGIN PRIVATE KEY----------END PRIVATE KEY-----\n","client_email":"","client_id":"","auth_uri":"https://accounts.google.com/o/oauth2/auth","token_uri":"https://oauth2.googleapis.com/token","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url":""}
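If it helps, here is a minimal sketch of the flattening step in Python (the key fields are placeholders; a real key file comes from the GCP console):

```python
import json

# Placeholder service account key, pretty-printed across multiple lines
# the way the GCP console downloads it.
pretty = """{
  "type": "service_account",
  "project_id": "my-project",
  "private_key": "-----BEGIN PRIVATE KEY-----\\n...\\n-----END PRIVATE KEY-----\\n"
}"""

# Parse, then re-serialize without whitespace: the result is a single
# line you can paste directly after gcp.credentials.json= in the config.
flat = json.dumps(json.loads(pretty), separators=(",", ":"))
print(flat)
```

Note that json.dumps re-escapes the newlines inside private_key as \n, so the output stays on one line.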
I don't think you need to escape any quote marks inside the JSON value, unless your properties file is JSON as well.