I have a 5 MB JSON payload that I need to push to a Kinesis stream using PutRecords. Since the Kinesis record size limit is 1 MB, which methods should I follow to compress the data, and what are the steps?
If your JSON payload is still too big after compression, you generally have two options:
1. Split it into multiple smaller payloads. The consumers then have to reconstruct the original payload from its parts, based on a part ID you include in each one (see the first sketch after this list).
2. Store the large payload outside of the stream, e.g. in S3, and send only metadata about it (e.g. the S3 path) in the message (see the second sketch after this list).
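For option 1, here is a minimal producer-side sketch in Python with boto3. The envelope field names (`message_id`, `part_id`, `total_parts`), the `CHUNK_SIZE` value, and the function name are just illustrative, not any standard format:

```python
import base64
import gzip
import json
import math
import uuid

import boto3

kinesis = boto3.client("kinesis")

# Keep the base64-encoded chunk plus the JSON envelope safely under
# the 1 MB record limit (base64 inflates the data by about 33%).
CHUNK_SIZE = 700 * 1024


def put_in_parts(payload: dict, stream_name: str) -> None:
    """Split one large payload into multiple Kinesis records."""
    blob = gzip.compress(json.dumps(payload).encode("utf-8"))
    message_id = str(uuid.uuid4())
    total_parts = math.ceil(len(blob) / CHUNK_SIZE)

    records = []
    for part_id in range(total_parts):
        chunk = blob[part_id * CHUNK_SIZE:(part_id + 1) * CHUNK_SIZE]
        envelope = {
            "message_id": message_id,    # shared by all parts
            "part_id": part_id,          # 0-based position of this part
            "total_parts": total_parts,  # tells the consumer when it is done
            "data": base64.b64encode(chunk).decode("ascii"),
        }
        records.append({
            "Data": json.dumps(envelope).encode("utf-8"),
            # Same partition key -> same shard -> parts arrive in order.
            "PartitionKey": message_id,
        })

    # Note: PutRecords accepts at most 500 records / 5 MB per call,
    # so very large payloads may need to be spread over several calls.
    response = kinesis.put_records(StreamName=stream_name, Records=records)
    if response["FailedRecordCount"]:
        raise RuntimeError("some parts were rejected; retry the failed records")
```

The consumer buffers records by `message_id` and reassembles the payload once it has seen all `total_parts` parts.

For option 2 (often called the claim-check pattern), the producer side could look like the following; the bucket, key prefix, and metadata fields are again placeholders:

```python
import json
import uuid

import boto3

s3 = boto3.client("s3")
kinesis = boto3.client("kinesis")


def put_via_s3(payload: dict, bucket: str, stream_name: str) -> None:
    """Store the large payload in S3 and send only a pointer on the stream."""
    key = f"payloads/{uuid.uuid4()}.json"
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(payload).encode("utf-8"))

    # The record on the stream carries only the S3 location.
    metadata = {"type": "s3_pointer", "bucket": bucket, "key": key}
    kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(metadata).encode("utf-8"),
        PartitionKey=key,
    )
```

The consumer reads the pointer record and fetches the actual payload from S3.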
Which compression algorithm to use depends on your stream producers and consumers; more specifically, on which compression algorithms both sides support.
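If you control both sides, gzip is a common choice because it ships in most languages' standard libraries. A minimal sketch of a compressing producer in Python/boto3 (the stream name and partition key parameters are whatever you already use):

```python
import gzip
import json

import boto3

kinesis = boto3.client("kinesis")


def put_compressed(payload: dict, stream_name: str, partition_key: str) -> None:
    """Gzip-compress a JSON payload before sending it to Kinesis.

    Kinesis treats record data as an opaque byte blob, so the compressed
    bytes can be sent as-is; the consumer just gunzips what it reads.
    """
    raw = json.dumps(payload).encode("utf-8")
    compressed = gzip.compress(raw)
    if len(compressed) > 1024 * 1024:  # still over the 1 MB record limit
        raise ValueError("payload too large even after compression")
    kinesis.put_record(
        StreamName=stream_name,
        Data=compressed,
        PartitionKey=partition_key,
    )
```

JSON usually compresses well, but a 5 MB payload will not always shrink below 1 MB, which is when the two options above apply.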
But ultimately, if neither of the two options suits you, you may need to consider that Kinesis is not the right tool for the job. Apache Kafka, for example, can be configured to accept messages larger than 1 MB (via the broker's `message.max.bytes` setting).