What I am trying to accomplish is to build the simplest possible system for processing image uploads through message queues.
Right now we are working with temporary files: we build one for each image that needs to be uploaded to AWS S3, and optimize/customize it depending on our needs. Once all of this is done, we push it to S3.
This currently works, but there is filesystem overhead I would like to remove by using a base64 encoding of the image instead, making the process entirely decoupled from the filesystem of the machine the application is running on.
The problem is that we are going to use Amazon SQS (we currently use Beanstalkd in production), and SQS does not allow payloads larger than 256 KB per message. That is an issue since images are much heavier than that, and a base64 encoding inflates the payload by roughly a further 33%.
What solutions are still available to explore?
You can look at breaking the file into chunks that fit the message-size limit (256 KB in this case, minus some headroom for metadata), tagging each message with a sequence number, and putting them on the queue. At the other end, receive the messages, re-assemble them in sequence order, and then either write the result to a file or push the image data to further stages in your solution.