In my JS script I'm receiving a string with an encoded JPEG image. The image is created with OpenCV in C++:
std::vector<uchar> jpeg_data;
cv::imencode(".jpeg", rgb_img, jpeg_data);
std::string jpeg_string(jpeg_data.begin(), jpeg_data.end());
Then the string is passed over a websocket. In JS I have to decode that string and use it as the src of an img tag. I tried this (from ChatGPT):
// Create a Uint8Array from the binary data
var arrayBuffer = new ArrayBuffer(jpeg_string.length);
var uint8Array = new Uint8Array(arrayBuffer);
for (var i = 0; i < jpeg_string.length; i++) {
    uint8Array[i] = jpeg_string.charCodeAt(i);
}
// Create a Blob object from the ArrayBuffer
var blob = new Blob([arrayBuffer], { type: 'image/jpeg' });
// Create an Object URL for the Blob
var imageDataUri = URL.createObjectURL(blob);
img.src = imageDataUri;
and this (from Stack Overflow):
const imageDataUri = `data:image/jpeg;base64,${jpeg_string}`;
img.src = imageDataUri;
Neither of these works. In the first example I get:
src = blob:http://0.0.0.0:7000/<random numbers, lowercase letters and dashes>
and in the other one:
src = data:image/jpeg;base64,<uppercase letters and numbers>
How can I display that image?
I'm working with Robot Operating System on Ubuntu 22.04 (which shouldn't make a difference).
Your jpeg_string contains the raw JPEG bytes, not base64, so the data: URI attempt cannot work as-is; raw bytes are also not valid UTF-8, so they are liable to be mangled when sent as a websocket text frame. This seems to be the correct way to encode your image before sending it:
cv::imencode(".jpeg", rgb_img, jpeg_data);
auto *enc_msg = reinterpret_cast<unsigned char*>(jpeg_data.data());
std::string encoded = base64_encode(enc_msg, jpeg_data.size());
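Note that base64_encode is not an OpenCV or standard-library function; it is usually taken from a small third-party helper (René Nyffenegger's base64.cpp is a common choice) or written by hand. A minimal sketch of such a helper, assuming the signature used above:

#include <string>
#include <cstddef>

// Minimal base64 encoder sketch matching the base64_encode(buf, len) call above.
std::string base64_encode(const unsigned char *buf, std::size_t len) {
    static const char tbl[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::string out;
    out.reserve(((len + 2) / 3) * 4);
    for (std::size_t i = 0; i < len; i += 3) {
        // Pack up to three input bytes into a 24-bit group.
        unsigned int n = buf[i] << 16;
        if (i + 1 < len) n |= buf[i + 1] << 8;
        if (i + 2 < len) n |= buf[i + 2];
        // Emit four 6-bit symbols, padding with '=' where the input ran out.
        out.push_back(tbl[(n >> 18) & 0x3F]);
        out.push_back(tbl[(n >> 12) & 0x3F]);
        out.push_back(i + 1 < len ? tbl[(n >> 6) & 0x3F] : '=');
        out.push_back(i + 2 < len ? tbl[n & 0x3F] : '=');
    }
    return out;
}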
Then, once the JS side receives that base64 string as jpeg_string, your second attempt works as-is:
const imageDataUri = `data:image/jpeg;base64,${jpeg_string}`;
img.src = imageDataUri;
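With the base64 string embedded in the data: URI, the image is self-contained in the src attribute, so there is no need for a Blob or URL.createObjectURL (and no object URL to revoke afterwards).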