Tags: c#, python, sockets, stream, jpeg

JPEG encoding in Python, sent via socket to C# and decoded


I want to send my processed image from a Python backend to a C# frontend, so I encode it with the JPEG encoder and send it over a socket.

When I receive it and decode it with JpegBitmapDecoder in C#, it throws an exception, even though the number of bytes received equals the number of bytes sent after encoding.

Could anyone guide me on how to decode and display this image?

Python server-side code that encodes the image as JPEG:

while True:
    conn, addr = s.accept()
    print('connected to: ' + addr[0] + ' : ' + str(addr[1]))

    vidcap = cv2.VideoCapture(videoPath)
    total_frames = int(vidcap.get(cv2.CAP_PROP_FRAME_COUNT))
    conn.setblocking(0)
    for mm in range(0, total_frames, 1):
        try:
            # If the client sent anything, echo it back and stop streaming.
            dataClient = str(conn.recv(4096).decode('UTF-8'))
            conn.send(str.encode(dataClient))
            conn.close()
            kk = 2
            break
        except socket.error:
            # No client data pending: read the next frame and send it as JPEG.
            ret, img = vidcap.read()
            image = cv2.resize(img, (640, 480))
            #############################################
            encode_param = [int(cv2.IMWRITE_JPEG_QUALITY), 100]
            result, enData = cv2.imencode('.jpg', image, encode_param)
            conn.send(enData)
            #############################################

C# client-side code that decodes the JPEG image:

            const int PORT_NO = 6666;
            //const string SERVER_IP = "210.107.232.138"; // 210.107.232.138          127.0.0.1
            string SERVER_IP = IpAddress.Text;
            client = new TcpClient(SERVER_IP, PORT_NO);
            nwStream = client.GetStream();
            bytesToRead = new byte[client.ReceiveBufferSize];
            bytesRead = nwStream.Read(bytesToRead, 0, client.ReceiveBufferSize);
            /////////////////////////////////////
            MemoryStream ms = new MemoryStream(bytesRead);
            JpegBitmapDecoder decoder = new JpegBitmapDecoder(ms, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.Default); //line 129
            BitmapSource bitmapSource = decoder.Frames[0];
            ///////////////////////////////////// 
            var src = new System.Windows.Media.Imaging.FormatConvertedBitmap();
            src.BeginInit();
            src.Source = bitmapSource;
            src.DestinationFormat = System.Windows.Media.PixelFormats.Bgra32;
            src.EndInit();

            //copy to bitmap
            Bitmap bitmap = new Bitmap(src.PixelWidth, src.PixelHeight, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
            var data = bitmap.LockBits(new Rectangle(Point.Empty, bitmap.Size), System.Drawing.Imaging.ImageLockMode.WriteOnly, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
            src.CopyPixels(System.Windows.Int32Rect.Empty, data.Scan0, data.Height * data.Stride, data.Stride);
            bitmap.UnlockBits(data);

            pictureBox2.Image = bitmap;

Exception

(Screenshot of the thrown exception from the original post; not reproduced here.)
Solution

  • MemoryStream ms = new MemoryStream(bytesRead); // bytesRead = int, number of bytes received

    This line creates an empty MemoryStream with an initial capacity of bytesRead bytes. See MemoryStream(int).

    You need to use another constructor to also fill it with the received data:

    MemoryStream ms = new MemoryStream(bytesToRead); // byte[], received data

    See MemoryStream(Byte[]).
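
    A minimal sketch of how the corrected lines could look in the question's client code. BitmapCacheOption.OnLoad and the MemoryStream(byte[], int, int) overload are assumptions on my part, not part of the answer above: the overload trims the stream to the bytes actually read out of the ReceiveBufferSize-sized array, and OnLoad makes the decoder consume the JPEG immediately so the stream is not needed afterwards.

    // Sketch: wrap the received bytes (not the byte count) in the stream,
    // limited to the bytesRead bytes that were actually received.
    MemoryStream ms = new MemoryStream(bytesToRead, 0, bytesRead);
    JpegBitmapDecoder decoder = new JpegBitmapDecoder(
        ms, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.OnLoad);
    BitmapSource bitmapSource = decoder.Frames[0]; // first (and only) frame of the JPEG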


    Please also mind john's comment on your receiving code:

    There's no guarantee you will receive the full buffer of data in a single receive event.
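
    Building on that comment, here is a hedged sketch of a client-side read loop. It assumes the Python sender is changed to prefix every JPEG frame with its length as a 4-byte big-endian integer (for example conn.send(struct.pack('>I', len(enData))) before conn.send(enData)); ReadExact is a hypothetical helper name, not an existing API.

    // Hypothetical helper: read exactly 'count' bytes from the stream,
    // looping until the buffer is full or the peer closes the connection.
    static byte[] ReadExact(NetworkStream stream, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0)
                throw new EndOfStreamException("Connection closed before the full frame arrived.");
            offset += read;
        }
        return buffer;
    }

    // Usage: read the 4-byte length prefix, then exactly that many JPEG bytes.
    byte[] lengthPrefix = ReadExact(nwStream, 4);
    if (BitConverter.IsLittleEndian)
        Array.Reverse(lengthPrefix);                      // big-endian prefix -> host byte order
    int frameLength = BitConverter.ToInt32(lengthPrefix, 0);
    byte[] jpegBytes = ReadExact(nwStream, frameLength);  // one complete JPEG frame
    MemoryStream ms = new MemoryStream(jpegBytes);        // ready for JpegBitmapDecoder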