Tags: iphone, ios, cocoa-touch, video-capture

How do I write captured frame into socket?


In my app, I am capturing media as frames using

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection 

delegate method. I have recorded the frames to a file and played it back using MPMoviePlayerController. Now, instead of writing the buffers to a file, I want to write them to a socket and send them to a server. What changes do I need to make?

Thanks for your help.


Solution

  • It seems your question contains two parts:

    1. To capture frame by frame, use this snippet from Apple (put it inside the captureOutput: method you mentioned):

      // Note: this assumes the capture output's videoSettings request
      // kCVPixelFormatType_32BGRA frames, which matches the bitmap info below.
      CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
      CVPixelBufferLockBaseAddress(imageBuffer, 0);
      uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
      size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
      size_t width = CVPixelBufferGetWidth(imageBuffer);
      size_t height = CVPixelBufferGetHeight(imageBuffer);
      
      CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
      CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
      CGImageRef newImage = CGBitmapContextCreateImage(newContext);
      
      CGContextRelease(newContext);
      CGColorSpaceRelease(colorSpace);
      
      //   UIImage *image = [UIImage imageWithCGImage:newImage];
      //   self.imgData = UIImageJPEGRepresentation(image, 1.0);  // convert to NSData to send
      CGImageRelease(newImage);
      CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
      
    2. For network socket communication, you can use CocoaAsyncSocket. If you don't need raw socket-level control, you can also use NSStream.
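
       As a minimal sketch of the NSStream route (the host, port, and the `outputStream` property are placeholder assumptions, not from your code), you could open a TCP connection to the server and write each JPEG-encoded frame with a length prefix so the server can split the byte stream back into frames:

       #import <Foundation/Foundation.h>
       #import <arpa/inet.h>   // for htonl

       - (void)openConnection {
           NSInputStream *inStream;
           NSOutputStream *outStream;
           // Placeholder address/port -- replace with your server's.
           [NSStream getStreamsToHostWithName:@"192.168.0.10"
                                         port:5000
                                  inputStream:&inStream
                                 outputStream:&outStream];
           self.outputStream = outStream;   // assumed strong property on your class
           [self.outputStream open];
       }

       - (void)sendFrameData:(NSData *)imgData {
           // 4-byte big-endian length prefix, then the JPEG bytes.
           uint32_t length = htonl((uint32_t)imgData.length);
           [self.outputStream write:(const uint8_t *)&length maxLength:sizeof(length)];

           NSUInteger written = 0;
           while (written < imgData.length) {
               NSInteger n = [self.outputStream write:(const uint8_t *)imgData.bytes + written
                                            maxLength:imgData.length - written];
               if (n <= 0) break;   // handle errors / reconnect in real code
               written += n;
           }
       }

       You would call `sendFrameData:` with the `imgData` produced in the snippet above. Be aware that writing from the capture callback can block the capture queue; in a real app you would buffer frames and write from a separate queue, and sending full JPEGs per frame is bandwidth-heavy compared to a proper video encoder.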