Tags: android, android-ndk, streaming, stagefright

stagefright CameraSource::read returns small frames (20 bytes) on Samsung Galaxy S2


I am trying to grab the frames captured by the camera, encode them, and finally send them over RTP/RTSP.

To do the capturing I am using the CameraSource class of stagefright. The preview on the screen (a Surface passed from Java) works fine, but when I try to read the frames out I get buffers of only 20 bytes.

What am I doing wrong?

Size videoSize;
videoSize.width = 352;
videoSize.height = 288;
// Args: camera, recording proxy, camera id, video size, frame rate,
// preview surface, store-metadata-in-video-buffers flag.
sp<CameraSource> myCamera = CameraSource::CreateFromCamera(NULL, NULL, 
                             1 /*front camera*/, videoSize, 25, mySurface,
                             true /*storeMetaDataInVideoBuffers*/);
myCamera->start();

// The following runs on a dedicated reader thread.
status_t err = OK;
MediaBuffer* pBuffer;
while ((err = myCamera->read(&pBuffer)) == OK)
{
    // if not getting a valid buffer from source, then exit
    if (pBuffer == NULL)
    {
        return;
    }
    else
    {
        LOGD("The Size of the returned buffer is: %d", pBuffer->size() );
    }
    pBuffer->release();
    pBuffer = NULL;
}

Solution

  • You are doing everything correctly, but Samsung chose not to support the route you are trying to take. On the Galaxy S2 (and many other Samsung devices) the only way to use CameraSource is to connect it directly to the hardware encoder, as sketched below.
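
    "Connecting CameraSource to the hardware encoder" means handing the camera source to OMXCodec::Create() as the encoder's input and reading encoded frames from the codec instead of from the camera. The sketch below roughly follows the path StagefrightRecorder takes internally; it assumes an ICS-era stagefright tree, and the bitrate and I-frame interval are placeholder values you would adapt to your build.

    #include <media/stagefright/OMXClient.h>
    #include <media/stagefright/OMXCodec.h>
    #include <media/stagefright/MetaData.h>
    #include <media/stagefright/MediaDefs.h>
    #include <media/stagefright/MediaBuffer.h>

    OMXClient client;
    client.connect();

    // Describe the encoded stream we want from the hardware encoder.
    sp<MetaData> encMeta = new MetaData;
    encMeta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
    encMeta->setInt32(kKeyWidth, 352);
    encMeta->setInt32(kKeyHeight, 288);
    encMeta->setInt32(kKeyFrameRate, 25);
    encMeta->setInt32(kKeyBitRate, 500000);     // assumption: choose a bitrate for your link
    encMeta->setInt32(kKeyIFramesInterval, 1);  // one I-frame per second

    // Copy the input layout the camera actually delivers, as StagefrightRecorder does.
    int32_t colorFormat, stride, sliceHeight;
    sp<MetaData> camMeta = myCamera->getFormat();
    if (camMeta->findInt32(kKeyColorFormat, &colorFormat)) {
        encMeta->setInt32(kKeyColorFormat, colorFormat);
    }
    if (camMeta->findInt32(kKeyStride, &stride)) {
        encMeta->setInt32(kKeyStride, stride);
    }
    if (camMeta->findInt32(kKeySliceHeight, &sliceHeight)) {
        encMeta->setInt32(kKeySliceHeight, sliceHeight);
    }

    // Because CreateFromCamera was called with storeMetaDataInVideoBuffers = true,
    // tell the codec that the camera buffers carry metadata handles, not raw YUV.
    uint32_t encoderFlags = OMXCodec::kStoreMetaDataInVideoBuffers;

    // The CameraSource becomes the encoder's input; OMXCodec pulls frames from it internally.
    sp<MediaSource> encoder = OMXCodec::Create(
            client.interface(), encMeta, true /*createEncoder*/, myCamera,
            NULL /*matchComponentName*/, encoderFlags);

    encoder->start();

    MediaBuffer* outBuffer = NULL;
    while (encoder->read(&outBuffer) == OK)
    {
        if (outBuffer == NULL)
        {
            break;
        }
        // outBuffer holds an encoded access unit, ready to be packetized for RTP.
        LOGD("Encoded buffer length: %d", (int) outBuffer->range_length());
        outBuffer->release();
        outBuffer = NULL;
    }

    encoder->stop();
    client.disconnect();

    Each buffer read from the encoder is an encoded access unit that you can packetize and send over RTP/RTSP, so you never read raw frames from CameraSource yourself.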