Tags: android, image-processing, yuv, openframeworks

VideoGrabber OF_PIXELS_RGB rendering incorrectly on Android


Why, when I draw an ofImage from an ofxAndroidVideoGrabber that is set to OF_PIXELS_RGB, does it look like this? [screenshot of the incorrectly colored camera output]

Here is my code:

void ofApp::setup(){
    ofBackground(0,0,0);

    grabber.setPixelFormat(OF_PIXELS_RGB);

    // Start the grabber
    grabber.setup(640,480);

    // Get the native android video grabber
    ofxAndroidVideoGrabber* androidGrabber = (ofxAndroidVideoGrabber*)grabber.getGrabber().get();

    // Ensure facing the correct direction
    androidGrabber->setDeviceID(androidGrabber->getBackCamera());
}

void ofApp::update(){
    grabber.update();
    if(grabber.isFrameNew()){
        // Add frame
        ofPixels nextFrame = grabber.getPixels();
        frameQueue.push(nextFrame);
    }
}

void ofApp::draw(){    
    // Calculate aspect ratio of grabber image
    float grabberAspectRatio = grabber.getWidth() / grabber.getHeight();

    // Draw camera image centered in the window
    ofPushMatrix();
    ofSetHexColor(0xFFFFFF);
    ofSetRectMode(OF_RECTMODE_CENTER);

    ofTranslate(ofGetWidth() / 2, ofGetHeight() / 2);

    // Initial check that the next frame should be rendered
    if ( ! frameQueue.empty() ) {
        ofPixels nextFrame = frameQueue.front();
        grabberImage.setFromPixels(nextFrame);
        grabberImage.mirror(false, true);
        grabberImage.update();

        // Advance
        frameQueue.pop();
    }


    if ( grabberImage.isAllocated() ) {
        if(ofGetWidth() > ofGetHeight()) {
            grabberImage.draw(0, 0, ofGetHeight() * grabberAspectRatio, ofGetHeight());
        } else {
            grabberImage.draw(0, 0, ofGetWidth(), ofGetWidth() * 1.0/grabberAspectRatio);
        }
    }

    ofPopMatrix();
    ofSetRectMode(OF_RECTMODE_CORNER);
}
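
For reference, the members used above aren't shown in the question; a minimal ofApp.h presumably looks something like this (a sketch, the exact header is an assumption):

#include "ofMain.h"
#include "ofxAndroid.h"
#include <queue>

class ofApp : public ofxAndroidApp {
public:
    void setup();
    void update();
    void draw();

    ofVideoGrabber grabber;          // camera source
    ofImage grabberImage;            // image rebuilt from queued frames
    std::queue<ofPixels> frameQueue; // frames waiting to be drawn
};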

To my knowledge, the video comes in as NV21 and is converted to RGB by openFrameworks. My best guess is that either:

  1. I'm doing something wrong that skips that conversion
  2. There is a bug in oF.

I've tried the provided camera example and I get the same effect.
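
For context on what should be happening under the hood, here is a minimal sketch of an NV21-to-RGB conversion (illustrative only, not the actual openFrameworks code):

#include <cstdint>
#include <algorithm>

static inline uint8_t clampByte(int v) {
    return (uint8_t)std::min(255, std::max(0, v));
}

// src: NV21 buffer = Y plane (w*h bytes) followed by an interleaved V/U plane (w*h/2 bytes)
// dst: RGB buffer, w*h*3 bytes
void nv21ToRgb(const uint8_t* src, uint8_t* dst, int w, int h) {
    const uint8_t* yPlane  = src;
    const uint8_t* vuPlane = src + w * h;  // NV21: V and U interleaved, V first

    for (int row = 0; row < h; ++row) {
        for (int col = 0; col < w; ++col) {
            int y = yPlane[row * w + col];
            // one V/U pair is shared by each 2x2 block of pixels
            int vuIndex = (row / 2) * w + (col / 2) * 2;
            int v = vuPlane[vuIndex]     - 128;
            int u = vuPlane[vuIndex + 1] - 128;

            // standard BT.601-style YUV -> RGB conversion
            int r = y + (int)(1.402f * v);
            int g = y - (int)(0.344f * u + 0.714f * v);
            int b = y + (int)(1.772f * u);

            uint8_t* out = dst + (row * w + col) * 3;
            out[0] = clampByte(r);
            out[1] = clampByte(g);
            out[2] = clampByte(b);
        }
    }
}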


Solution

  • It turns out that my emulator used a different format than most Android devices for its video feed.

    The underlying problem was that openFrameworks assumed the U and V samples for each pixel sit next to each other, i.e. that the frame is semiplanar. But looking at the Android source, I found that a YUV image can be either YUV420Planar or YUV420SemiPlanar. If the image is YUV420Planar, the V plane follows the entire U plane, so the index of the first V sample is startIndexU + (width * height) / 4 (in 4:2:0 each chroma plane is a quarter of the image size). Most Android devices I have tested are semiplanar, so if you see output like mine, it's probably just that your emulator is giving you a planar feed instead of a semiplanar one.
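
    To make the layout difference concrete, here is a hedged sketch of how the chroma sample indices differ between the two layouts (illustrative only, not the openFrameworks decoder):

    #include <cstdint>

    struct ChromaOffsets { int uIndex; int vIndex; };

    // row/col are pixel coordinates; w/h are the image dimensions.
    ChromaOffsets chromaIndex(bool semiPlanar, int row, int col, int w, int h) {
        int chromaBase = w * h;          // chroma data starts after the Y plane
        int chromaRow  = row / 2;        // one chroma sample per 2x2 pixel block
        int chromaCol  = col / 2;

        if (semiPlanar) {
            // YUV420SemiPlanar (NV12/NV21): U and V interleaved after the Y plane
            int pair = chromaBase + chromaRow * w + chromaCol * 2;
            return { pair, pair + 1 };   // swap the two for NV21, where V comes first
        } else {
            // YUV420Planar (I420): the full U plane, then the full V plane
            int planeSize = (w / 2) * (h / 2);
            int offset    = chromaRow * (w / 2) + chromaCol;
            return { chromaBase + offset, chromaBase + planeSize + offset };
        }
    }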