
Export OpenGL ES video


Xcode has the ability to capture OpenGL ES frames from the iPad, and that's great! I would like to extend this functionality and capture an entire OpenGL ES movie of my application. Is there a way to do that? If it isn't possible using Xcode, how can I do it without much effort or big changes to my code? Thank you very much!


Solution

  • I use a very simple technique which requires just a few lines of code.

    You can capture each OpenGL frame into a UIImage with this method:

    // CGDataProvider release callback: frees the pixel buffer once the
    // CGImage no longer needs it (passed to CGDataProviderCreateWithData below).
    static void releaseScreenshotData(void *info, const void *data, size_t size) {
        free((void *)data);
    }

    - (UIImage*)captureScreen {
    
        NSInteger dataLength = framebufferWidth * framebufferHeight * 4;
    
        // Allocate array.
        GLuint *buffer = (GLuint *) malloc(dataLength);
        GLuint *resultsBuffer = (GLuint *)malloc(dataLength);
        // Read data
        glReadPixels(0, 0, framebufferWidth, framebufferHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    
        // Flip vertical
        for(int y = 0; y < framebufferHeight; y++) {
            for(int x = 0; x < framebufferWidth; x++) {
                resultsBuffer[x + y * framebufferWidth] = buffer[x + (framebufferHeight - 1 - y) * framebufferWidth];
            }
        }
    
        free(buffer);
    
        // make data provider with data.
        CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, resultsBuffer, dataLength, releaseScreenshotData);
    
        // prep the ingredients
        const int bitsPerComponent = 8;
        const int bitsPerPixel = 4 * bitsPerComponent;
        const int bytesPerRow = 4 * framebufferWidth;
        CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
        CGBitmapInfo bitmapInfo = kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast; // matches the RGBA byte order from glReadPixels
        CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
    
        // make the cgimage
        CGImageRef imageRef = CGImageCreate(framebufferWidth, framebufferHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
        CGColorSpaceRelease(colorSpaceRef);
        CGDataProviderRelease(provider);
    
        // then make the UIImage from that
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
    
        return image;
    }
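
    glReadPixels returns rows bottom-up, while CGImage expects them top-down, which is why the method copies each source row into a mirrored destination row. A minimal C sketch of that row flip, using a hypothetical 2x2 image with one value per pixel, looks like this:

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Flip an RGBA image vertically: row y of the source becomes
     * row (height - 1 - y) of the destination. Each uint32_t holds
     * one packed RGBA pixel, like the GLuint buffer in captureScreen. */
    static void flip_vertical(const uint32_t *src, uint32_t *dst,
                              int width, int height) {
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                dst[x + y * width] = src[x + (height - 1 - y) * width];
            }
        }
    }

    int main(void) {
        /* OpenGL's bottom-up layout: row 0 is the bottom row {3, 4},
         * row 1 is the top row {1, 2}. */
        uint32_t gl_pixels[4] = {3, 4, 1, 2};
        uint32_t flipped[4];
        flip_vertical(gl_pixels, flipped, 2, 2);
        /* After the flip the layout is top-down: {1, 2, 3, 4}. */
        assert(flipped[0] == 1 && flipped[1] == 2);
        assert(flipped[2] == 3 && flipped[3] == 4);
        return 0;
    }
    ```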
    

    Then you will capture each frame in your main loop:

    - (void)onTimer {
    
        // Compute and render new frame
        [self update];
    
        // Recording
        if (recordingMode == RecordingModeMovie) {
    
            recordingFrameNum++;
    
            // Save frame
            UIImage *image = [self captureScreen];
            NSString *fileName = [NSString stringWithFormat:@"%05d.jpg", (int)recordingFrameNum]; // zero-padded so the frames sort in order
            [UIImageJPEGRepresentation(image, 1.0) writeToFile:[basePath stringByAppendingPathComponent:fileName] atomically:NO];
        }
    }
    

    At the end you will have a large set of JPEG files, which can easily be converted into a movie with Time Lapse Assembler.

    If you want a smooth 30 FPS movie, hard-code your simulation time step to 1/30.0 s per frame instead of using the real elapsed time.
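
    The point of the fixed step is that each recorded frame advances the simulation by exactly 1/30 s no matter how long the capture and JPEG write take, so the assembled movie plays back at real-time speed. A sketch in C (the `advance` helper and frame count are illustrative, not part of the original code):

    ```c
    #include <assert.h>

    #define RECORDING_FPS 30.0

    /* Advance the simulated clock by a fixed step instead of measured
     * wall time, so N recorded frames always span exactly N / 30 seconds. */
    static double advance(double simulated_time) {
        return simulated_time + 1.0 / RECORDING_FPS;
    }

    int main(void) {
        double t = 0.0;
        /* Record 90 frames: exactly 3 seconds of movie time,
         * even if saving each frame takes much longer. */
        for (int frame = 0; frame < 90; frame++) {
            t = advance(t);
            /* [self update]; [self captureScreen]; ... */
        }
        assert(t > 2.999 && t < 3.001);
        return 0;
    }
    ```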