Tags: ios, objective-c, ipad, opengl-es, ios6

Taking a screenshot of a third-party CCGLView


I have a third-party game object in my iOS 6 app whose implementation I have no access to, and now I need to take a screenshot of its view.

The game object has a public method that allows me to add it to my view:

-initWithParentView:andController:

and I managed to grab the view with:

 self.gameview = [self.myView.subviews firstObject]; //there is only one subview
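
Since that cast is unchecked, a slightly more defensive version might look like this (a sketch; it only assumes what the question states, namely that the subview is a cocos2d `CCGLView`):

    UIView *candidate = [self.myView.subviews firstObject]; // the game adds exactly one subview
    if ([candidate isKindOfClass:[CCGLView class]]) {
        self.gameview = (CCGLView *)candidate;              // the game's OpenGL view
    } else {
        NSLog(@"Expected a CCGLView, got %@", [candidate class]);
    }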

The self.gameview is of type CCGLView, so I extended CCGLView to add the snapshot method suggested in Apple's Technical Q&A QA1704 (http://nathanmock.com/files/com.apple.adc.documentation.AppleiOS6.0.iOSLibrary.docset/Contents/Resources/Documents/#qa/qa1704/_index.html):

- (UIImage *)snapshot
{
    // Get the size of the backing CAEAGLLayer
    GLint backingWidth, backingHeight;
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    NSInteger x = 0;
    NSInteger y = 0;
    NSInteger width = backingWidth;
    NSInteger height = backingHeight;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    // Create a CGImage with the pixel data
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel;
    // otherwise, use kCGImageAlphaPremultipliedLast
    CGDataProviderRef ref      = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref            = CGImageCreate(width, height, 8, 32, width * 4,
                                               colorspace,
                                               kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast,
                                               ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS
    // Create a graphics context with the target size measured in POINTS
    NSInteger widthInPoints;
    NSInteger heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions) {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
        // Set the scale parameter to your OpenGL ES view's contentScaleFactor
        // so that you get a high-resolution snapshot when its value is greater than 1.0
        CGFloat scale  = self.contentScaleFactor;
        widthInPoints  = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    }
    else {
        // On iOS prior to 4, fall back to UIGraphicsBeginImageContext
        widthInPoints  = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // The UIKit coordinate system is upside down relative to the GL/Quartz coordinate system
    // Flip the CGImage by rendering it to the flipped bitmap context
    // The size of the destination area is measured in POINTS
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();

    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    return image;
}

And I also handle the framebuffer as demonstrated in Apple's EAGLView sample:

- (void)setFramebuffer
{
    if (self.context)
    {
        [EAGLContext setCurrentContext:self.context];

        if (!defaultFramebuffer)
            [self createFramebuffer];

        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);

        glViewport(0, 0, framebufferWidth, framebufferHeight);
    }
}

- (void)createFramebuffer
{
    if (self.context && !defaultFramebuffer)
    {
        [EAGLContext setCurrentContext:self.context];

        // Create default framebuffer object.
        glGenFramebuffers(1, &defaultFramebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);

        GLenum result = glCheckFramebufferStatus(GL_FRAMEBUFFER);
        NSLog(@"%u", result); // returns 36055 (GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT),
                              // which is expected here: nothing has been attached yet

        // Create color render buffer and allocate backing store.
        glGenRenderbuffers(1, &colorRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);

        CAEAGLLayer *glLayer = (CAEAGLLayer *)self.layer;
        // drawableProperties maps property names (keys) to values
        glLayer.drawableProperties = @{kEAGLDrawablePropertyRetainedBacking: @NO,
                                       kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8};
        glLayer.opaque = YES;

        [self.context renderbufferStorage:GL_RENDERBUFFER fromDrawable:glLayer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &framebufferWidth);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &framebufferHeight);

        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }
}

- (BOOL)presentFramebuffer
{
    BOOL success = NO;

    if (self.context)
    {
        [EAGLContext setCurrentContext:self.context];

        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);

        success = [self.context presentRenderbuffer:GL_RENDERBUFFER];
    }

    return success;
}

When it's all set, I do the following to call my snapshot function:

[self.gameview setFramebuffer];
UIImage *image = [self.gameview snapshot];

[self.gameview presentFramebuffer];

return image;

and all I get is a white image.

I am particularly doubtful about the createFramebuffer method, because I am not initializing the OpenGL view myself. I tried to grab the view's existing framebuffer instead, but had no luck with: glGetIntegerv(GL_FRAMEBUFFER, &defaultFramebuffer);
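
(As an aside, GL_FRAMEBUFFER is a bind target, not a queryable state name, so that call can never succeed; the bound object names are queried with GL_FRAMEBUFFER_BINDING and GL_RENDERBUFFER_BINDING. A sketch, assuming the game's EAGL context is current when it runs:

    // Query the framebuffer/renderbuffer the game has already bound,
    // instead of creating new, empty ones.
    GLint boundFramebuffer = 0, boundRenderbuffer = 0;
    glGetIntegerv(GL_FRAMEBUFFER_BINDING, &boundFramebuffer);
    glGetIntegerv(GL_RENDERBUFFER_BINDING, &boundRenderbuffer);
    defaultFramebuffer = (GLuint)boundFramebuffer;
    colorRenderbuffer  = (GLuint)boundRenderbuffer;

)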

One thing worth mentioning: I did manage to capture part of the screen whenever I drew something with glDrawArrays before calling the snapshot function. I couldn't make it work completely, though, no matter how I altered the array values (I am not very familiar with OpenGL).

Would appreciate it if anyone can point out anything I have missed here.


Solution

  • I found the solution. It turned out I had overcomplicated the whole thing with OpenGL. The secret is cocos2d's CCRenderTexture.

    #import "CCRenderTexture.h"
    #import "CCScene.h"
    
    CCScene *scene = [[CCDirector sharedDirector] runningScene];
    CCNode *n = [scene.children objectAtIndex:0];
    
    [CCDirector sharedDirector].nextDeltaTimeZero = YES;
    CGSize winSize = [CCDirector sharedDirector].winSize;
    CCRenderTexture *rtx = [CCRenderTexture renderTextureWithWidth:winSize.width height:winSize.height];
    [rtx begin];
    [n visit];
    [rtx end];
    
    UIImage *image = [rtx getUIImage];
    
    return image;
    

    The code works perfectly on iOS 6 and iOS 7.
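
    The same idea can be packaged as a helper method. This is a sketch, not the answer's exact code: the -snapshotOfRunningScene name is mine, it assumes cocos2d 2.x (where CCRenderTexture and -getUIImage exist), it visits the whole running scene rather than only its first child, and it must run on the thread that owns the GL context.

        // Hypothetical helper wrapping the CCRenderTexture approach above.
        - (UIImage *)snapshotOfRunningScene
        {
            CCDirector *director = [CCDirector sharedDirector];
            CCNode *scene = [director runningScene];

            director.nextDeltaTimeZero = YES;  // don't advance animations during the extra visit

            CGSize winSize = director.winSize; // scene size in points
            CCRenderTexture *rtx = [CCRenderTexture renderTextureWithWidth:winSize.width
                                                                    height:winSize.height];
            [rtx begin];
            [scene visit];                     // re-render the scene into the offscreen texture
            [rtx end];

            return [rtx getUIImage];           // convert the texture's contents to a UIImage
        }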