
Creating Render-to-Texture Secondary Framebuffer Objects on iOS Using OpenGL ES 2

I’ve come to greatly respect OpenGL and what it stands for, as I discussed in my prior post.  OpenGL recently came onto our radar for a project at work, at a time when I was dabbling with it at home.  These parallel lines of development brought excitement and mutual benefit to both.  The project I am working on at work depends on what is inherently gaming technology, even though you wouldn’t think so to look at it.

I needed the ability to render the scene to a secondary framebuffer object (or FBO) because of the way the application incrementally draws a picture over time.  Each successive scene draws only the remaining portions, which are merged with the fragments already rendered in the framebuffer.  Think of the way Google Earth progressively fills in its imagery.

Anyway, the code worked fine on OpenGL 3.3, but for some reason the FBO failed the completeness test on OpenGL ES 2 under iOS.  After much banging of head against wall, I found the reason: I was creating the secondary FBO during GLKViewController’s viewDidLoad call.  This puzzled me, as I already had an OpenGL context at that point thanks to:

// We want OpenGL ES 2 thank-you very much
self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

It seems creating a secondary framebuffer will fail if its size differs from the main framebuffer’s size.  The best way to create secondary framebuffers is to wait for the viewWillLayoutSubviews notification.  By that point your GLKit view has already been resized, so all you need to do is query the new size and create an appropriately sized FBO.
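
As a minimal sketch of that timing (assuming a GLKViewController subclass whose view is a custom GLKView subclass; MyGLView and createOffscreenFramebufferWithSize: are placeholder names rather than the original ones):

// Controller side: by viewWillLayoutSubviews the GLKView has its final bounds,
// so an offscreen FBO created here will match the main framebuffer's size.
- (void)viewWillLayoutSubviews
{
    [super viewWillLayoutSubviews];

    MyGLView *view = (MyGLView *)self.view;
    [EAGLContext setCurrentContext:self.context];

    // Work in pixels, not points, so account for the Retina scale factor.
    CGFloat scale = view.contentScaleFactor;
    CGSize fboSize = CGSizeMake(view.bounds.size.width * scale,
                                view.bounds.size.height * scale);

    // Note this can fire more than once (e.g. on rotation), so the view should
    // tear down any existing offscreen FBO before creating a new one.
    [view createOffscreenFramebufferWithSize:fboSize];
}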

The above controller calls into the view to create the framebuffer.
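
On the view side (a GLKView subclass), that boils down to something like the following sketch; the GLuint ivars _offscreenFramebuffer and _offscreenTexture, like the method name, are placeholders rather than the original identifiers:

// Create a texture-backed FBO at the given pixel size and verify completeness -
// the very check that fails when this runs too early, back in viewDidLoad.
- (BOOL)createOffscreenFramebufferWithSize:(CGSize)size
{
    // Colour texture the offscreen pass renders into.
    glGenTextures(1, &_offscreenTexture);
    glBindTexture(GL_TEXTURE_2D, _offscreenTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 (GLsizei)size.width, (GLsizei)size.height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // Secondary FBO with that texture as its colour attachment.
    glGenFramebuffers(1, &_offscreenFramebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, _offscreenFramebuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, _offscreenTexture, 0);

    GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
    if (status != GL_FRAMEBUFFER_COMPLETE)
    {
        NSLog(@"Offscreen FBO incomplete: 0x%x", status);
        return NO;
    }

    // Hand the drawable framebuffer back to GLKit.
    [self bindDrawable];
    return YES;
}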

To draw, you must:

  1. Have the controller call into your view
  2. Bind to your secondary FBO
  3. Draw
  4. Return to the controller
  5. Have the controller rebind to the main FBO
  6. Call into your view again to draw the offscreen texture
The final step is to draw the screen quad (steps 5-6 above), as sketched below.
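
Putting those steps together, the per-frame draw looks roughly like this (again, drawSceneToOffscreenFramebuffer and drawScreenQuad are placeholder names for the view’s methods):

// Controller side: GLKit calls this once per frame.
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    MyGLView *glView = (MyGLView *)view;

    // Steps 1-4: the view binds the secondary FBO and renders the latest
    // incremental portion of the scene into the offscreen texture.
    [glView drawSceneToOffscreenFramebuffer];

    // Step 5: rebind GLKit's main framebuffer. On iOS this must be
    // bindDrawable, not glBindFramebuffer(GL_FRAMEBUFFER, 0).
    [glView bindDrawable];

    // Step 6: draw a full-screen quad textured with the offscreen result.
    [glView drawScreenQuad];
}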

You might be wondering what all the bindDrawable business is about.  Normally in OpenGL you would glBindFramebuffer(…) to your framebuffer and then glBindFramebuffer(…, 0) to reset to the default framebuffer.  That won’t work on OpenGL ES under GLKit (well, perhaps it could if you knew the actual handle of GLKit’s framebuffer).  You must use GLKView’s bindDrawable instead.

Here are some books that helped me greatly:

OpenGL® SuperBible: Comprehensive Tutorial and Reference (4th Edition)

OpenGL® ES 2.0 Programming Guide

  1. 2013/12/31 at 12:48 pm

    I’m so glad you posted this information. I’ve been trying to figure out what was going on for days and was in the process of opening a case with Apple to get some help resolving what was happening to me. I couldn’t figure it out, but now it all makes sense. I couldn’t find anything online, including Apple’s online info, Stack Overflow, etc.
    Thanks for taking the time to share!

    • MickyD
      2013/12/31 at 12:52 pm

      Thanks, anytime. Enjoy the rest of OpenGL :)

  2. leeoxiang
    2014/07/10 at 4:35 pm

    Hi MickyD,

    I am working on a GLView recording app; I need to get the texture data and record it to a video.

    Do I need to create an offscreen framebuffer, draw into it, and then get the texture?

    And do I need to draw the same thing with the main framebuffer to get the texture?

    What I know is that I can bind a texture to the main framebuffer and then get the texture – is that correct?

    I am new to OpenGL ES; can you give me more detailed information? I would be thankful for that.

    • MickyD
      2014/07/10 at 6:36 pm

      Hi Lee’, from memory you just create an offscreen buffer, bind it, and render to that; you don’t need to render to the main buffer in order to read the texture. This is because on the iPad there is no difference between RAM and GPU RAM – it’s unified, which is rather nice. It’s fast too; normally doing this sort of thing on the desktop would be incredibly slow.
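
      Roughly something like this, off the top of my head and untested (the variable names are just placeholders):

      // Render into the offscreen FBO, then read the pixels straight back;
      // no need to touch the main framebuffer first.
      glBindFramebuffer(GL_FRAMEBUFFER, offscreenFramebuffer);
      // ... draw the frame you want to capture ...

      GLubyte *pixels = (GLubyte *)malloc((size_t)width * height * 4);
      glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
      // hand 'pixels' off to your video writer (e.g. AVAssetWriter), then:
      free(pixels);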

    • MickyD
      2014/07/11 at 2:41 pm

      I gave you an upvote :). I put some pointers there too; not sure how useful they are, as I don’t know much about the video APIs.

  3. 2015/02/11 at 10:53 pm

    Hi Micky,

    Can you please tell us the values of
    glVertexAttribPointer (ATTRIBUTE_BASIC_POSITION, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(GLfloat), _screenQuadVertices );
    glVertexAttribPointer (ATTRIBUTE_BASIC_TEX_COORD, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(GLfloat), &_screenQuadVertices[3] );
    glEnableVertexAttribArray (ATTRIBUTE_BASIC_POSITION);
    glEnableVertexAttribArray (ATTRIBUTE_BASIC_TEX_COORD);

    for a full screen texture on the iPhone? I am sorry, but I am very new to this. I have created an app in which I am doing exactly the same; the offscreen framebuffer is drawing perfectly (I saved it as an image), but the same texture is not getting rendered. I can send you the code as well.
