Drawing to Memory

Hi Folks,

Bear with me on this one; it’s been a great source of frustration for me.

I’ve spent some time searching for how-tos on building an image with OpenGL and drawing it to memory. Just about everything I come across, however, uses Windows-based syntax and/or other third-party libraries (e.g. GLUT). What I’m trying to achieve is drawing an image to memory using standard OpenGL, for the purpose of then displaying the image in a window via a bitmap. The software I’m writing is a plugin for a well-known animation package which, generally speaking, takes the form of a dialog window. The plugin is well underway.

Below is some example code of how I’m trying to go about this. It doesn’t work, but it will at least show the general steps I’m attempting. The displayed bitmap I get (which is ‘built’ inside the for loop, line by line) is just filled with vertical lines (oddly enough, always on the right-hand half of the image). I’m not asking anyone to do the work for me, but if anyone in the know can point me in the right direction here, it would be very helpful.


void My_DrawClass::OpenGLDraw(void)
{
    // Notes: Ww and Hh are class-level variables

    GLuint MyGL;
    glViewport((GLsizei)0,(GLsizei)0,(GLsizei)Ww,(GLsizei)Hh);
    glClearColor(0.0,1.0,0.0,1.0);
    glClear(GL_COLOR_BUFFER_BIT);
    UChar *Viewport_Line = GeAllocType(UChar,Ww * 4);    // SDK object
    glDrawPixels(Ww,Hh,GL_RGBA,GL_FLOAT,Viewport_Line);
    glReadBuffer(GL_BACK);

    for(int h = 0; h <= Hh-1;)
    {		
        glReadPixels(Xx,Yy,0,h,GL_RGBA,GL_FLOAT,Viewport_Line);
        MyBitmap->SetPixelData(0,h,Ww,Viewport_Line,GL_RGBA);
        h++;
    }

    FreeUChar(Viewport_Line);
}

I would seek support from the software company itself, but they won’t provide any. Will say no more on that. Any help appreciated. Cheers,

WP.

What are you trying to do there?
I’d suggest reading
http://wiki.delphigl.com/index.php/glReadPixels
and
https://www.opengl.org/sdk/docs/man2/xhtml/glDrawPixels.xml
That code does not make sense at all; it’s hard to guess what “the point” is.

Hi hlewin, thanks for taking a look.

I’m not able to draw directly into the dialog window as far as I’m aware, so I’m trying to use OGL independently of the plugin and SDK stuff by setting up a function that draws to memory, then taking that in-memory result and transferring it to one of the SDK’s bitmap objects. That’s what I’ve tried to show in the example above (though I probably haven’t done it very well - it was late…). So the process I’m trying to follow is: create an OpenGL context, draw into it, then go over the OGL pixel data and transfer it line by line to a bitmap that I can display in the dialog window. The displaying and drawing of the SDK bitmap object I can do already; it’s the setting up, drawing into, and retrieving of the OGL pixel data that I can’t get working.
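To illustrate the transfer step I have in mind, here’s a rough sketch of converting one RGBA float scanline (what glReadPixels with GL_FLOAT would hand back) into the 8-bit-per-channel data a typical 32bpp bitmap wants. The function name and the clamp-and-scale are just my own guesses at what the SDK expects:

```cpp
#include <cstdint>

// Hypothetical helper (name made up): convert one RGBA float
// scanline, as glReadPixels(..., GL_RGBA, GL_FLOAT, ...) returns
// it, to 8-bit-per-channel RGBA for a 32bpp bitmap. Values are
// clamped to [0,1] and scaled to [0,255].
void FloatLineToBytes(const float* src, uint8_t* dst, int width)
{
    for (int i = 0; i < width * 4; ++i) {  // 4 components per pixel
        float c = src[i];
        if (c < 0.0f) c = 0.0f;
        if (c > 1.0f) c = 1.0f;
        dst[i] = static_cast<uint8_t>(c * 255.0f + 0.5f);
    }
}
```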

Of course if there’s a better way I’m all ears - but I’m not sure how else I can get an OGL display otherwise. Hope that makes a bit more sense. Cheers,

WP.

Yes, that is the part I understood. But your code:


glViewport((GLsizei)0,(GLsizei)0,(GLsizei)Ww,(GLsizei)Hh);
glClearColor(0.0,1.0,0.0,1.0);
glClear(GL_COLOR_BUFFER_BIT);

That is setup and looks okay.


UChar *Viewport_Line = GeAllocType(UChar,Ww * 4);

You are trying to allocate memory for one line of pixels, I guess. That makes Ww*4*4 bytes, as you’re using floats later on: that is 4 bytes per color component, and each pixel has 4 components in RGBA.
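In numbers (just the arithmetic, nothing GL-specific):

```cpp
#include <cstddef>

// Bytes needed for one scanline of Ww RGBA pixels, depending on
// the type passed to glReadPixels / glDrawPixels:
size_t FloatRowBytes(size_t Ww) { return Ww * 4 * 4; } // GL_FLOAT: 4 bytes per component
size_t ByteRowBytes(size_t Ww)  { return Ww * 4; }     // GL_UNSIGNED_BYTE: 1 byte per component
```

For a 640-pixel-wide viewport that is 10240 bytes with floats versus 2560 with unsigned bytes.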


glDrawPixels(Ww,Hh,GL_RGBA,GL_FLOAT,Viewport_Line);

That does not make sense. You are drawing uninitialized data to the screen and you are passing bad arguments. Viewport_Line is really just a line, so Hh must read 1 in such calls. But that does not solve the issue that Viewport_Line contains random data at that point.


glReadPixels(Xx,Yy,0,h,GL_RGBA,GL_FLOAT,Viewport_Line);

Then, after drawing the random data, you want to read it back in? Not that you’d get anything different as a result: copying into the screen buffer and then copying back to the pointer makes no sense. Assuming you did draw something somewhere without showing it, the arguments are bad again. If you are trying to read the back buffer line by line, Xx should most certainly read 0, Yy should read h, and the next parameters should be Ww followed by a 1.

Try a tutorial, I’d suggest.
Time for breakfast here.

Oh - and whether using floats makes sense at all is an interesting question here. Most bitmap implementations do NOT support 4*32-bit high-dynamic-range images. I’d check that in the docs.

Thanks hlewin, I’ll have to sit on this until tomorrow, but you’ve provided some useful thoughts to look into. I’ll also look at some tutorials again - I had been over a few that I thought related to this, but translating the code didn’t go so well. I probably need to do one from scratch, outside the plugin.

I have changed two sections of code as a result of one of your comments regarding the reading of OGL line data, and am instead sizing the buffer for the full resolution. Not entirely necessary, but:


// From:

UChar *Viewport_Line = GeAllocType(UChar,Ww * 4);

// To:

UChar *Viewport_Line = GeAllocType(UChar,Hh * Ww * 4 * 4);

And in the for loop, have adjusted:


// This:
for(int h = 0; h <= Hh-1;)
{		
    glReadPixels(Xx,Yy,0,h,GL_RGBA,GL_FLOAT,Viewport_Line);
    MyBitmap->SetPixelData(0,h,Ww,Viewport_Line,GL_RGBA);
    h++;
}

// To:
for(int h = 0; h <= Hh-1;)
{		
    MyBitmap->SetPixelData(0,h,Ww,&Viewport_Line[h * (Ww * 4 * 4)],GL_RGBA);
    h++;
}
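The idea being that a single full-frame read, something like glReadPixels(0, 0, Ww, Hh, GL_RGBA, GL_FLOAT, Viewport_Line), fills the buffer before the loop, and each SetPixelData call then picks out its scanline. The offset arithmetic, for what it’s worth:

```cpp
#include <cstddef>

// Byte offset of scanline h in a tightly packed full-frame
// GL_RGBA / GL_FLOAT buffer: 4 components * 4 bytes per pixel,
// so 16 bytes per pixel, hence the 4 * 4.
size_t ScanlineOffset(size_t h, size_t Ww)
{
    return h * Ww * 4 * 4;
}
```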

Re: floating-point bitmaps, I think we can allocate a 32-bit bitmap using the SDK objects. I did query this on the software company’s forum a few months ago, but got no response. Looking at the docs though, they mention we have a bit depth range of 1, 4, 8, 16, 24, 32, 64, 96 - so I nervously assumed I’d be covered.

Cheers,

WP.

You have to use the data type that


MyBitmap->SetPixelData(0,h,Ww,Viewport_Line,GL_RGBA);

expects in glReadPixels. As you are allocating UChars, this may be GL_UNSIGNED_BYTE. Are you sure you want GL_FLOAT and can pass float data to SetPixelData? Then you really need 4*4*Ww bytes per viewport line. If you need byte data there, you just need 4*Ww bytes per scanline, BUT have to change the glReadPixels call to use GL_UNSIGNED_BYTE. It depends on your bitmap class.

And not only this: you have to use a format the windowing system supports as well. I’m not sure one can blit bitmaps with anything more than 32bpp using the Windows GDI. The higher bit-depth formats are usually for special purposes and hence not supported very widely.

To put it another way: make sure what MyBitmap expects to be fed with when calling SetPixelData. If it expects 32bpp RGBA data, you must call glReadPixels(…, GL_RGBA, GL_UNSIGNED_BYTE, viewport_line); with those as the last parameters, and only need to allocate 4*Ww bytes for the viewport line.

Another thing: what are Xx and Yy? If you want a 1:1 copy of the back buffer in the bitmap, Xx should most certainly read 0 and Yy should be equal to h, as these are the parameters you use in SetPixelData on your bitmap.
Try


glReadPixels(0, h, Ww, 1, GL_RGBA, GL_UNSIGNED_BYTE, Viewport_Line);
MyBitmap->SetPixelData(0,h,Ww,Viewport_Line, GL_RGBA);

and DO only allocate Ww*4 bytes for Viewport_Line.
This seems most likely to be what you want.