ATI Bug with GL_EXT_texture_sRGB?

We’re using GL_EXT_texture_sRGB in a Photoshop plugin and seeing a very strange issue on an ATI 4670 system, which I can only think must be a bug…

We load a texture of modest size, requesting an internal format of GL_SRGB_ALPHA. For example, 800 x 700 pixels, which is well below the maximum OpenGL reports for GL_MAX_TEXTURE_SIZE (8192, i.e. 8192 x 8192).
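For reference, the upload is a plain glTexImage2D call; a minimal sketch (variable names are illustrative, not our actual code):

```c
/* Minimal sketch of the upload; "pixels" stands in for our 8-bit RGBA image data. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D,
             0,                /* mip level */
             GL_SRGB_ALPHA,    /* requested internal format (sRGB) */
             800, 700,         /* width, height -- well under the reported maximum */
             0,                /* border */
             GL_RGBA,          /* source data format */
             GL_UNSIGNED_BYTE, /* source data type */
             pixels);
```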

We’ve found that whenever the texture is larger than the window from which the device context was originally extracted, OpenGL gives us an internal format of GL_RGBA16F instead of GL_SRGB_ALPHA. That loses the sRGB processing and leaves our image looking too bright; with smaller textures we get GL_SRGB_ALPHA and the image is handled properly as sRGB. The current window or viewport size doesn’t matter; the cutoff stays at the initial window size.
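For what it’s worth, we see which format the driver actually chose by reading it back after the upload; a sketch of the check (assuming the texture is still bound):

```c
/* Query the internal format the driver actually allocated for level 0. */
GLint fmt = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt);
if (fmt != GL_SRGB_ALPHA && fmt != GL_SRGB8_ALPHA8) {
    /* On the ATI 4670 system with large textures this comes back as GL_RGBA16F. */
}
```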

We tried requesting format GL_SRGB8_ALPHA8, which looks to be identical in function to GL_SRGB_ALPHA, but the problem remains.

Additionally, when we run our software as a plugin to any editor other than Photoshop, the problem goes away. This made us think for a while that we must have a bad pointer or something, but we’ve checked everything out very thoroughly and are now confident that we do not.

The software works fine under the exact same conditions with an NVIDIA board.

Is specifying GL_SRGB_ALPHA for the desired internal format the best (only) way to get sRGB texture functionality?

As a generalized solution, in case this kind of bug exists in another form on another system, we’re looking at making some trial runs during initialization with glTexImage2D(GL_PROXY_TEXTURE_2D, …), reading back the internal format, and reducing the texture size until we get the format we expect. Does this sound reasonable?
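Roughly what we have in mind (a sketch only; the starting dimensions and the halving step are placeholders):

```c
/* Probe with GL_PROXY_TEXTURE_2D and shrink until the driver keeps the sRGB format.
   The starting size and the halving strategy here are placeholders. */
GLsizei w = 800, h = 700;
GLint fmt = 0;
for (;;) {
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_SRGB_ALPHA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_INTERNAL_FORMAT, &fmt);
    if (fmt == GL_SRGB_ALPHA || fmt == GL_SRGB8_ALPHA8)
        break;               /* sRGB format survives at this size */
    if (w <= 1 && h <= 1)
        break;               /* nothing worked; fall back to something else */
    if (w > 1) w /= 2;       /* shrink and try again */
    if (h > 1) h /= 2;
}
```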

This has me rather worried that the OpenGL realm is full of unexpected bugs in drivers for cards we don’t have, and we have no additional funding for our test lab (which has representative video cards from the major vendors but is by no means exhaustive). How do you handle ensuring your software runs properly on a wide variety of systems? Big beta test groups? Contracting well-funded labs for testing?

Thanks for any insights you’re willing to share.

-Noel

Additionally, when we run our software as a plugin to any editor other than Photoshop, the problem goes away.

Wow. Bugs in OpenGL implementations, I can understand. But unless Photoshop itself is creating that texture, the texture creation functions shouldn’t behave differently in different applications.

Unless the driver is recognizing Photoshop and using special code for it; that isn’t uncommon for such a popular application.

And what ATI driver version are you on?

There is indeed a glitch here. We do detect some internal textures used by Photoshop inside the driver, in order to improve output quality on 10-bit monitors, and change their internal format to GL_RGBA16F.

Unfortunately, it seems that your texture is being incorrectly detected by our driver :( We will tighten this “enhancement” in a future driver.

Thanks, guys. I appreciate the responses.

I’m on Catalyst 10.7 (and I verified it in 10.6 as well). My system and monitor are set to sRGB and I am set to 32-bit color (not sure whether that is the same as the “10-bit monitor” you mentioned).

If it helps, Pierre: we run as a plugin to Photoshop, and a plugin is a DLL, which essentially means we’re running as part of Photoshop, though we do the OpenGL rendering in a thread we create ourselves. If you’d like to contact me for a prototype that reproduces the problem, we can arrange something.

NCarboni@ProDigitalSoftware.com

I think we may be in uncommon territory, making OpenGL-based plugins, but I’m sure more will be doing so in the future.

For now we have a workaround: we break up the image and display it in small pieces (up to 8192 x 512 seems to de-trigger your “enhancement”).
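Roughly, the tiling looks like this (a simplified sketch: it assumes 8-bit RGBA pixels, an image no wider than 8192, and illustrative variable names; the real code also tracks where each strip is drawn):

```c
/* Split the image into full-width horizontal strips of at most 512 rows and
   upload each strip as its own GL_SRGB_ALPHA texture. */
enum { STRIP_H = 512 };
for (int y = 0; y < imageHeight; y += STRIP_H) {
    int h = (imageHeight - y < STRIP_H) ? (imageHeight - y) : STRIP_H;
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB_ALPHA, imageWidth, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE,
                 pixels + (size_t)y * imageWidth * 4);  /* start of this strip */
    /* ... remember tex and its y-range so it can be drawn in place ... */
}
```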

-Noel

Pierre, I have sent you the test software. Please let me know if you do not get the email.

-Noel