What is the expected result when using an unsigned texture with isampler2D in a shader?

I am writing an application to test isampler2D, thus have created an integer texture as:


GLubyte texData[32 * 32 * 4];   /* 32x32 RGBA texels, one byte per channel */
GLubyte *lpTex = texData;
int i, j;

for (j = 0; j < 32; j++) {
    for (i = 0; i < 32; i++) {
        if ((i ^ j) & 0x8) {
            lpTex[0] = 56;
            lpTex[1] = lpTex[2] = lpTex[3] = 0;
        } else {
            lpTex[0] = lpTex[1] = lpTex[2] = lpTex[3] = 127;
        }
        lpTex += 4;    /* advance to the next texel */
    }
}

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 32, 32, 0, GL_RGBA, GL_UNSIGNED_BYTE, texData);

Fragment shader:


#version 130

in  vec2 texcoord;
out ivec4 fragcolor;

uniform isampler2D basetexture;

void main(void)
{
	ivec4 texlookup = texture(basetexture, texcoord);
	fragcolor = texlookup;
}

And it’s giving the correct result.
Shouldn’t that give an error or invalid output, since I am using an unsigned texture with isampler2D? Or is it correct?
Can anyone please explain?

According to my interpretation, we should always use a signed integer texture with isampler2D and an unsigned integer texture with usampler2D.
Thanks.

The expected result is undefined behavior. Thus, it may “work”, but you cannot rely on it.

First, you need to create your texture with:


glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8UI, 32, 32, 0, GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, texData);

to create an integer texture; otherwise the 8-bit unsigned data is interpreted as normalized fixed point (0…1). The GL_UNSIGNED_BYTE enum simply tells glTexImage2D() what format you are passing the data in. It’s the GL_RGBA8UI enum that specifies the actual format the GPU will use.
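For completeness, here is a minimal sketch of the whole setup (tex is just a placeholder name, and texData is the buffer from your fill loop). Note that textures with integer internal formats are only complete with NEAREST filtering, so the filter state needs setting as well:


GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* integer internal format plus the *_INTEGER client-side format */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8UI, 32, 32, 0,
             GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, texData);

/* integer textures are incomplete with linear filtering, so force NEAREST */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);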

As long as the sampler variable and the variable you’re assigning the texture() return value to are both signed or both unsigned, the GLSL shader will compile. There is no GL_RGBA_INTEGER_UNSIGNED enum, so the above texture can either be accessed as a signed or unsigned texture. It’s up to you to match the texture with the correct sampler type.
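To make the “correct sampler type” part concrete, here is a sketch of the unsigned variant of your fragment shader (written out as a C string literal; the names are just illustrative). For a GL_RGBA8UI texture the matching sampler is usampler2D and texture() returns a uvec4:


static const char *fragSrc =
    "#version 130\n"
    "in  vec2 texcoord;\n"
    "out uvec4 fragcolor;\n"                /* unsigned output instead of ivec4 */
    "uniform usampler2D basetexture;\n"     /* unsigned sampler for GL_RGBA8UI */
    "void main(void)\n"
    "{\n"
    "    uvec4 texlookup = texture(basetexture, texcoord);\n"
    "    fragcolor = texlookup;\n"
    "}\n";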

First, you need to create your texture with

He knows that. He knows he’s created a texture that uses normalized integers. He’s asking why his code seems to be working despite this. Also:

There is no GL_RGBA_INTEGER_UNSIGNED enum, so the above texture can either be accessed as a signed or unsigned texture.

No, it cannot. If you use GL_RGBA8UI with an isampler2D, you will get undefined behavior. The spec is very clear on this: you must match signed samplers with signed formats and unsigned samplers with unsigned formats.

He knows that. He knows he’s created a texture that uses normalized integers. He’s asking why his code seems to be working despite this. Also:

I’m not so sure. His first sentence, “I am writing an application to test isampler2D, thus have created an integer texture as:” is followed by code that does not create an integer texture.

No, it cannot. If you use GL_RGBA8UI with an isampler2D, you will get undefined behavior.

Interesting, I wouldn’t expect the spec to be so stringent, but I suppose better safe than sorry. I wonder if validateProgram() picks that up.
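Something like this is what I had in mind (prog being the already-linked program, validated with the texture bound); whether any driver actually reports the sampler/format mismatch is exactly the part I’m unsure about:


GLint  status = GL_FALSE;
GLchar log[1024];

glValidateProgram(prog);                            /* validate against the current GL state */
glGetProgramiv(prog, GL_VALIDATE_STATUS, &status);
glGetProgramInfoLog(prog, sizeof(log), NULL, log);

if (status != GL_TRUE)
    printf("validation failed: %s\n", log);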

I am writing an application to test isampler2D, thus have created an integer texture as:

Sorry for the confusion above. I have tested the code with an integer texture and it’s working fine. But the same thing also works with the unsigned texture created as above, with isampler2D in the shader, which is what is confusing me.

From the above comments I understood that we can use an unsigned texture with isampler2D, but the result is undefined. Is that correct?
Thanks.

I understood that we can use an unsigned texture with isampler2D, but the result is undefined. Is that correct?

Sort of. To say “can use” is wrong, because invoking undefined behavior is not a thing that you should be doing. OpenGL will not error; it will let you try, and it will return some value. But you shouldn’t do it.
