Intel driver version 10.18.3910 running on an i7 363QM (HD4000), OpenGL 4.0 supported.
This code:
uint a = 0xFFFFFFFF;
uint b = -1;
if (a != b) {
    return vec3(0, 0, 1);
} else {
    return vec3(1, 0, 0);
}
actually returns vec3(0,0,1). My world has been shattered.
In all seriousness, this seems like a nasty bug in basic functionality. I checked the GLSL 4.0 spec to make sure this made sense, and it does state that all ints and uints are 32 bits, with ints being two's complement ( https://www.opengl.org/registry/doc/GLSLangSpec.4.00.7.pdf ), so this inequality should never hold: converting -1 to uint must yield all bits set, i.e. 0xFFFFFFFF.
One would think the problem lies with the -1, but it actually seems to be the hex constant. Passing a uniform uint with value 0xFFFFFFFF fails the equality check against the 0x constant but succeeds against the -1.
EDIT:
Yeah, this is broken.
uint a = 0xFFFFFFFF;
uint b = 0xFFFFFFFE;
b += 1;
if (a != b) {
    return vec3(0, 0, 1);
} else {
    return vec3(1, 0, 0);
}
Returns vec3(0,0,1).
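One thing that might be worth trying, purely a guess at what the driver is mishandling: in GLSL an unsuffixed literal like 0xFFFFFFFF has type int, and only the `u` suffix makes it a uint literal, so spelling the constants with `u` removes the implicit int-to-uint constant conversion from the picture. A hypothetical variant of the snippet above:

```glsl
// sketch of a possible workaround: explicit uint literals via the
// 'u' suffix, so no int -> uint conversion of 0xFFFFFFFF occurs
uint a = 0xFFFFFFFFu;
uint b = 0xFFFFFFFEu;
b += 1u;
if (a != b) {
    return vec3(0, 0, 1);  // the buggy path observed on this driver
} else {
    return vec3(1, 0, 0);  // the path the spec says we should take
}
```

If this version compares equal while the unsuffixed one does not, that would point at the driver's constant conversion or folding rather than uint arithmetic itself.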
I suppose this is some kind of informal bug report, though I'm pretty sure Intel reads these forums about as often as they update their GL drivers.