more Intel peculiarities

EDIT: Just realized I didn’t post this on the “Drivers” board like I meant to.

Sorry for all the posts lately, but hopefully this information will be useful to others in the future as well.

I’ve noticed a couple of other strange behaviors in Intel’s OpenGL drivers:

  1. We have this (pretty old) machine:

Renderer Vendor: Intel
Renderer Name: Intel® G45/G43 Express Chipset
Renderer Version: 2.1.0 - Build 8.15.10.2869
Shading Language Version: 1.20 - Intel Build 8.15.10.2869

This machine exposes EXT_packed_depth_stencil, but generates a GL_INVALID_ENUM error if GL_DEPTH24_STENCIL8_EXT is passed as the internalformat parameter to glRenderbufferStorage. It will accept GL_DEPTH_STENCIL_EXT without complaint, though.

The spec states:

“The error INVALID_ENUM is generated if RenderbufferStorageEXT is called with an <internalformat> that is not RGB, RGBA, DEPTH_COMPONENT, STENCIL_INDEX, DEPTH_STENCIL_EXT, or one of the internal formats from table 3.16 or table 2.nnn that has a base internal format of RGB, RGBA, DEPTH_COMPONENT, STENCIL_INDEX, or DEPTH_STENCIL_EXT.”

According to table 3.16, DEPTH24_STENCIL8_EXT does have a base internal format of DEPTH_STENCIL_EXT, so is their driver incorrect in emitting this error? Even when using DEPTH_STENCIL_EXT for the renderbuffer storage I get sporadic crashes from CheckFramebufferStatus, so I’m beginning to think this particular card is a lost cause anyway. I’m just curious whether DEPTH_STENCIL_EXT is somehow a “more correct” parameter to pass to RenderbufferStorage than DEPTH24_STENCIL8_EXT.
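For reference, here’s a minimal sketch of how one might probe this and fall back to the unsized format (assuming GLEW for loading and the EXT_framebuffer_object entry points; not what I’m actually shipping):

```c
#include <GL/glew.h>

/* Minimal sketch: create a packed depth-stencil renderbuffer, falling back
 * to the unsized GL_DEPTH_STENCIL_EXT format when the driver rejects the
 * sized enum with GL_INVALID_ENUM, as this G45 driver appears to do.
 * Assumes GLEW is initialized and EXT_framebuffer_object is available. */
GLuint make_depth_stencil_rb(GLsizei width, GLsizei height)
{
    GLuint rb = 0;
    glGenRenderbuffersEXT(1, &rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rb);

    /* Drain any stale errors so the check below is meaningful. */
    while (glGetError() != GL_NO_ERROR) { }

    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT,
                             GL_DEPTH24_STENCIL8_EXT, width, height);
    if (glGetError() == GL_INVALID_ENUM) {
        /* Fall back to the unsized base internal format,
         * which this driver accepts. */
        glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT,
                                 GL_DEPTH_STENCIL_EXT, width, height);
    }
    return rb;
}
```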

  2. On another Intel machine:

Renderer Vendor: Intel
Renderer Name: Intel® HD Graphics 3000
Renderer Version: 3.1.0 - Build 9.17.10.2932
Shading Language Version: 1.40 - Intel Build 9.17.10.2932

I find that neither glUniform1uiEXT nor glUniform1ui is exported. This card doesn’t advertise GL_EXT_gpu_shader4, which I believe means it isn’t required to support glUniform1uiEXT. However, it reports OpenGL 3.1, and (at least according to the GLEW header) glUniform1ui is part of OpenGL 3.0, so shouldn’t it be required to expose that function?
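For what it’s worth, this is the kind of diagnostic I’m running to confirm the entry points really didn’t resolve, rather than trusting the advertised version string. It’s only a sketch and assumes glewInit() has already been called on a current context:

```c
#include <GL/glew.h>
#include <stdio.h>

/* Report whether the uint uniform entry points actually resolved on this
 * driver, and whether EXT_gpu_shader4 is advertised. Assumes glewInit()
 * has already succeeded on a current context. */
static void report_uint_uniform_support(void)
{
    printf("GL_VERSION:         %s\n", (const char *)glGetString(GL_VERSION));
    printf("glUniform1ui:       %s\n", glUniform1ui    ? "present" : "missing");
    printf("glUniform1uiEXT:    %s\n", glUniform1uiEXT ? "present" : "missing");
    printf("GL_EXT_gpu_shader4: %s\n",
           GLEW_EXT_gpu_shader4 ? "advertised" : "not advertised");
}
```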

EDIT:

I’m finding another problem where querying the location of a uint uniform returns a garbage location. Since this driver doesn’t expose any glUniform1ui functions, could that also mean there’s no support for uint uniforms at all? Has there ever been any restriction on using uint uniforms in a shader since the “uint” type was added to GLSL?
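In case it helps anyone reproduce this, here’s a hypothetical sketch that dumps every active uniform’s reported type, size, and queried location after linking, to see whether the uint uniform even survives linking with sane values (the program is assumed to be already linked and valid):

```c
#include <GL/glew.h>
#include <stdio.h>

/* Dump name, type, size, and queried location of every active uniform in
 * an already-linked program, to check what the driver reports for a uint
 * uniform (GL_UNSIGNED_INT is type 0x1405). */
void dump_active_uniforms(GLuint prog)
{
    GLint count = 0;
    glGetProgramiv(prog, GL_ACTIVE_UNIFORMS, &count);

    for (GLint i = 0; i < count; ++i) {
        char    name[128];
        GLsizei length = 0;
        GLint   size   = 0;
        GLenum  type   = 0;
        glGetActiveUniform(prog, (GLuint)i, sizeof(name), &length,
                           &size, &type, name);
        GLint loc = glGetUniformLocation(prog, name);
        printf("uniform %-24s type 0x%04X size %d location %d\n",
               name, type, size, loc);
    }
}
```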

I can’t help with your specific question, but I’ve also had a real battle getting older Intel HD GPUs to behave.
One thing they all share is being very picky about having scissor state set up; binding an FBO without setting it first just hangs the cards I’m using, roughly as in the sketch below.
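Something along these lines is what I mean (a rough sketch only; the handle and sizes are placeholders):

```c
#include <GL/glew.h>

/* Rough sketch of the workaround: give the scissor rectangle a defined
 * value matching the attachment before binding the FBO. The handle and
 * dimensions here are placeholders. */
void bind_fbo_with_scissor(GLuint fbo, GLsizei width, GLsizei height)
{
    glScissor(0, 0, width, height);
    glEnable(GL_SCISSOR_TEST);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
}
```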