Enabling a vertex attrib array that the shader doesn't use.

Greetings!

To clarify a bit, I’m talking about the following scenario:

  1. You have a VBO with all of the vertex data interleaved within it (a,b,c,d,a,b,c,d,…)
  2. Your shader program uses attributes a, b, and c.
  3. Your shader program does NOT use attribute d.
  4. You call glBindAttribLocation on your program for attributes a-d, giving them indices 0-3, in order.
  5. You link the shader.
  6. You set up a VAO, and in doing so, you set the pointers correctly for attributes 0-3 and also enable attribute arrays 0-3 (roughly as in the sketch after this list).
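
To make that concrete, here’s roughly what the setup looks like. This is just a minimal sketch, assuming an OpenGL 3.x context with a loader such as GLAD already initialised, and four tightly packed vec4 attributes named a-d as placeholders; the names, sizes, and stride aren’t my real layout:

    /* Sketch of steps 1-6, assuming an OpenGL 3.x context and a function
       loader (GLAD here) are already set up.  The a/b/c/d names and the
       four-vec4 layout are placeholders for illustration only. */
    #include <glad/glad.h>
    #include <stddef.h>

    GLuint setup_vao(GLuint program, const GLfloat *vertices, GLsizeiptr num_bytes)
    {
        /* Steps 4-5: bind locations 0-3 for a-d before linking, then link.
           "d" gets a location even though the shader never reads it. */
        glBindAttribLocation(program, 0, "a");
        glBindAttribLocation(program, 1, "b");
        glBindAttribLocation(program, 2, "c");
        glBindAttribLocation(program, 3, "d");
        glLinkProgram(program);

        /* Step 1: one VBO with the data interleaved a,b,c,d per vertex. */
        GLuint vbo, vao;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, num_bytes, vertices, GL_STATIC_DRAW);

        /* Step 6: in a VAO, set all four pointers and enable arrays 0-3,
           including array 3, which the shader doesn't use. */
        const GLsizei stride = (GLsizei)(4 * 4 * sizeof(GLfloat)); /* 4 x vec4 */
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);
        for (GLuint i = 0; i < 4; ++i) {
            glVertexAttribPointer(i, 4, GL_FLOAT, GL_FALSE, stride,
                                  (const void *)(size_t)(i * 4 * sizeof(GLfloat)));
            glEnableVertexAttribArray(i);
        }
        glBindVertexArray(0);
        return vao;
    }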

The questions I have concern steps 4 and 6.

First question: Does attempting to bind an attribute location for an attribute that the shader doesn’t use have any negative consequences?

Second question: Will enabling a vertex attribute array for an attribute that your shader doesn’t use cause any problems? For instance, will it generate a GL error? Is there a performance penalty for doing so?

I’ve tried doing both of these things, and at least with the drivers I’m using, it works fine. I wasn’t able to glean any information from the spec about what happens in these cases, so I’m hoping that someone here will have some information.

Thanks!
Frank

I’ve seen this crash, and crash hard, on AMD. The scenario was similar: the data consisted of 3 attribs (vec3, vec2, vec2) drawn with 3 programs. Two of the programs used all 3 attribs, one used only the vec3, and having attrib arrays enabled for the two vec2s caused the crash.

That covers your point 6, though I should note that it may have been fixed in more recent drivers. If you’re currently using AMD and you don’t see problems, you can reasonably assume it’s fixed on your setup; but if you’re planning on ever releasing this program, be aware that your users may not have up-to-date drivers.

I can’t comment on your point 4, since I’ve been using the “layout (location = …)” syntax ever since it became available; glBindAttribLocation is something I haven’t had to use in a good few years.
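
For what it’s worth, this is the shape of it, with the locations stated directly in the GLSL. Just a sketch; the names and types are an example, not your actual attributes:

    /* Vertex shader source with explicit attribute locations, so no
       glBindAttribLocation calls are needed before glLinkProgram.
       Needs GL 3.3 / GLSL 330; names and types are illustrative only. */
    const char *vs_source =
        "#version 330 core\n"
        "layout (location = 0) in vec3 a;\n"
        "layout (location = 1) in vec2 b;\n"
        "layout (location = 2) in vec2 c;\n"
        "void main() { gl_Position = vec4(a, 1.0); }\n";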

Ok thanks! I’ll be more careful about it then. I was just hoping that I wouldn’t have to change which arrays are enabled or disabled based on what shader I’m using. Oh well.
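
If I do end up switching enables per program, I suppose something like this would work, since glGetAttribLocation returns -1 for attributes the linked program doesn’t actually use. Just a sketch, assuming a-d were bound to locations 0-3 as in my step 4; attrib_names and enable_arrays_used_by are made-up names:

    #include <glad/glad.h>

    static const char *attrib_names[4] = { "a", "b", "c", "d" };

    /* Enable only the arrays the given program actually uses; call this
       with the relevant VAO bound, since enable state is stored in the VAO. */
    void enable_arrays_used_by(GLuint program)
    {
        for (GLuint i = 0; i < 4; ++i) {
            if (glGetAttribLocation(program, attrib_names[i]) != -1)
                glEnableVertexAttribArray(i);
            else
                glDisableVertexAttribArray(i);
        }
    }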