glTexturePageCommitmentEXT problem

Hi all,
I’m wrapping my head around sparse bindless textures with a bit of DSA (Direct State Access). There seems to be a problem with glTexturePageCommitmentEXT, which is the DSA variant of glTexPageCommitmentARB. First of all, the ARB_sparse_texture spec defines it as:


void TexturePageCommitmentEXT(uint texture,
                              int level,
                              int xoffset,
                              int yoffset,
                              int zoffset,
                              sizei width,
                              sizei height,
                              sizei depth,
                              boolean commit);

This seems inconsistent with the other DSA functions, and indeed the GLEW 1.10 headers declare it differently (the second argument is a target):

typedef void (GLAPIENTRY * PFNGLTEXTUREPAGECOMMITMENTEXTPROC) (GLuint texture, GLenum target, GLint level, GLint xoffset, GLint yoffset, GLint zoffset, GLsizei width, GLsizei height, GLsizei depth, GLboolean commit);

Which makes sense. The problem is that on an NVIDIA card (650 Ti, driver 334.89) the call generates a GL_INVALID_VALUE error, yet the commitment works and I’m able to draw the texture. On an AMD/ATI card (7950, Catalyst 14.4) there is also a GL_INVALID_VALUE error, but no commitment happens, so no texture is drawn.

I have also tried binding my own function pointer with one parameter fewer (matching the spec) and using that instead. No error is generated on either card, but I get either an incomprehensible image or none at all.

The non-DSA variant of the function (the one that requires binding the texture) works fine, but I would like to use the DSA one.
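
For reference, that working bound path is essentially the following sketch (same texture object and image as in the snippet below, with the texture bound first):

   glBindTexture(GL_TEXTURE_2D, sparsetex2D);
   // commit the whole base level through the classic bound-texture entry point
   glTexPageCommitmentARB(GL_TEXTURE_2D, 0, 0, 0, 0, img1->s(), img1->t(), 1, GL_TRUE);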

Here is a sample snippet of the code (it works only on NVIDIA, and still raises the GL_INVALID_VALUE error):


   GLuint sparsetex2D = 0;
   glGenTextures(1, &sparsetex2D);
   glBindTexture(GL_TEXTURE_2D, sparsetex2D);

   // mark the texture as sparse before allocating storage
   glTextureParameteriEXT(sparsetex2D, GL_TEXTURE_2D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);
   getPageSizesforFormat(GL_TEXTURE_2D, GL_RGBA8, &pageSizes);
   glTextureStorage2DEXT(sparsetex2D, GL_TEXTURE_2D, 10, GL_RGBA8, img1->s(), img1->t());
   glTextureParameteriEXT(sparsetex2D, GL_TEXTURE_2D, GL_VIRTUAL_PAGE_SIZE_INDEX_ARB, 0);

   // bindless handle
   sparseHandle = glGetTextureHandleARB(sparsetex2D);
   glMakeTextureHandleResidentARB(sparseHandle);

   // commit the whole base level -- this is the call that raises GL_INVALID_VALUE
   glTexturePageCommitmentEXT(sparsetex2D, GL_TEXTURE_2D, 0, 0, 0, 0, img1->s(), img1->t(), 1, GL_TRUE);

   glTextureSubImage2DEXT(sparsetex2D, GL_TEXTURE_2D, 0, 0, 0, img1->s(), img1->t(), img1->getInternalTextureFormat(), img1->getDataType(), img1->data());
   
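(getPageSizesforFormat above is just a small helper of mine; roughly, it wraps the standard glGetInternalformativ page-size queries, something like this sketch:)

   GLint numPageSizes = 0;
   glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8, GL_NUM_VIRTUAL_PAGE_SIZES_ARB, 1, &numPageSizes);

   // page size for index 0, i.e. the index selected with GL_VIRTUAL_PAGE_SIZE_INDEX_ARB
   GLint pageSizeX = 0, pageSizeY = 0, pageSizeZ = 0;
   glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8, GL_VIRTUAL_PAGE_SIZE_X_ARB, 1, &pageSizeX);
   glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8, GL_VIRTUAL_PAGE_SIZE_Y_ARB, 1, &pageSizeY);
   glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8, GL_VIRTUAL_PAGE_SIZE_Z_ARB, 1, &pageSizeZ);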

Thanks.

OK, it seems that the problem is in GLEW. After fixing an unrelated issue in my code, the extension function bound by hand according to the spec (without the target parameter) works on both cards.
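
In case it helps someone, binding it by hand looks roughly like this (the typedef name is my own so it doesn’t clash with GLEW’s, and wglGetProcAddress is the Windows path; use glXGetProcAddress or your loader’s equivalent elsewhere):

   // prototype matching the ARB_sparse_texture spec: no target parameter
   typedef void (GLAPIENTRY * PFNGLTEXTUREPAGECOMMITMENTEXT_SPEC) (GLuint texture, GLint level,
       GLint xoffset, GLint yoffset, GLint zoffset,
       GLsizei width, GLsizei height, GLsizei depth, GLboolean commit);

   PFNGLTEXTUREPAGECOMMITMENTEXT_SPEC myTexturePageCommitmentEXT =
       (PFNGLTEXTUREPAGECOMMITMENTEXT_SPEC)wglGetProcAddress("glTexturePageCommitmentEXT");

   // same call as in the snippet above, just without the target argument
   myTexturePageCommitmentEXT(sparsetex2D, 0, 0, 0, 0, img1->s(), img1->t(), 1, GL_TRUE);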