GL_LINEAR and GL_NEAREST

Hello,

I’ve been working on direct volume rendering using a simple ray caster and I wanted to expand it slightly to include basic (Phong) shading.

To do the lighting I calculate surface gradients on the fly using a simple forward/backward difference approximation. This is where I run into the problem.
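Roughly what that looks like in the fragment shader (a simplified sketch; the sampler and step names here are placeholders, not my exact code):

[CODE]
uniform sampler3D volume;      // the 3D data set
uniform vec3      voxelStep;   // one voxel expressed in texture coordinates (1.0 / texture size)

// Forward/backward difference along each axis gives the gradient,
// which I use as the surface normal for the Phong shading.
vec3 estimateGradient(vec3 p)
{
    float dx = texture(volume, p + vec3(voxelStep.x, 0.0, 0.0)).r
             - texture(volume, p - vec3(voxelStep.x, 0.0, 0.0)).r;
    float dy = texture(volume, p + vec3(0.0, voxelStep.y, 0.0)).r
             - texture(volume, p - vec3(0.0, voxelStep.y, 0.0)).r;
    float dz = texture(volume, p + vec3(0.0, 0.0, voxelStep.z)).r
             - texture(volume, p - vec3(0.0, 0.0, voxelStep.z)).r;
    return normalize(vec3(dx, dy, dz));
}
[/CODE]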

To display the model (that is, without lighting) I set the filters of the 3D texture to GL_NEAREST. That way, when a ray samples the volume it gets the value of the closest voxel; if that turns out to be 0 it steps a little deeper and tries again.
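The marching loop itself is nothing fancy, roughly along these lines (again a simplified sketch; the names and the fixed step count are made up):

[CODE]
uniform sampler3D volume;   // filters set to GL_NEAREST

// Step along the ray until we hit a non-empty voxel.
float marchToSurface(vec3 entry, vec3 rayDir, float stepSize, out vec3 hitPos)
{
    vec3  p     = entry;
    float value = 0.0;
    for (int i = 0; i < 512 && value == 0.0; ++i)
    {
        value = texture(volume, p).r;   // nearest voxel value, 0 means empty space
        p    += rayDir * stepSize;      // go a little deeper and try again
    }
    hitPos = p;
    return value;
}
[/CODE]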

However, to get my gradient (normal) approximation to work I need to use a GL_LINEAR filter. That way, once I'm at a point I can move forwards and backwards just a little and the linear filtering will still change the sampled value enough to produce a non-zero gradient. With GL_NEAREST I run into problems where the small step might be too small and round to the same voxel, or too big and round to a voxel two away.

But I can't just use GL_LINEAR, because it messes up the boundaries of the model. If a ray lands near a surface voxel with a value of, say, 120, the sample gets blended with the 0 of the surrounding empty space and you end up with a noisy effect across the boundaries.

With GL_LINEAR
[ATTACH=CONFIG]732[/ATTACH]

With GL_NEAREST
[ATTACH=CONFIG]733[/ATTACH]

I just noticed the front surface is green instead of the blue it should be. That shows the linear problem: the surface value is 255 (blue), but it gets interpolated with the empty-space value 0 and ends up around 128, which maps to green.

So I suppose the question is: is there any way to sample the same texture using both GL_LINEAR and GL_NEAREST?

I tried manually rounding the sample location to produce a nearest-type response from the linear filter, using:

vec3 voxCorrect = vec3((round(voxCoord.x * 256.0))/256.0, (round(voxCoord.y * 256.0))/256.0, (round(voxCoord.z * 128.0))/128.0);

It helped but it was still far from good.

Thanks for any help,
Maxwell.

If you want a texture access to return the exact value stored at a voxel despite using GL_LINEAR, you can make sure that your texture coordinates fall at the center of a voxel. For example, to sample the value at voxel location (ix, iy, iz) from a texture of size (sx, sy, sz), use texture coordinates ((ix+0.5)/sx, (iy+0.5)/sy, (iz+0.5)/sz), where (ix, iy, iz) is the zero-based integer location of the voxel.
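In GLSL that comes out to something like this (just a sketch; "volume" is a placeholder sampler name):

[CODE]
uniform sampler3D volume;

// Returns the exact stored value at integer voxel (ix, iy, iz) even with GL_LINEAR,
// by placing the texture coordinate on the voxel center.
float fetchVoxel(ivec3 voxel)
{
    vec3 size  = vec3(textureSize(volume, 0));   // (sx, sy, sz)
    vec3 coord = (vec3(voxel) + 0.5) / size;     // ((ix+0.5)/sx, (iy+0.5)/sy, (iz+0.5)/sz)
    return texture(volume, coord).r;
}
[/CODE]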

I think you can also use sampler objects (glSamplerParameter etc.). Bind the texture to two different texture units and bind a sampler object configured with the other filtering mode to the second unit. I can never remember if that is really allowed by the spec, though, so you may want to double-check.
Another option would be to use glTextureView to essentially get a second texture object (one that shares data storage with the original) and configure the view to use different filtering.

That worked great, thanks a lot.

I went for the sampler objects; it seems like they were pretty much made for the problem I was having.

I made a single texture object and loaded the data without setting any filtering parameters on it. I then made two sampler objects, one linear and one nearest (you can set different wrap and border modes as well). You end up with two texture units with different parameters but sharing the same texture data.
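On the shader side it ends up looking roughly like this (the uniform names are mine, not anything required); the host side just creates the two sampler objects with glGenSamplers/glSamplerParameteri, binds the same texture to units 0 and 1, and attaches one sampler to each unit with glBindSampler:

[CODE]
// Both units carry the SAME texture object; the sampler object bound to each
// unit decides the filtering used for that lookup.
uniform sampler3D volumeNearest;   // unit 0: sampler object with GL_NEAREST
uniform sampler3D volumeLinear;    // unit 1: sampler object with GL_LINEAR

// Unfiltered value for the hit test, so nothing bleeds across the boundary.
bool isSurface(vec3 p)
{
    return texture(volumeNearest, p).r > 0.0;
}

// Filtered value for the forward/backward samples, so small offsets still
// change the result and the gradient comes out non-zero.
float smoothValue(vec3 p)
{
    return texture(volumeLinear, p).r;
}
[/CODE]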

For anyone after a sampler object explanation I found this pretty good.

Thanks again.