Part of the Khronos Group
OpenGL.org

The Industry's Foundation for High Performance Graphics

from games to virtual reality, mobile phones to supercomputers


Thread: Edge texture filtering

  1. #11
    Advanced Member Frequent Contributor
    Join Date
    Apr 2003
    Posts
    661
Also be aware that the way you do the layer selection (which corresponds to mipmap selection for a real texture) is only suited for the bilinear filter. If you want 'real' anisotropic filtering, you have to use the right math as well. Consult the EXT_texture_filter_anisotropic spec for that.

  2. #12
    Senior Member OpenGL Pro Aleksandar's Avatar
    Join Date
    Jul 2009
    Posts
    1,076
    Quote Originally Posted by skynet View Post
Also be aware that the way you do the layer selection (which corresponds to mipmap selection for a real texture) is only suited for the bilinear filter.
You noticed correctly; my filtering is confined to a single layer. But I have a reason for that. In this application, the layers represent the same spatial area, but the imagery can be totally different, ranging from satellite imagery to aerial photos, from near-infrared (false color) to true color. There are gaps in the layers. They were also made at different dates/times, hence some features do not exist in some layers, or are shaped differently. Using a trilinear/anisotropic method that picks texels from different layers for the same spatial area creates "islands" of "false texturing".

Nevertheless, some kind of spatial coherency is achieved by using blending. What I mean will (probably) be clearer after seeing the following images: EC-1, EC-2, EC-3, EC-4, EC-5, EC-6, EC-7 and EC-8.

For example, take a look at image EC-6. The airport runway is from the year 2000. The texture is a near-infrared satellite image that I've recolored. The runway is shorter and has bomb craters. There is also some ghosting (a ghost runway parallel to the real one). On the aerial photo, the runway is repaired, longer, and has wider facilities around it. Blending according to spatial distance from the viewer is the only solution that gives a more or less acceptable transition (fig. EC-7 and EC-8).

    Quote Originally Posted by skynet View Post
If you want 'real' anisotropic filtering, you have to use the right math as well. Consult the EXT_texture_filter_anisotropic spec for that.
I did, and made a short document for myself about anisotropic filtering.
There is one question left, and maybe this is the right place to discuss it.

In the spec, Px denotes the distance in texture space along the screen x-direction, and Py the distance in texture space along the screen y-direction. The anisotropy factor is calculated as:

AF = N = min( ceil( Pmax / Pmin ), maxAniso )

where Pmax = max(Px,Py) and Pmin = min(Px,Py), and Pmin is used for the layer selection (hence, a more detailed layer is used for the texel fetches).
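A small numeric sketch of that spec math may help (Python; the function name and the example derivative values are mine, not from the spec):

```python
import math

def ext_tfa(dudx, dvdx, dudy, dvdy, max_aniso):
    # Px/Py: footprint extent in texture space along the screen x/y axes
    px = math.hypot(dudx, dvdx)
    py = math.hypot(dudy, dvdy)
    p_max, p_min = max(px, py), min(px, py)
    # Number of probes N, clamped by the implementation limit
    n = min(math.ceil(p_max / p_min), max_aniso)
    # LOD from Pmax/N; when N is not clamped this is simply log2(Pmin)
    lod = math.log2(p_max / n)
    return n, lod

# A 4:1 stretched footprint: 4 probes, fetched from the base level
print(ext_tfa(4.0, 0.0, 0.0, 1.0, 16))
```

Note how the clamp shifts the LOD: with maxAniso = 4 and an 8:1 footprint, N stays at 4 and the LOD moves up by one level, trading probes for a coarser layer.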

On the other hand, in an NV example, the anisotropy factor is calculated as:

    AF = max(Px,Py) * max(Px,Py) / det

where det is the determinant ( det = abs(dx.x * dy.y - dx.y * dy.x) ) of the partial derivatives. There are also differences in the layer selection.
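For comparison, here is how I read the determinant-based variant; the layer-selection step (minor axis estimated as det/Pmax) is my assumption about what the sample does, not something spelled out in a spec:

```python
import math

def nv_style(dudx, dvdx, dudy, dvdy, max_aniso):
    px = math.hypot(dudx, dvdx)
    py = math.hypot(dudy, dvdy)
    p_max = max(px, py)
    # det = area of the parallelogram spanned by the two derivative vectors
    det = abs(dudx * dvdy - dvdx * dudy)
    # AF = Pmax^2 / det; equals Pmax/Pmin only when the axes are orthogonal
    af = min(p_max * p_max / det, max_aniso)
    # Minor-axis estimate det/Pmax drives the layer selection (assumed)
    lod = math.log2(det / p_max)
    return af, lod
```

For an axis-aligned 4:1 footprint (dudx=4, dvdy=1, other derivatives zero) both formulas agree (AF = 4, LOD = 0); for a sheared footprint such as dudx=4, dudy=2, dvdy=1 the two give different LODs, which is where the two definitions actually diverge.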

Both approaches come from NV, but the second is about 8 years "newer". Is there any official definition of the anisotropy factor, or can it be interpreted loosely?

P.S. The EXT_texture_filter_anisotropic spec contains an incorrect pronunciation of the Greek letter lambda. Just a remark for younger readers.

  3. #13
    Advanced Member Frequent Contributor
    Join Date
    Apr 2003
    Posts
    661
I don't think there is a definitive spec on how anisotropic filtering has to be performed. EXT_texture_filter_anisotropic was the only document I was aware of. Where did you find yours?
Do the Direct3D specs say anything about it? The way I interpret the EXT_tfa math is: choose the level that spaces the maxAniso taps about one texel apart along the longest extent of the texel's footprint in texture space.

  4. #14
    Senior Member OpenGL Pro Aleksandar's Avatar
    Join Date
    Jul 2009
    Posts
    1,076
    Quote Originally Posted by skynet View Post
    I don't think there's a definite spec on how anisotropic filtering had to be performed. EXT_texture_filter_anisotropic was the only document I was aware of. Where did you find yours?

Yes, there is no public documentation of how HW implements anisotropic filtering, or at least I haven't found any.
Conclusions about how it might be implemented come from sample code. Take a look at the NV Clipmaps sample (more precisely, at Clipmaps.fx, function PS_Anisotropic). Although it is D3D10 code, the shaders can easily be translated to GLSL. In the accompanying document (PDF) the calculation is somewhat different, but the code is probably the better source.

    Quote Originally Posted by skynet View Post
The way I interpret the EXT_tfa math is: choose the level that spaces the maxAniso taps about one texel apart along the longest extent of the texel's footprint in texture space.
Generally that is a correct interpretation, but it doesn't have to be one-texel precise; it depends on the degree of anisotropy and on maxAniso.
According to EXT_texture_filter_anisotropic, anisotropic filtering takes N samples ( N = min(ceil(Pmax/Pmin),maxAniso) ) along the greater of the two screen-space directions (not in texture space). The U and V coordinates of the samples are calculated from the screen-space derivatives. Correct me if I'm wrong.
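That sampling pattern can be sketched like this (Python; the tap offsets i/(N+1) - 1/2 follow the spec's sample placement, but the helper name and example values are mine):

```python
import math

def tap_offsets(dudx, dvdx, dudy, dvdy, n):
    # The N taps are spread along the screen direction whose texture-space
    # footprint is larger; tap i sits at offset i/(N+1) - 1/2 within the
    # pixel, scaled by that direction's (du, dv) derivative pair.
    px = math.hypot(dudx, dvdx)
    py = math.hypot(dudy, dvdy)
    du, dv = (dudx, dvdx) if px >= py else (dudy, dvdy)
    return [(du * (i / (n + 1) - 0.5), dv * (i / (n + 1) - 0.5))
            for i in range(1, n + 1)]

# Two taps for a footprint stretched 4 texels along screen x
print(tap_offsets(4.0, 0.0, 0.0, 1.0, 2))
```

The offsets are symmetric about the pixel center, so the taps average out to the unperturbed texture coordinate, as the spec's weighted sum requires.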

Since lambda (LOD) is an integer value, both approaches probably choose the same LOD level for the sampling, but if the math in the NV code example is a better approximation (the determinant is actually the area of the anisotropic footprint), the trilinear filtering could give better output.
