
Thread: Smudge Effect

  1. #1
    Junior Member Newbie · Join Date: Mar 2014 · Posts: 2

    Smudge Effect

    Hi all,

    I want to implement a smudge effect like Photoshop's (I can't paste a link here, but a quick YouTube search will give you an example).
    I think the ideas outlined in this article point me in the right direction, but I want to be sure before digging into the problem: (again, I can't paste a link; just google for how to implement smudge and stamp tools, on losingfight)

    What would be the right approach to implementing it in OpenGL?
    I know that I would have to make a "stamp texture" by reading the pixels around the finger tap location. This can be done by changing the viewport and the projection matrix so that only the pixels around the tap location are drawn, and then drawing into a separate buffer.
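
    To make this concrete, here is roughly what I have in mind for grabbing the stamp (just a sketch; STAMP_SIZE, stampTex, and tapX/tapY are names I made up, and it assumes the source image is in the currently bound framebuffer). Instead of re-rendering with a changed viewport, glCopyTexSubImage2D could grab the region directly:

        /* Allocate a small square "stamp" texture once. */
        GLuint stampTex;
        glGenTextures(1, &stampTex);
        glBindTexture(GL_TEXTURE_2D, stampTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, STAMP_SIZE, STAMP_SIZE, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        /* Each time the finger moves: copy the pixels around the tap
         * (tapX/tapY in framebuffer coordinates) into the stamp. */
        glBindTexture(GL_TEXTURE_2D, stampTex);
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                            tapX - STAMP_SIZE / 2, tapY - STAMP_SIZE / 2,
                            STAMP_SIZE, STAMP_SIZE);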

    But where do I go from here? How can I compose the original image with the smudge stamps that I'm generating while dragging? Does it make sense to compose the image in the fragment shader? I don't think so. How about composing the image on the CPU and then re-uploading the result to the GPU via glTexSubImage2D()?
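
    For reference, the CPU route I picture would look something like this (a sketch only; blendStampOnCPU and imageTex are hypothetical names, and the read-back round trip is what makes me doubt it):

        /* Read the touched region back, blend on the CPU, re-upload it. */
        GLubyte *pixels = (GLubyte *)malloc(STAMP_SIZE * STAMP_SIZE * 4);
        glReadPixels(tapX - STAMP_SIZE / 2, tapY - STAMP_SIZE / 2,
                     STAMP_SIZE, STAMP_SIZE, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

        blendStampOnCPU(pixels);               /* hypothetical CPU blend */

        glBindTexture(GL_TEXTURE_2D, imageTex);
        glTexSubImage2D(GL_TEXTURE_2D, 0,
                        tapX - STAMP_SIZE / 2, tapY - STAMP_SIZE / 2,
                        STAMP_SIZE, STAMP_SIZE,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        free(pixels);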

    What happens when I finish one smudge stroke and want to draw another? It seems that I should use the result of the first as input to the second, right?
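
    In other words, something like ping-ponging between two textures (again just a sketch; fbo/tex, texWidth/texHeight, smudgeProgram, and drawFullScreenQuad are names I made up, setup not shown):

        /* Two FBOs, each with tex[i] attached as GL_COLOR_ATTACHMENT0.
         * Each stroke reads the previous result and writes the other
         * texture, then the roles swap. */
        GLuint fbo[2], tex[2];
        int src = 0, dst = 1;

        void applyStroke(void)
        {
            glBindFramebuffer(GL_FRAMEBUFFER, fbo[dst]);
            glViewport(0, 0, texWidth, texHeight);
            glUseProgram(smudgeProgram);
            glActiveTexture(GL_TEXTURE0);
            glBindTexture(GL_TEXTURE_2D, tex[src]); /* previous result in */
            drawFullScreenQuad();
            int tmp = src; src = dst; dst = tmp;    /* output becomes input */
        }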

    Can anyone point me in the right direction?


    By the way, I'm working on an iPad Air, iOS 7, and OpenGL ES 2.0.
    Thanks,
    che1404.

  2. #2
    Senior Member OpenGL Pro · Join Date: Jan 2012 · Location: Australia · Posts: 1,117
    I would look at a compute shader to do a smudge since you need to access surrounding pixels in the texture.

  3. #3
    Junior Member Newbie · Join Date: Mar 2014 · Posts: 2
    Quote Originally Posted by tonyo_au View Post
    I would look at a compute shader to do a smudge since you need to access surrounding pixels in the texture.
    Thanks for the input.
    What would be the exact approach for that?
    For smudging I would need to sample from the screen while dragging the mouse, using a different stamp texture along the path (it's like a special case of simple stamping, in which I only ever have the same brush texture). While dragging, the rendering would go like this:
    - Render the background quad with the texture I want to smudge.
    - For each point along the drag path:
    - Render the surroundings of the last mouse location to a separate texture (this will be the stamp texture I blend in the shader).
    - Then render to a texture again, passing the shader both textures (the original and the stamp) and the current mouse location. In the shader, if the current fragment is inside the box defined by the current mouse location (as center) and a given offset, apply the stamp texture (with the corresponding transformations); otherwise, apply the original. A sketch of that shader is below.
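
    Something like this is the blend shader I have in mind (just a sketch; the uniform names are mine):

        /* Fragment shader for the blend pass, as a C string (GLSL ES 1.00).
         * Inside the box around the touch point, sample the stamp;
         * otherwise pass the original through. */
        static const char *blendFS =
            "precision mediump float;                                  \n"
            "uniform sampler2D u_original;                             \n"
            "uniform sampler2D u_stamp;                                \n"
            "uniform vec2  u_center;   /* touch point, tex coords */   \n"
            "uniform float u_halfSize; /* half box size, tex coords */ \n"
            "varying vec2  v_texCoord;                                 \n"
            "void main() {                                             \n"
            "    vec2 d = abs(v_texCoord - u_center);                  \n"
            "    if (d.x < u_halfSize && d.y < u_halfSize) {           \n"
            "        /* remap the box into the stamp's 0..1 range */   \n"
            "        vec2 st = (v_texCoord - u_center)                 \n"
            "                  / (2.0 * u_halfSize) + 0.5;             \n"
            "        gl_FragColor = texture2D(u_stamp, st);            \n"
            "    } else {                                              \n"
            "        gl_FragColor = texture2D(u_original, v_texCoord); \n"
            "    }                                                     \n"
            "}                                                         \n";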
    Do you think it could work?
    Thanks in advance,
    che1404.

  4. #4
    Senior Member OpenGL Pro · Join Date: Jan 2012 · Location: Australia · Posts: 1,117
    With compute shaders, OpenCL, or CUDA you do not need to render textures to manipulate them; they are just buffers you attach to the program. For image processing I would look at the way a program like Photoshop works. Each layer is a texture. You apply whatever effect you want to the texture, then do a whole-screen quad render with this texture. If you have multiple layers, you would create a temporary texture that you merge the layers into and render that texture. If you have an nVidia card, have a look at https://www.opengl.org/registry/spec...n_advanced.txt as well.
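
    As a rough sketch of the whole-screen quad pass (GLES 2.0; the attribute names and drawLayer are my own, program setup not shown):

        /* Draw one layer texture as a full-screen quad.
         * Interleaved clip-space positions and texture coordinates. */
        static const GLfloat quad[] = {
            /*  x      y     u     v  */
            -1.0f, -1.0f, 0.0f, 0.0f,
             1.0f, -1.0f, 1.0f, 0.0f,
            -1.0f,  1.0f, 0.0f, 1.0f,
             1.0f,  1.0f, 1.0f, 1.0f,
        };

        void drawLayer(GLuint program, GLuint layerTex)
        {
            glUseProgram(program);
            glActiveTexture(GL_TEXTURE0);
            glBindTexture(GL_TEXTURE_2D, layerTex);
            GLint pos = glGetAttribLocation(program, "a_pos");
            GLint uv  = glGetAttribLocation(program, "a_uv");
            glVertexAttribPointer(pos, 2, GL_FLOAT, GL_FALSE,
                                  4 * sizeof(GLfloat), quad);
            glVertexAttribPointer(uv, 2, GL_FLOAT, GL_FALSE,
                                  4 * sizeof(GLfloat), quad + 2);
            glEnableVertexAttribArray(pos);
            glEnableVertexAttribArray(uv);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
        }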
