Flicker fusion frequency problem

Hi,

I am new here. I am trying to render two images as fast as I can, so I render a teapot in OpenGL and switch its color between green and purple in an idle function. My display card is an Nvidia GeForce 310, and I have a CRT monitor (running at 75Hz) and a DLP projector (running at 120Hz) connected to it.

When I put the render window on the CRT (with Vsync on), the measured refresh rate is about 60Hz and I can see the flashing between green and purple. When I put the render window on the projector (with Vsync on), the refresh rate is about 100Hz, and I only see a white teapot (which looks like the sum of the green and purple).

  1. Can anybody tell me, for each image (either the green or the purple teapot), what is the refresh rate? Is it just (total refresh rate)/2?
  2. I want to use my camera to capture just one of the teapots, so I tried a PGR Firefly running at 60 fps, but it can't distinguish the green and the purple; instead it sees the same thing I do, a white teapot. Why?
  3. I guess maybe it's because the camera and projector are not synchronized. I searched the forum, and some people say I need a Quadro card, then the interval between the green and purple frames is stable and I can use that interval to sync the camera and projector. I wonder if I can do this without a Quadro card?
  4. It is said that when the fps is higher than the human flicker fusion frequency (75Hz), we will not see the flashing. But my CRT refresh rate is 75Hz, so why can I still see the flashing?

Here's my code; I calculate the OpenGL fps using QueryPerformanceCounter.


#include <windows.h>
#include <iostream>
#include <GL/glut.h>
#include <stdio.h>
#include <time.h>

using namespace std;

double PCFreq = 0.0;
__int64 CounterStart = 0;

float CL=1;
float CR=1;

void StartCounter()
{
    LARGE_INTEGER li;
    if(!QueryPerformanceFrequency(&li))
	cout << "QueryPerformanceFrequency failed!
";

    PCFreq = double(li.QuadPart)/1000.0;

    QueryPerformanceCounter(&li);
    CounterStart = li.QuadPart;
}

double GetCounter()
{
    LARGE_INTEGER li;
    QueryPerformanceCounter(&li);
    return double(li.QuadPart-CounterStart)/PCFreq;
}


double CalFrequency()
{
     static int count;
     static double save;
     static clock_t last, current;
     double timegap;
     StartCounter();
     ++count;
     if( count <= 100 )
         return save;
     count = 0;
     last = current;
     current = clock();
     timegap = (current-last)/(GetCounter()*1000000);
     save = 100.0/timegap;
     return save;
}

void myDisplay(void)
{
     double FPS = CalFrequency();
     printf("FPS = %f
", FPS);
    glClear(GL_COLOR_BUFFER_BIT);

    glColor3f(CL,0.5,CR);
    glutSolidTeapot(0.5);
    glutSwapBuffers(); // for GLUT_DOUBLE
    //glFlush();// for GLUT_SINGLE
}

void myIdle(void)
{
     CL=-CL;
     CR=-CR;
	
     myDisplay();
	 
}

int main(int argc, char *argv[])
{
     glutInit(&argc, argv);
     glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
	 //glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE);
     glutInitWindowPosition(100, 100);
     glutInitWindowSize(400, 400);
     glutCreateWindow("Time");

     glutDisplayFunc(&myDisplay);
	 
     glutIdleFunc(&myIdle);
     glutMainLoop();
     return 0;
}


You can test my code to see if there's anything wrong.

Thanks,

First, if you’re going to time the redraw rate, put glFinish() after SwapBuffers. Otherwise you’re not timing what you think you are.
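Something along these lines is what I mean (just a sketch, reusing the StartCounter()/GetCounter() helpers from your post; call StartCounter() once in main() before glutMainLoop()):

void myDisplay(void)
{
    static double lastMs = 0.0;

    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(CL, 0.5, CR);
    glutSolidTeapot(0.5);

    glutSwapBuffers();
    glFinish();   // wait until the GPU has actually finished before reading the clock

    double nowMs = GetCounter();   // milliseconds since StartCounter()
    if (lastMs > 0.0)
        printf("frame time = %.3f ms (%.1f Hz)\n", nowMs - lastMs, 1000.0/(nowMs - lastMs));
    lastMs = nowMs;
}

With sync-to-vblank on and no compositor in the way, those per-frame times should settle right around your refresh interval.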

When I put the render window in CRT(with Vsync on), the refresh rate is about 60Hz, I can see the flashing between green and purple.

Disable that annoying Windoze compositor (Aero or whatever they're calling it nowadays). That's probably in your way. The compositor causes the OS display manager to "take control" of vsync, meaning you don't get to use it. It virtualizes it and does all kinds of screen-compositing monkey-business with your window's output behind the scenes in the display manager. All-in-all, it's a waste of GPU cycles in the name of eye candy.

With the above changes (i.e. really timing the vsync rate, and directly driving the GPU output), if your timer is registering 60Hz, then you’re not doing 75Hz. All but very old CRTs typically are multiscanning, meaning you can set them to a range of horizontal and vertical refresh rates and they’ll “sync up” to that signal just fine. Check your GPU control panel and make sure that you have it set to generate a 75Hz signal.
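If you'd rather check that from code than poke around in the control panel, something like the following (a rough, untested sketch on my part, using standard Win32 calls) reports what the GPU is actually driving and whether a 75Hz mode would even be accepted:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    // Ask Windows what the primary display is currently being driven at
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        printf("current mode: %lux%lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    // Test whether a 75 Hz mode would be accepted (CDS_TEST changes nothing)
    dm.dmDisplayFrequency = 75;
    dm.dmFields = DM_DISPLAYFREQUENCY;
    if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
        printf("75 Hz looks supported; set it in the control panel\n");

    return 0;
}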

Here, on a multiscanning CRT monitor driven by a GPU output which is nailed to 97Hz (and no annoying compositor in-the-way), if I ditch your timing logic and plug in something that works on Linux, I measure 97Hz +/- 0.4ms for the redraw rate. I don’t see any flashing. I don’t see white either (that’d be wrong). I see a blend between dim green and light magenta.

It's also somewhat odd that you're setting a color of (-1, 0.5, -1) and relying on clamping to get dim green (0, 0.5, 0).

  1. Can anybody tell me, for each image (either the green or the purple teapot), what is the refresh rate? Is it just (total refresh rate)/2?

Your refresh rate is approximately what you measure when your redraw loop is locked to vsync (or what you'd see if you put a scope on your video output). What the screen contents are on each scan-out is not relevant to the refresh rate.

  2. I want to use my camera to capture just one of the teapots, so I tried a PGR Firefly running at 60 fps, but it can't distinguish the green and the purple; instead it sees the same thing I do, a white teapot. Why?

Could be a number of things. Could be your camera's exposure start is not synchronized to vsync. Could be its exposure is longer than the refresh period. Could be the persistence on the phosphors of your CRT/display is long enough that it's blurring the illumination from multiple refreshes together significantly. Etc.

  3. I guess maybe it's because the camera and projector are not synchronized.

Maybe. Maybe not. I’d check the options and see.

I searched the forum, and some people say I need a Quadro card, then the interval between the green and purple frames is stable and I can use that interval to sync the camera and projector. I wonder if I can do this without a Quadro card?

What you officially need a Quadro for is when you are synchronizing two or more GPUs so that their video scan-out clocks are synchronized (for genlock). This is different from (though similar to) synchronizing some external event to the frequency of a scanned-out video signal.

  4. It is said that when the fps is higher than the human flicker fusion frequency (75Hz), we will not see the flashing. But my CRT refresh rate is 75Hz, so why can I still see the flashing?

From your testing, it sounds like it probably isn't currently set to 75Hz but to 60Hz instead (or you haven't disabled the compositor). Check your GPU control panel to verify the refresh rate your GPU is driving it at.

Thanks for the detailed reply!

As you said, I put glFinish() after glutSwapBuffers(), and the output FPS is about 50Hz on my CRT and around 50Hz on my projector. Maybe that's why I can see the flashing on the CRT but not on the projector.

Here, on a multiscanning CRT monitor which is nailed to 97Hz, if I ditch your timing logic and plug in something that works on Linux, I measure 97Hz +/- 0.4ms. I don’t see any flashing. I don’t see white either (that’d be wrong). I see a blend between dim green and light magenta.

I don't know how you measured the same FPS as the rate you nailed the output to; maybe my timer under Windows is not as accurate as your Linux one. I fixed my CRT to 75Hz and my projector to 120Hz in Display Properties, but the calculated FPS is smaller than the fixed values: 50Hz (on average) for the CRT and 80Hz (on average) for the projector. Is that because I turned Vsync on, so the system clamped the FPS to a smaller value to achieve Vsync?

What you officially need a Quadro for is when you are synchronizing two or more GPUs so that their video scan-out clocks are synchronized (for genlock). This is different from (though similar to) synchronizing some external event to the frequency of a scanned-out video signal.

So, as you said, I actually don't need a Quadro? I am using a Point Grey camera now, and I can trigger it by an external signal or a camera register write. So I think once I know the FPS of my display, I can use a timer to trigger my camera at that rate, i.e. synchronize the camera and the display to capture both the green and the magenta teapot images. Can I realize that?

Thanks,

It’s not just the frequency you need but the phase as well. Even if your exposure is short enough, and even if they’re being taken at the same rate, how do you know you won’t take a picture that straddles two frames?

If you’re an EE or know one, you can probably rig up something that’ll kick out a timing pulse each time vsync on the video signal happens. Then you can trigger your camera exposure based on that. There’s probably stuff out there on the net along these lines as well – I’ve just never looked for it.

Don't forget to consider the phosphor/display persistence issue though (think "glow"). The display intensity at a pixel is not a square wave with instantaneous falloff. You want to make sure your test is even reasonable before you go to the trouble to rig it up.

Thanks Dark.

It’s not just the frequency you need but the phase as well. Even if your exposure is short enough, and even if they’re being taken at the same rate, how do you know you won’t take a picture that straddles two frames?

You are right. Since my PGR camera can be triggered externally and I can specify the time delay, in seconds, from when an external trigger event occurs to the start of integration (when the shutter opens), this won't be a problem.

Another thing: do you know how to draw the image with a fixed delay, i.e. so that no matter what's going on in my PC, the OpenGL redraw rate doesn't slow down? In my code I use glutIdleFunc, which means that when some other programs are running the drawing speed slows down, so I wonder if there's another way to keep this speed stable. I have tried glutTimerFunc (roughly as in the sketch below), but the drawing speed is slow; even when I set the delay to 1 ms I can still see the flashing.
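This is roughly the timer version I tried (the 1 ms is just the value I set; the callback re-registers itself since GLUT timers fire only once):

void myTimer(int value)
{
    CL = -CL;
    CR = -CR;
    glutPostRedisplay();          // ask GLUT to call myDisplay again
    glutTimerFunc(1, myTimer, 0); // re-arm the timer; GLUT timers fire only once
}

// in main(), instead of glutIdleFunc(&myIdle):
//     glutTimerFunc(1, myTimer, 0);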

Thanks,

Another thing: do you know how to draw the image with a fixed delay, i.e. so that no matter what's going on in my PC, the OpenGL redraw rate doesn't slow down?

Ok, there are two things here. The first (often called the vsync rate) is the rate at which the image is scanned out of the DVI/HDMI port (this is what you're talking about). The second is the rate at which your program is changing the contents of the system framebuffer that is being scanned out. These two aren't necessarily linked, though they should be for artifact-free rendering.

The scan-out rate (aka vsync rate) that your GPU generates is determined by the GPU driver, often in combination with querying (or being given) the scan-out frequency capabilities of your monitor. Once set, this is a relatively fixed rate. It's not perfect, but it's close enough for display purposes.

You can link the second rate (the rate your program is changing the contents of the system framebuffer) to the vsync rate by enabling the driver feature termed “sync to vblank”, which synchronizes buffer swaps to the vsync rate of your GPU’s output signal.
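On Windows with a WGL context (which is what GLUT gives you), the usual way to request this from code is the WGL_EXT_swap_control extension. Rough sketch, assuming the driver exposes the extension and you call it with a current context (e.g. right after glutCreateWindow()):

#include <windows.h>
#include <GL/gl.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void enableSyncToVblank(void)
{
    // Fetch the extension entry point; NULL means the driver doesn't expose it
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(1);   // 1 = at most one buffer swap per vertical retrace
}

Keep in mind the driver control panel's vertical sync setting can still override whatever the application requests here.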
