After a moment's reflection, Sleepy mused:
|
| yes but why do the frames drop by half ? I dont know and you dont
| appear to either.
Ignoring your terseness: I was trying to give you a general answer
in the interest of time. If you want specifics, all you have to do is
ask. Vsync keeps your video card's output in sync with the monitor's
refresh rate, which means a completed frame can only be put on screen
at a refresh. For the purposes of this discussion, that leaves two
rates that count as "in sync": the full refresh rate, and the refresh
rate divided by two. If the card cannot sustain the full rate, it
falls back to the half rate. In short, it drops to half because half
is still in sync.
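    To put that same rule in code form, here is a minimal sketch in
Python. The effective_fps helper is my own name, made up purely for
illustration, and it assumes the simple double-buffered case described
above, where the only synced rates are the refresh rate and half of it:

    def effective_fps(render_fps, refresh_hz):
        # Double-buffered vsync: a frame can only be shown on a refresh.
        # If the card keeps up with the refresh rate, you get the full
        # rate; otherwise it falls back to the next synced rate, which
        # is half the refresh rate.
        if render_fps >= refresh_hz:
            return refresh_hz
        return refresh_hz / 2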
    So, as in my previous example, you are playing along at 60 FPS
because your monitor is set to a 60 Hz refresh rate and Vsync is
enabled. In your game, something large blows up, and your video card
can now only render the scene at 55 FPS. Since it cannot finish each
frame within one refresh interval, every frame has to wait for the
next refresh before it can be displayed, so your FPS display drops to
30 ... thus remaining in sync with the monitor.
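    Plugging the numbers from that example into the sketch above:

    # Monitor at 60 Hz; after the explosion the card manages only 55 FPS.
    print(effective_fps(55, 60))   # -> 30.0, still in sync with the monitor
    print(effective_fps(60, 60))   # -> 60, back to the full rate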
If you'd like more information regarding this, then Google is your
friend.