Why does Vsync cost so much FPS when my game never reaches my monitor's refresh rate?

:information_source: Attention Topic was automatically imported from the old Question2Answer platform.
:bust_in_silhouette: Asked By Macryc

Maybe a stupid question, and maybe I’m not understanding vsync, but I thought that vsync only reduced a game’s FPS when the monitor’s refresh rate couldn’t keep up with the game’s framerate.
By this logic, if the monitor’s refresh rate is constantly higher than the game’s FPS, vsync should not do anything, right?
My monitor is 240 Hz and my game runs at 200 FPS max. I have vsync on and it’s costing me a lot of performance (if I disable vsync, the game runs at 400 FPS). In this case, what exactly is vsync doing that is so expensive? Shouldn’t it be dormant when there is no FPS to reduce?

Standard V-Sync will introduce stuttering when your framerate drops below the monitor's refresh rate (when not using a variable refresh rate monitor). Therefore, if you cannot sustain the monitor's refresh rate at all times, I recommend disabling V-Sync for a smoother experience. On a 240 Hz monitor, consistently hitting the required frame time (below roughly 4.17 milliseconds, i.e. 1000 ms ÷ 240) is very difficult, even on fast hardware in a 2D game.

Calinou | 2021-11-18 20:45
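
As a minimal sketch of acting on that advice, assuming the project uses Godot 3.x (the engine is never named in the question; it is only implied by the platform this thread was imported from), V-Sync can be toggled at runtime from script, or via the equivalent project setting:

```gdscript
# Minimal sketch, assuming Godot 3.x: turn standard V-Sync off at runtime.
# The same toggle exists in Project Settings (display/window/vsync/use_vsync).
func _ready():
	OS.vsync_enabled = false
```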

:bust_in_silhouette: Reply From: sash-rc

Vsync doesn’t “cost” anything. Instead, it limits the framerate, and thus relieves the CPU/GPU of otherwise redundant updates.
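
If the goal is that same offload without V-Sync's display-driven pacing, the framerate can be capped manually instead. A minimal sketch, again assuming Godot 3.x (hypothetical values; pick a cap that suits your game):

```gdscript
# Minimal sketch, assuming Godot 3.x: cap the framerate manually so the
# CPU/GPU skip redundant frames, without relying on standard V-Sync.
func _ready():
	OS.vsync_enabled = false    # let the engine pace frames instead of the display
	Engine.target_fps = 240     # 0 means uncapped; 240 matches the monitor in this thread
```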