
Unexpected low FPS

Hello everyone,

I have a strange problem with the frame rate in my game which I cannot figure out.

I am developing on a powerful machine. Both when run from the IDE and when compiled to a Windows executable, fps_real sits at about 220 and fps holds a constant 60. But when I run the same executable on another, less powerful computer, fps_real is at about 100, yet fps drops down to 30 and the game gets very jittery.
From my understanding of how fps and fps_real work, I would expect the frame rate on the slower computer to still be at, or at least close to, 60. I don't really know how to approach this problem; am I missing something? Any pointer would be greatly appreciated!
 

TsukaYuriko

☄️
Forum Staff
Moderator
Do you have vsync enabled? If your game doesn't run stably at the target frame rate and vsync is enabled, it will gradually decrease the target frame rate until the game does run stably. A drop from 60 to 30 is a prime example of this.
 
Thanks a lot for the pointer to vsync, it's much appreciated!

I did not know about it before, but if I understand the documentation correctly, I can set it via display_set_timing_method, and it is on by default, correct? So my game is too slow on the second computer, and vsync automatically slows it down to stay smooth. What I don't understand, though, is that fps_real is well above the target of 60. Could that be because occasional spikes (pathfinding, for example) trip it? Or is it about graphics work that doesn't show up in the frame rate, like shaders (I've read that some things aren't visible in the fps figure Game Maker reports)?

So as I understand it now, I could either disable vsync, which should raise the average fps but might cause jitter when single frames are slow, OR optimize the game (graphics?) further so that vsync doesn't kick in and reduce the frame rate in the first place. Do I understand this correctly?
 

TsukaYuriko

☄️
Forum Staff
Moderator
I did not know about it before, but if I understand the documentation correctly, I can set it via display_set_timing_method, and it is on by default, correct?
Now that's a function I didn't know about before... :p (I'll have to admit I have no idea what that one does under the hood.)

What I was talking about is the vsync setting that can be found under Game Options -> Windows -> Graphics -> Use synchronization to avoid tearing, which can be set via display_reset.
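For reference, toggling it from code looks roughly like this (a sketch assuming the GMS2 signature of display_reset, where the first argument is the anti-aliasing level and the second is the vsync flag):

```gml
// Disable vsync at runtime (0 = AA level, false = vsync off).
// Note: a "force on" setting in the graphics driver's control
// panel can still override whatever the game requests here.
display_reset(0, false);
```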

Note that this is only one part of "vsync" related settings. The other one is completely outside of GM's control, and that is the system-wide one set in your graphics card's control panel. This may take precedence over the game's setting.

So my game is too slow on the second computer, and vsync automatically slows it down to stay smooth. What I don't understand, though, is that fps_real is well above the target of 60. Could that be because occasional spikes (pathfinding, for example) trip it? Or is it about graphics work that doesn't show up in the frame rate, like shaders (I've read that some things aren't visible in the fps figure Game Maker reports)?
Sounds about right. There may be outlier frames here and there that take longer to process than a frame should (whether that's due to the CPU or GPU processing more than they can handle during that frame). This may be sufficient to trip vsync into lowering the frame rate - and drastically so, as it is lowered in steps of (monitor's refresh rate / n), where n starts at 2 and increases by 1. So if your monitor runs at 60 Hz, the first drop will be to 30 FPS, the second one to 20, the third to 15, and so on.
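To illustrate the step sequence described above, a quick throwaway loop (just an illustration of the refresh-rate/n arithmetic, not part of any game code):

```gml
// Print the frame-rate caps vsync can fall back to on a 60 Hz monitor.
// Expected sequence: 30, 20, 15, 12, 10
var _refresh = 60;
for (var _n = 2; _n <= 6; _n++)
{
    show_debug_message(string(_refresh / _n));
}
```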

So as I understand it now, I could either disable vsync, which should raise the average fps but might cause jitter when single frames are slow, OR optimize the game (graphics?) further so that vsync doesn't kick in and reduce the frame rate in the first place. Do I understand this correctly?
Correct. An ideal starting point would be to run your game in debug mode and use the profiler to check what's draining the most performance (in general, or during specific frames that take longer to process than usual).
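As a rough companion to the profiler, something like this in a Draw GUI event shows both counters side by side, so you can see when they diverge (a minimal sketch using the built-in fps and fps_real variables):

```gml
// Draw GUI event: compare the capped fps with the raw fps_real.
// A large gap between the two while fps sits at exactly 30 or 20
// is a hint that vsync is stepping the frame rate down.
draw_text(8, 8,  "fps: " + string(fps));
draw_text(8, 24, "fps_real: " + string(fps_real));
```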
 
Again, thank you very much for taking the time to explain these concepts.
I will play around with the settings and see how it affects the frame rate, and also have a look at where I lose the performance. I already used the profiler to get to where I am now (even fps_real was below 60 before I started), but there seem to be no outliers anymore; it's more that everything takes more or less the same amount of time. I do have pathfinding that kicks in from time to time, but it should load-balance its calculations over several frames. There are also some shaders, so maybe something is wrong there.
 

Yal

🐧 *penguin noises*
GMC Elder
Vsync is such a big issue in every game ever that the first thing I do after remapping the controls to match the Dark Souls layout is to turn it off. Several big AAA games I've played recently have had reports of lag or even crash issues with vsync turned on, and that hasn't exactly made me less against it...

Some other stuff worth keeping in mind:
  • Too small a sleep margin will make Windows turn your game into a background process, which is awful for performance. (This mostly happens on MORE powerful computers, though.) Try setting it to any number over 1000 ms (so the game never sleeps and hogs all available processor time) and you might see an improvement.
  • Some effects are more GPU-heavy (shader code, submitting/rendering large vertex buffers, having so many large textures that they don't all fit in VRAM and need to be swapped constantly), and these can have massive effects on a per-machine basis (the cheapest PCs don't have a graphics card, so the CPU does all the GPU calculations itself, while big gaming rigs have such overpowered GPUs that they trivialize even the original Crysis shaders)
I think fps_real is supposed to be "how many frames GM could output if it was the only process in the system", so if fps_real > intended framerate > actual fps, I would guess there's some resource conflict in the OS that leads to the bottleneck... and the GPU or VRAM would be my first guess.
 
Thanks Yal for the additional input, can't wait to test it all out when I am back from my day job.
It's actually an older notebook, and it's quite probable that it does not have a dedicated graphics card. Also, I remember increasing the size of the texture page at some point of the project - so maybe this is the price I have to pay now...
 
Finally I was able to test it all out. The graphics chip on the slow computer seems to be the bottleneck; it only has 64 MB of VRAM. I tried decreasing the size of the texture pages, but this leads to a big increase in texture swaps which cancels out the gain, so the frame rate stays at about the same. My game is all one big map, so I don't have high hopes for organizing the texture pages better. I did improve some other things and made some heavy graphical features optional, so I am now at about 370 fps_real on the strong computer and about 120 on the slow one. The fps is a constant 60 on the strong one and about 45 on the slow one.

Now I checked and found that I use display_reset nowhere in my code, and "Use synchronization to avoid tearing" is and was disabled. I also found the setting on the slow computer, and it is set to "application", which implies to me that vsync should not be enabled. BUT the behaviour matches exactly what you wrote, TsukaYuriko: while the fps sits at about 45 for a long time, there is always this one point (I guess when paths are calculated) where it drops quickly down to 30 - and stays there. No more special calculations, and the debug overlay clearly shows no spike, with fps_real again at about 120, but the fps stays at 30.
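To double-check what the game is actually being capped to, I put together a rough Step-event sketch based on delta_time (assuming frame_sum and frame_count are both initialised to 0 in the Create event):

```gml
// Step event: average delta_time (microseconds since the last frame)
// over 60 frames to estimate the effective frame rate, independently
// of what the fps variable reports.
frame_sum += delta_time;
frame_count += 1;
if (frame_count >= 60)
{
    var _avg_fps = 1000000 / (frame_sum / frame_count);
    show_debug_message("effective fps: " + string(_avg_fps));
    frame_sum = 0;
    frame_count = 0;
}
```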

Can I somehow check whether vsync is active anyway?
 

Nocturne

Friendly Tyrant
Forum Staff
Admin
Can I somehow check whether vsync is active anyway?
It could be that the graphics card drivers are forcing Vsync? I know that most driver utilities permit the driver to override game settings, so maybe check that... and if it is, switch it off (or set it to application controlled) and see if it improves anything. Note that there's nothing you can do about this kind of driver level tinkering in your game...
 
I know that most driver utilities permit the driver to override game settings, so maybe check that... and if it is, switch it off (or set it to application controlled)
Yes, I found that setting and it is already set to "application".

Note that there's nothing you can do about this kind of driver level tinkering in your game...
I am aware that I can do nothing about these settings. My reasoning is just that I have a good example of what users will experience later, so I want to understand as well as possible what's going on and how much I can improve the situation.
 
I'm wondering if it's the monitor refresh rate capping the speed of the game?
I checked that yesterday, and the refresh rate of the display is set to 60 Hz. I am going to test out what is described in the thread, though - maybe that helps. Thanks for the pointer, it is much appreciated!
 