
Legacy GM Debug FPS vs. fps_real


Vikom

Guest
Hi!
I'm trying to optimize performance from the very beginning of the project.
I was surprised that the debug overlay's FPS was lower than fps_real, which was currently higher than the room_speed. (I believe that the "debug fps" == "fps".)
Does that mean something bad? fps_real itself never shows really bad values.

If it's bad, is there something that can be done about it?
 


Mert

Member
The easiest explanation is this:

Your computer can actually run the game at <fps_real> FPS, but you choose to make it run at <60> FPS (for example). What you see in the debug overlay is the second value.
 

Vikom

Guest
Thank you.
I actually understand this. I don't get why FPS is 42, while room_speed is 60 and fps_real is 982.

I thought that...
if (fps_real > room_speed)
{
    fps = room_speed;
}
else
{
    fps = floor(fps_real);
}
 

Taddio

Guest
Straight from the manual:

Syntax:
fps_real

Returns: Real

Description
In GameMaker: Studio there are two main ways that can be used to tell the speed at which your game runs. The room_speed (as specified in the room editor) and the fps (frames per second). These values are often confused, but basically one is the number of game steps that GameMaker: Studio is supposed to be completing in a second (room speed), while the other is the number of CPU steps that GameMaker: Studio is actually completing in a second (the real fps), and this value is generally much higher than the room speed, but will drop as your game gets more complex and uses more processing power to maintain the set room speed.
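To watch all three values at once while testing, a quick overlay in a Draw GUI event of a controller object works; this is just a sketch using the built-in read-only variables the manual describes:

```gml
/// Draw GUI event of a controller object.
/// Shows the capped frame rate, the raw CPU frame rate, and the target.
draw_set_color(c_white);
draw_text(8, 8,  "fps (capped):   " + string(fps));
draw_text(8, 24, "fps_real (CPU): " + string(fps_real));
draw_text(8, 40, "room_speed:     " + string(room_speed));
```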
 

RangerX

Member
Vikom said:
> Thank you.
> I actually understand this. I don't get why FPS is 42, while room_speed is 60 and fps_real is 982.
>
> I thought that...
> if (fps_real > room_speed)
> {
>     fps = room_speed;
> }
> else
> {
>     fps = floor(fps_real);
> }
This sounds like a cycle problem or maybe v-sync. What if you run your game with v-sync off? Is it a full 60?
Another test you can do is to change your sleep margin to 15. That's the minimum time GameMaker will wait before starting another cycle. It should fall in line with your computer more easily (maybe).
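For reference, in GM:S 1.x the sleep margin is normally set in Global Game Settings (Windows tab); newer runtimes also expose it as a function. A sketch, assuming your version has `display_set_sleep_margin` (check your manual before relying on it):

```gml
/// Create event of the first object in the game.
/// Gives the OS up to 15 ms of slack per frame instead of busy-waiting.
display_set_sleep_margin(15);
```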
 

Vikom

Guest
RangerX said:
> This sounds like a cycle problem or maybe v-sync. What if you run your game with v-sync off? Is it a full 60?
> Another test you can do is to change your sleep margin to 15. That's the minimum time GameMaker will wait before starting another cycle. It should fall in line with your computer more easily (maybe).
Thank you, that sounds like a clue.
Sorry for the silly question, I've never worked with v-sync before.
display_set_windows_alternate_sync(0);?

Ok, I've tried sleep margin 15... Debug FPS: 42, FPS: 41, fps_real: 1316.

Just about the circumstances: in the room there's a 3D terrain object with a 100x100 3D terrain, plus a model of 500 "baked together" semi-transparent trees made from 4 walls each (you know, a double-sided 'X' model). My GPU gets fried because of the semi-transparency.
Probably nothing important, but I still wanted to mention it.
 

Smiechu

Member
Vikom said:
> Thank you, that sounds like a clue.
> Sorry for the silly question, I've never worked with v-sync before.
> display_set_windows_alternate_sync(0);?
>
> Ok, I've tried sleep margin 15... Debug FPS: 42, FPS: 41, fps_real: 1316.
>
> Just about the circumstances: in the room there's a 3D terrain object with a 100x100 3D terrain, plus a model of 500 "baked together" semi-transparent trees made from 4 walls each (you know, a double-sided 'X' model). My GPU gets fried because of the semi-transparency.
> Probably nothing important, but I still wanted to mention it.
Yeah... nothing important, who would care about such nuances.
So the main thing is: fps_real is only relevant for CPU load... if the GPU is overloaded, fps_real will not tell you that.
 

dannyjenn

Guest
I had this same problem... the fps was lower than the game speed even though fps_real was in the 900s. I never did figure it out. (I was using a shader, so maybe the GPU was slowing it down...) But I noticed that when I export the game using the VM, it's kind of laggy... but when I export it using the YYC, it runs smoothly.
 

RangerX

Member
Vikom said:
> Thank you, that sounds like a clue.
> Sorry for the silly question, I've never worked with v-sync before.
> display_set_windows_alternate_sync(0);?
>
> Ok, I've tried sleep margin 15... Debug FPS: 42, FPS: 41, fps_real: 1316.
>
> Just about the circumstances: in the room there's a 3D terrain object with a 100x100 3D terrain, plus a model of 500 "baked together" semi-transparent trees made from 4 walls each (you know, a double-sided 'X' model). My GPU gets fried because of the semi-transparency.
> Probably nothing important, but I still wanted to mention it.
There you go. You found your problem. Your CPU can run the game easily, but your GPU is struggling, hence the bad framerate. Maybe find another way than transparency galore? I'm not very knowledgeable in 3D, but there are solutions, that's for sure.
As for v-sync, this is the function you want to use:
https://docs.yoyogames.com/source/dadiospice/002_reference/windows and views/display_reset.html
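For reference, `display_reset` takes the anti-aliasing level and a v-sync flag, so a minimal sketch for turning v-sync off (using 0 for no AA) would be:

```gml
/// Create event of the first object in the game.
/// display_reset(aa_level, vsync): second argument enables/disables v-sync.
display_reset(0, false); // no anti-aliasing, v-sync off
```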
 

Vikom

Guest
Thank you!
For some reason I believed that the GPU's load wasn't that important.

I've lowered the view port width/height from 1366x768 to 600x400 (this project is just a map editor, so it doesn't matter) and now it can handle even 5000 trees (500 before) with the GPU at only 80%, according to Task Manager.
I have an average $450 laptop from spring 2016, so I think the result is not bad at all, and it will be even better on more capable machines when the resolution is high again.
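Since the scene is GPU-bound, shrinking the port cuts the number of pixels the GPU has to fill each frame. A sketch of what's described above using the legacy GM:S 1.x view variables, assuming view 0 is the visible one:

```gml
/// Room creation code (legacy GM:S 1.x).
/// Fewer pixels in the port = less fill-rate cost for the overdrawn trees.
view_wport[0] = 600;
view_hport[0] = 400;
// window_set_size(600, 400); // optionally match the window to the port
```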
 

