FPS while debugging

Hi! I can't seem to find an answer anywhere - when debugging, what FPS is considered reasonable? At what point do I have to start worrying about optimization?
 
If your game can stay above your target FPS without issue, that's all that matters.
So if it is above 60 all the time, everything is good? My game is pretty small and I don't think the FPS would ever drop below that, but I sometimes see people post a complicated thing they programmed and add something like "and it runs at only **** FPS in the debugger", and I don't understand how good that number is.
 
That number shows how fast your game could theoretically run, given an infinite framerate. It's mostly used for checking how optimized something is. The closer something is to the FPS of an empty project, the better optimized it is. Most of what you've seen is people flexing about how well they were able to make something complicated run. Generally though, this isn't the kind of thing you need to worry about!
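If you want to see that headroom for yourself, GML exposes both values: `fps` (capped at the game speed) and `fps_real` (the uncapped, theoretical rate). A minimal sketch, e.g. in the Step event of a debug object:

GML:
/// Step event of a debug object (sketch)
// fps is capped at the game speed; fps_real is the uncapped rate.
// A large fps_real relative to your target means plenty of headroom.
show_debug_message("fps: " + string(fps) + " / fps_real: " + string(fps_real));

The messages appear in the IDE's output window while the game runs, so you can watch how `fps_real` changes as you add features.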
 

Xer0botXer0

Senpai
I believe the human eye doesn't see faster than 60fps; past 60fps things don't seem to run any smoother.
Consoles like Xbox and PS run at a capped 30fps.

What you could do, so you don't actually have to worry about this, is create an object if you haven't got one already and do something like:

GML:
if (fps_real <= room_speed)
{
    show_message("FPS Alert.");
}
So this will pop up if your frames hit a low; you can then probably see what you've done to cause it.

Alternatively, you could also draw the FPS in-game and monitor it.
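As a sketch of that idea, a Draw GUI event on a persistent debug object could overlay both values in a corner of the screen (position, colour and formatting here are arbitrary):

GML:
/// Draw GUI event of a debug object (sketch)
draw_set_colour(c_yellow);
draw_text(8, 8,  "FPS: " + string(fps));
draw_text(8, 24, "Real FPS: " + string(fps_real));

Drawing in the Draw GUI event keeps the readout fixed on screen regardless of where the camera moves.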


Personally I'm not thinking about fps; I've never had to, especially in 1.4 (less buggy).
Keep on coding!
 

jerob1995

Member
As long as it at least meets the room_speed, it is good (the default room speed is 60 steps per second in GMS:2). However, that performance in testing will depend on your system. I usually like to test on lower-spec computers to see how the game performs, as that makes it clear whether there are optimizations to be made.
 
The human brain doesn't calculate things in fixed ticks, and the world around us doesn't visually update at a fixed rate, so frames per second aren't really applicable to human vision. Unless there is something seriously wrong with their vision, humans can absolutely notice the difference between 60 FPS and higher frame rates. Otherwise, the world wouldn't look as smooth as it does! Most consoles run capped at 60 FPS, and newer ones like the Xbox Series X and PS5 cap out at 120 FPS.

Minor corrections aside, I completely agree that FPS isn't a thing you should be thinking about throughout working on your project. It takes something extreme to drop FPS below expected. I also recommend doing what @jerob1995 suggests and testing out your game on a weaker system, but you shouldn't have to do that very often. I usually only do it after I'm almost done with my games.
 
Thanks guys! As my game gets bigger, with more sprites, objects etc., I kinda started thinking about it, because it has never been a thing I paid any attention to - I just checked that the FPS wouldn't drop to 0 for no reason. I was just worried because, with more and more things, my code gets messy, and I wondered whether that would have an impact on performance. I've heard the opinion that you shouldn't start worrying about optimization until there is a problem, but no one says when exactly that problem state starts.
 

Xer0botXer0

Senpai
The human eye actually does see at around 60fps. We can only process so much information per second; it takes time for electrical signals to pass through the eyes as light hits the cornea. I suppose you can notice more or less information depending on your focus.

A good test to try, which I've done before, is attaching a motor to glasses, with a circular disk on the motor that has a pizza slice cut out of one side. Turn the motor on and notice the effects. You'll receive light at a lower interval but still be able to process the information; in fact, bird watching is fun on such an occasion.

If you want some fancy glasses, I believe Nike created lenses that reduce the number of frames one perceives. By using them you can start at 1fps and move up until you eventually don't notice the difference, and note at which fps things stop seeming to change.

There are a lot of studies on it.
The short answer is that you may not be able to consciously register those frames, but your eyes and brain may be aware of them.

For example, take the 60-frames-per-second rate that many have accepted as the uppermost limit.

Some research suggests that your brain might actually be able to identify images that you see for a much shorter period of time than experts thought.

For example, the authors of a 2014 study out of the Massachusetts Institute of Technology found that the brain can process an image that your eye sees for only 13 milliseconds — a very rapid processing speed.

That’s especially rapid when compared with the accepted 100 milliseconds that appears in earlier studies. Thirteen milliseconds translate into about 75 frames per second.
I don't claim to be using the correct terminology; perhaps it has to do with flicker rate instead of frame rate. Furthermore, if you've got an object moving at 60fps but you're viewing it at 30fps, then you'll only perceive fractions of the movement. If you perceive at 15fps, you'll see gaps (it looks like the object teleports to the next key). Alternatively, if you're viewing at 60fps and the object moves at 30fps, you'll have more time to process the movement information.
 

woods

Member
don't start worrying about optimization until there is a problem, but no one says when exactly this problem state starts
short answer
there is no definitive line that says you should start looking at optimizing.


if your game tanks down to "unplayable", sure.. there is something amiss that needs your attention..

a couple things I found early in my learnings..
drawing something to screen over and over again, such that it bogs down your machine's performance //poor code that should have looped once or redrawn only after a change

having too many things outside of the room/view that are unimportant //do you need 500 enemies off screen that the player prolly will never see? better to deactivate them when not needed
destroying lists and maps and such once they become irrelevant //after x hours my game bogs down to a crawl
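The last two tips can be sketched in GML (the variable names `my_list` and `my_map` here are hypothetical placeholders, and the region values assume view 0 is the active camera):

GML:
/// Step event of a controller object (sketch)
// Deactivate every instance outside the current view, except this controller.
// Remember to call instance_activate_region() before those instances are needed again.
instance_deactivate_region(camera_get_view_x(view_camera[0]),
                           camera_get_view_y(view_camera[0]),
                           camera_get_view_width(view_camera[0]),
                           camera_get_view_height(view_camera[0]),
                           false, true);

/// Clean Up event (sketch)
// ds_* structures are not garbage collected - destroy them or they leak memory.
ds_list_destroy(my_list);
ds_map_destroy(my_map);

Deactivated instances stop running their events entirely, which is usually far cheaper than having hundreds of off-screen objects ticking every step.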



but these are all things that can be fixed as they surface, or as the project starts to grow into a larger scope.
don't worry about getting that extra 5 or 10 FPS... save that for the end game.. you're working on minor bugfixing and fine-tuning at that point

keep on keepin on!
 

TsukaYuriko

☄️
Forum Staff
Moderator
Some things to note when working with FPS...

First of all, it's all relative. The FPS you see represents the theoretical FPS your game could run at before running out of time to render each frame. This means that, if your target FPS is 60 and your real FPS is 120, your game could run at up to 120 FPS (with all circumstances like what's happening in the game right at that moment considered). If your target FPS was 144, or you intend to support this frame rate natively as an option, though, you'd already be lagging.

That aside, it's also relative to the device the game is running on - that is, your device. If you're developing on a low-end device and get terrible performance, that doesn't mean that people on high-end devices will get equally terrible performance. On the other hand, if you're getting acceptable performance on your high-end device, that doesn't mean people on low-end devices won't be lagging.

Finally, differences in FPS are relative to the upper and lower ends of the difference and cannot be measured by the raw difference alone. The difference between 60 FPS and 30 FPS is massive. The difference between 3000 FPS and 1500 FPS sounds massive, but is actually less of a difference than between 60 and 30. If we take a closer look at what these FPS values mean, we can figure out how long each frame takes (or has time) to be processed and rendered before things start lagging:

60 FPS is equivalent to roughly 16 milliseconds per frame.
30 FPS is equivalent to roughly 33 milliseconds per frame.

The difference in FPS is 30. The actual difference is that a frame has roughly 16 milliseconds less time to process and render at 60 FPS compared to 30 FPS.

3000 FPS is equivalent to roughly 0.3 milliseconds per frame.
1500 FPS is equivalent to roughly 0.7 milliseconds per frame.

The difference in FPS is 1500. The actual difference is that a frame has roughly 0.4 milliseconds less time to process and render at 3000 FPS compared to 1500 FPS.

This is why dropping from thousands of FPS to fewer thousands of FPS shouldn't make you panic, but getting anywhere close to your target frame rate - or below it - should, because you're about to enter lag hell.
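The conversion used above is simply: frame budget in milliseconds = 1000 / FPS. As a quick sketch (assuming GMS 2.3+ array syntax), this prints the per-frame budget for a few rates:

GML:
/// sketch: per-frame time budget at various frame rates
var rates = [30, 60, 144, 1500, 3000];
for (var i = 0; i < array_length(rates); i++)
{
    // 1000 ms in a second, divided by frames per second
    var ms = 1000 / rates[i];
    show_debug_message(string(rates[i]) + " FPS -> " + string(ms) + " ms per frame");
}

Comparing budgets rather than raw FPS numbers makes it obvious why 60 vs 30 matters far more than 3000 vs 1500.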
 