
Question about 60fps vs 30fps

Stubbjax

Member
Yeah sure, game dev is hard. But the way I see it, delta time adds unnecessary complexity as part of the solution to a problem that I really just don't see as that big of a deal. Lag in all its forms is bad, and I find the idea of easier development + consistency much preferable to the alternative.

Sure, you can use delta time to "cope" with unstable frame rates, but I see unstable frame rates as the real problem, and delta time as the band-aid - merely a cover for the underlying problem of an unoptimised game. I have used delta time extensively in Unity, but even then I felt the need to use 60 * Time.deltaTime so that I could use all the same frame-based values as I was used to in GM.
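For reference, GM has a direct equivalent: the built-in delta_time, which measures the time since the previous frame in microseconds. A minimal sketch of the same trick (move_speed here is a hypothetical per-frame value tuned for 60fps):

    // Step event: keep frame-based values, scaled by GM's built-in delta_time.
    // delta_time is measured in microseconds since the previous frame.
    var scale = (delta_time / 1000000) * 60; // 1.0 when the game runs at exactly 60fps
    x += move_speed * scale;                 // move_speed: a per-frame value, as in a fixed 60fps game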

I am aware that many AAA titles can swap between 30 and 60 fps, but that may be irrelevant as the game loop can often run at 30 fps but render at 60 fps. For example, several games I know run with a fixed game loop speed of 30 and can render at 60 fps, but if the frame rate drops below 30, the game begins to slow down.
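That behaviour usually comes from a fixed-timestep accumulator: the logic advances in constant 1/30s ticks while the renderer draws as often as it can. A rough GML-style sketch (not GM's actual loop; simulate() is a hypothetical script holding the game logic, and tick_accum starts at 0 in the Create event):

    // Step event: fixed 30Hz logic under a faster render loop.
    tick_accum += delta_time / 1000000;       // seconds elapsed since the last frame
    if (tick_accum > 0.25) tick_accum = 0.25; // clamp after long hitches; this is where the game "slows down"
    var step = 1 / 30;                        // fixed logic timestep: 30 updates per second
    while (tick_accum >= step) {
        simulate(step);                       // hypothetical script: advance the game state by one tick
        tick_accum -= step;
    }
    // The Draw event can then interpolate between the last two logic states for smooth 60fps output.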

Using delta time just seems so messy to me, but perhaps I have never been introduced to a proper implementation. So many things come to mind that just don't seem possible when regarding delta time, so maybe I'm thinking of delta time from the wrong perspective. Either way, I'm fairly certain games that use delta time cannot 100% guarantee that the game logic will output exactly the same results, as well as the fact that it's quite simply much more work to implement. And those two facts alone are enough to put me off.

And I wasn't referring to just join date, but post count as well!
 

GMWolf

aka fel666
I am aware that many AAA titles can swap between 30 and 60 fps
Not just swapping between fps: they can run at any fps (above 20 or so) and still deliver the same logic.
I tend to run my competitive games at ~70 fps, on a 60Hz monitor.

Using delta time just seems so messy to me, but perhaps I have never been introduced to a proper implementation.
Probably. Delta timing actually seems neater to me as it introduces the dt term back into the mathematical equations.

So many things come to mind that just don't seem possible when regarding delta time, so maybe I'm thinking of delta time from the wrong perspective.
Such as what? I'm fairly certain you can deal with practically any problem.

Either way, I'm fairly certain games that use delta time cannot 100% guarantee that the game logic will output exactly the same results, as well as the fact that it's quite simply much more work to implement.
They can guarantee effectively identical game logic.
It's not that much more work: you just re-introduce the dt terms in your integrations, and use a time value instead of frames_passed for timers.
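In GML that might look something like the following sketch (vel_y, accel_y, time_passed and fire() are hypothetical names; delta_time is the built-in microsecond counter):

    // Step event: re-introduce the dt term into the integration.
    var dt = delta_time / 1000000; // seconds since the last frame
    vel_y += accel_y * dt;         // v = v + a*dt (values are now per-second, not per-frame)
    y += vel_y * dt;               // y = y + v*dt

    // Timers: accumulate time instead of counting frames.
    // Frame-based (breaks whenever the frame rate varies):
    //     frames_passed += 1; if (frames_passed >= 90) fire();
    // Time-based (1.5 is the delay in seconds):
    time_passed += dt;
    if (time_passed >= 1.5) fire();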

Really, it just sounds to me like you are saying delta timing is bad because it's difficult! It's just something you need to learn, and it really isn't that hard.

And I wasn't referring to just join date, but post count as well!
Oh wow!
 

Nocturne

Friendly Tyrant
Forum Staff
Admin
Okay, this topic has been tidied and some posts removed at a user's request. Sorry if this makes things slightly harder to follow, but I think there is still merit in the discussion and most of the interesting points are still visible and relevant, so it's staying open for now.
 

RangerX

Member
General consensus was that it looked awful and cheap, actually. I heard a lot of people comparing it to "Telemundo soap operas," hahah.
Most probably a good sign. People have associated "24 fps" with "cinematic feel" for 70+ years.
Also, the cameras used to film certain TV series shoot at 30fps, so that smoothness is also associated with "cheaper" in people's minds.
Can't blame them, but it's time to move into the "smooth world" now: Full HD (or more) and 60 fps.
The more lifelike a movie is, the better it should be.
 

Misty

Guest
General consensus was that it looked awful and cheap, actually. I heard a lot of people comparing it to "Telemundo soap operas," hahah.
Basically, yes.

Most movies are fantasy... the low frame rate of a movie creates a sluggish feeling, makes the person sleepy, and can help them get into the fantasy of the movie.
FPSs like Doom and Heretic don't really need a high fps, because they are like fantasy FPSs.

Documentaries, you want in a high fps.
Racing games, flying games, realistic FPS games, you want in a high fps, because you want to feel like it's "real life" and not a sluggish fantasy.

Fighting games and adventure games get good ratings even at 20 fps. OOT got Game of the Year at 20 fps, because it was more like a story than a simulation.

Most probably a good sign. People have associated "24 fps" with "cinematic feel" for 70+ years.
Also, the cameras used to film certain TV series shoot at 30fps, so that smoothness is also associated with "cheaper" in people's minds.
Can't blame them, but it's time to move into the "smooth world" now: Full HD (or more) and 60 fps.
The more lifelike a movie is, the better it should be.
No, it is not only a cultural phenomenon; the reason people don't like 48 fps movies is because of the way the brain inherently operates. Below 30 Hz is a trance state, and the higher the framerate, the more vivid and less drowsy it becomes... you don't want to put movie-watchers in a "hyper-awake state" while engaging them in a fantasy. 60 fps looks like home video and loses the epic feel of it. The people just look like ordinary people that way.
 

RangerX

Member
No, it is not only a cultural phenomenon; the reason people don't like 48 fps movies is because of the way the brain inherently operates. Below 30 Hz is a trance state, and the higher the framerate, the more vivid and less drowsy it becomes... you don't want to put movie-watchers in a "hyper-awake state" while engaging them in a fantasy. 60 fps looks like home video and loses the epic feel of it. The people just look like ordinary people that way.
I still think that if movies had always been 60 fps, you guys would all wish for 60fps. It's clearly a habit.
Otherwise that logic would apply to everything: 60 fps videogames would look cheap and non-epic, real life would totally not be epic, etc.
 

GMWolf

aka fel666
I still think that if movies had always been 60 fps, you guys would all wish for 60fps. It's clearly a habit.
Otherwise that logic would apply to everything: 60 fps videogames would look cheap and non-epic, real life would totally not be epic, etc.
I don't think anyone here has seriously suggested that 30fps is better in games.
 

RangerX

Member
I don't think anyone here has seriously suggested that 30fps is better in games.
No, but here's some interesting anecdotal evidence: Gran Turismo (notorious for its 60fps racing beauty) had replays at 60fps for the first 4 games or so. With Gran Turismo 5 on PS3, they said that after much user feedback they put the replays at 24fps + blur to make them "more cinematic". So basically, yes, people sometimes prefer a lower FPS in videogames, and (at least in my example here) solely because their brains associate "good cinematic feel" with "24 fps + blur".
 
No, but here's some interesting anecdotal evidence: Gran Turismo (notorious for its 60fps racing beauty) had replays at 60fps for the first 4 games or so. With Gran Turismo 5 on PS3, they said that after much user feedback they put the replays at 24fps + blur to make them "more cinematic". So basically, yes, people sometimes prefer a lower FPS in videogames, and (at least in my example here) solely because their brains associate "good cinematic feel" with "24 fps + blur".
It could be because of habit, but I don't think so. Even famous animators have said that using a lower framerate makes an animation feel chunkier and more convincing, rather than floaty and weightless. I really think there's something in a lower framerate, rather than it just being an arbitrary number. I used to work in an electronics shop; when we first got 120Hz TVs in, everyone who saw the higher framerate demos said it looked weird and unnatural. I think there's a kind of "hyper-real" uncanny valley going on with high-framerate cinema. If I had to guess why games don't suffer from this effect, it's probably because games:
1.) Don't look very realistic compared to real life, so there's no uncanny valley going on, and
2.) Are controlled by the player, so any feeling of "this is weirdly smooth" is overrun by "this game is responsive and reacts well to my inputs."

I dunno, heheh. It's hard to find anyone who's unbiased about this, since like you say, cinema has been 24fps forever.
 

seanm

Guest
60fps looks really, really ugly most of the time, and it has more input lag than higher framerates as well (even if you're running higher framerates on a 60Hz monitor).
But most people use 60Hz monitors and are accustomed to how terrible everything looks and feels on them anyway. You absolutely should not make a 30fps game that involves any kind of action, but most people aren't going to complain about 60fps.

I'll complain about 60fps though.

Aesthetically, it's not so bad on console, since you sit so far away from the screen. But on PC, 60fps looks terrible if there are any large moving visuals (like the camera moving around): FPS games, platformers, adventure games, etc.

Delta timing isn't even really that annoying to do in GameMaker; it's just a bummer the logic and draw loops are connected.
Actually, are they connected in GM2? If GM2 can handle multiple refresh rates better than GMS, I'll finally be able to stop thinking about switching to Unity; and I hate using Unity.


Oh, and running a game at 60fps is actually fine, as long as the screen is allowed to update at higher monitor refresh rates. This is not something that GMS allows, so it's not really relevant here, but it's important to know: it seriously cuts down on jittering, blurriness, and ghosting.

TL;DR: The NES ran at 60fps. Don't make a 30fps game, for the love of god.

"The difference between 60 and 120fps is so small no one will even notice anyways"
Always makes me laugh when I hear that.
 

RangerX

Member
"The difference between 60 and 120fps is so small no one will even notice anyways"
Always makes me laugh when I hear that.
I sort of was waiting for you in this thread! LOL
You are one sensitive mofo with framerate.
I can probably see a difference between 60 and 120fps, but I sure don't give a damn. And since 99% of people probably think that too, making a game run at a constant 120fps is a waste of development time.
To each his own tolerance, I guess.
 
I sort of was waiting for you in this thread! LOL
You are one sensitive mofo with framerate.
I can probably see a difference between 60 and 120fps, but I sure don't give a damn. And since 99% of people probably think that too, making a game run at a constant 120fps is a waste of development time.
To each his own tolerance, I guess.
Ironically, you could say the same about 30 to 60fps, too. X'D
More like 80%-90% probably (though I'm guessing closer to 90%), but still, I don't think the vast swaths of people playing Undertale, CoD, Madden, Minecraft, etc. care all that much about having 60fps. Like I mentioned earlier in the thread, Bloodborne and Breath of the Wild were both huge critical successes despite their 30fps framerates, hahah. I've never heard of a game that got critically or popularly panned for only being 30fps, though I have heard of games being panned for not being pretty enough.
 

seanm

Guest
Honestly though @RangerX, you're right. High refresh rate monitors make up around 1-2% of the gaming monitors on the market. (At least for now; that number is likely to skyrocket in the coming years.)

The differences are huge and tangible for both aesthetics and input lag, but most people don't even know what framerate is, so they'll just have some nebulous understanding of it: "floaty controls, bad graphics, hard to follow the action", etc.
 

RangerX

Member
I doubt it will catch on. Those are false needs that people don't really have, in my opinion.
There's an education that isn't there, a sensitivity to this that most people don't have. Those super monitors (just like 4K TVs) will rake in numbers only because at some point they will be the only monitors people can buy.
For something to catch on in the mass market, people must have a need for it, or the maker must teach people to have that need. I just don't see 120fps (or 4K resolution, for that matter) ever becoming a real need.
And needs aren't infinite. The more comfortable people get, the fewer new needs you can create for them. A quick example with music quality: until we reached "CD quality", people always wanted better sound (because honestly, some old turntables sounded really bad, and magnetic cassettes were downright horrible). But when "CD quality" happened, people were satisfied, and sound quality pretty much ceased to evolve. Now they just play with sound delivery techniques: 5.1, 10.1, etc. Sample quality is not a real concern anymore, and our music today barely sounds "better" than "CD quality" even after some 25 years. Why? Because the need for "better sound" is no longer there. People are satisfied, so we can't really create more "needs" for them.

This phenomenon will happen with resolution, graphic quality, every aspect of artistic products; each will inevitably reach the point where there is no need for "better".
I think the last resolution that will truly matter is 1080p, and 60fps will probably be the last framerate that truly matters too. I'd sooner expect people to move to display methods other than screens. It feels more plausible.
Anyhow, I don't want to derail too much, but I love futurist thinking (and how most futurists throughout history got it all wrong because they thought about the future with their heads in the present).
 

Ethanicus

Guest
Actually, I didn't have the chance to see the 48 fps version; there wasn't a theater near me showing it, or I didn't have the time. I'm really disappointed it didn't catch on.
I wonder if the DVD/BRD version plays at 48 fps? Does anyone know?

Anyways, I've never seen anything NOT look and feel better at a higher FPS, so I'm sure the Hobbit must be awesome.
When I watch movies in theaters, I see how choppy the movement is and what a blurry mess the action scenes are. I'm really getting annoyed lately.
Mm, I think you're missing something here. You seem to assume FPS preference is a trained thing. I think the reason people prefer 24 over anything else in movies is that it most closely resembles actual real-life motion blur, which you seem to think is horrible. Wave your hand past your face: it won't be crisp and clean, it blurs. If anything, that means blur is more realistic.
I've seen an effect in many movies where they record the scene with high-framerate cameras and play it back at normal speed, eliminating blur and giving everything a slightly "quick" look. It looks good in action scenes, but it kinda makes everything look cheap (objectively, not just by training, I think) and almost TOO real. Like I said, the Hobbit is actually famous for making people nauseated, even causing vomiting.
I don't think it's the framerate that's wrong, but as of yet you haven't really given any examples for what you're saying. I'd also argue that framerate isn't as big a deal in movies, since film is a passive medium where input and responsiveness don't matter as much as it just looking right.
 

RangerX

Member
Blurring is my eyes' job. When I watch something at 60fps, it blurs naturally, just like real life. At 24 fps, there's a lot more blur. My eyesight is immensely better than what a movie displays to me. Look, this is something I've felt and thought about for a LONG time. 24fps + blur is choppier than what my eyes see of "real life". A lot choppier. 60 fps is where I feel we hit the "real life" sweet spot, or where my brain stops caring. You all see much "smoother" than 60fps, guys; you've just grown accustomed to 24fps + blur.
I don't think anybody will have further real arguments. And people have always had, and always will have, motion sickness. It's not an argument for anything, really. The people getting nausea probably have other issues; people get sick in cars, on boats, on a ferris wheel. That only has to do with how their brains analyse and feel movement and space. And the clearer the image and the better the FPS (the better the output), the more those people will feel their sickness. That doesn't mean clearer and smoother images shouldn't exist. There's no way out: some people will always prefer choppy, whatever their reasons, and some people like me, and a lot of other "me's" in the world, will want smoother.
 

Ethanicus

Guest
Blurring is my eyes' job. When I watch something at 60fps, it blurs naturally, just like real life. At 24 fps, there's a lot more blur. My eyesight is immensely better than what a movie displays to me. Look, this is something I've felt and thought about for a LONG time. 24fps + blur is choppier than what my eyes see of "real life". A lot choppier. 60 fps is where I feel we hit the "real life" sweet spot, or where my brain stops caring. You all see much "smoother" than 60fps, guys; you've just grown accustomed to 24fps + blur.
I don't think anybody will have further real arguments. And people have always had, and always will have, motion sickness. It's not an argument for anything, really. The people getting nausea probably have other issues; people get sick in cars, on boats, on a ferris wheel. That only has to do with how their brains analyse and feel movement and space. And the clearer the image and the better the FPS (the better the output), the more those people will feel their sickness. That doesn't mean clearer and smoother images shouldn't exist. There's no way out: some people will always prefer choppy, whatever their reasons, and some people like me, and a lot of other "me's" in the world, will want smoother.
Well, assuming you're right, that would mean we'd have to start making movies at multiple FPSs to please multiple types of people. What you're implying sounds a little complicated. And since obviously neither of us has provided proof beyond personal experience and conjecture, we're not really going to reach an agreement.
 

GMWolf

aka fel666
Blurring is my eyes' job. When I watch something at 60fps, it blurs naturally, just like real life. At 24 fps, there's a lot more blur. My eyesight is immensely better than what a movie displays to me. Look, this is something I've felt and thought about for a LONG time. 24fps + blur is choppier than what my eyes see of "real life". A lot choppier. 60 fps is where I feel we hit the "real life" sweet spot, or where my brain stops caring. You all see much "smoother" than 60fps, guys; you've just grown accustomed to 24fps + blur.
I don't think anybody will have further real arguments. And people have always had, and always will have, motion sickness. It's not an argument for anything, really. The people getting nausea probably have other issues; people get sick in cars, on boats, on a ferris wheel. That only has to do with how their brains analyse and feel movement and space. And the clearer the image and the better the FPS (the better the output), the more those people will feel their sickness. That doesn't mean clearer and smoother images shouldn't exist. There's no way out: some people will always prefer choppy, whatever their reasons, and some people like me, and a lot of other "me's" in the world, will want smoother.
Wrong.
One word: tennis.

When tennis matches started being filmed at high frame rates (120fps), people complained about headaches when watching the match on 60fps monitors. Turns out it was because of reduced motion blur.
Now they add artificial motion blur to the ball to relieve headaches.

Turns out your eyes won't always blur perceived motion.
 

Niels

Member
Ah, the endless 30 vs 60 fps debate has found the YoYo Games forum ;)

As a primer, I would do the following:
- If you are building an action game that needs fast reaction times: go 60 fps.
- If you are making a strategy/puzzle/card game: 30 fps is enough.
 

RangerX

Member
I still think the motion sickness argument that is constantly pulled out here is irrelevant.
I know multiple people who can't play a first-person shooter at 30fps because they get nausea and such, but they're fine with 2D games.
Does that mean we should make 15fps games? Does it mean 3D games should have a 2D version? The logic applied in this thread is wrong.
There will always be motion sickness; it's irrelevant to the topic and not a good reason to avoid improving visual fidelity.
 

seanm

Guest
Well there's a reason pretty much all esports pros use >120hz monitors now. More clarity, more available information, and less input lag.

As an audiophile myself, I would argue 144Hz is closer to "CD quality".
 

Ethanicus

Guest
Well there's a reason pretty much all esports pros use >120hz monitors now. More clarity, more available information, and less input lag.

As an audiophile myself, I would argue 144Hz is closer to "CD quality".
I'm definitely not arguing for less than 60 in games; it's perfectly good there.
But like Fel said, there's something about an object physically moving that causes optical motion blur which artificial movement just can't create. We don't really understand why, but we can see that it's true. Overwatch actually goes to lengths to erase motion blur, which is fine, because it's a game and the blur would only make it look worse.
 

Misty

Guest
What I propose is the pendulum test. Videotape a pendulum and play it in the top half of a TV at a high framerate. (The TV is a special TV where the other half is fake: just a glass window, with a real pendulum behind it.)

Then get test subjects to look at the two at a high fps and guess which one is real, but when they guess, never tell them whether they are right or wrong. Reduce the FPS until they can correctly guess (with emphasis and clarity) which is the virtual pendulum, i.e. when they start to shout "I am certain that is the virtual pendulum!". That is your minimum frame rate for feeling realism.
 

GMWolf

aka fel666
What I propose is the pendulum test. Videotape a pendulum and play it in the top half of a TV at a high framerate. (The TV is a special TV where the other half is fake: just a glass window, with a real pendulum behind it.)

Then get test subjects to look at the two at a high fps and guess which one is real, but when they guess, never tell them whether they are right or wrong. Reduce the FPS until they can correctly guess (with emphasis and clarity) which is the virtual pendulum, i.e. when they start to shout "I am certain that is the virtual pendulum!". That is your minimum frame rate for feeling realism.
Yeah, but comparing like that is unfair.
Watch a 30 fps movie; you won't notice too much jitter.
Then place a 60 fps movie right next to it. Right at the boundary, you will see a significant amount of jitter when the camera is panning, etc.

This is how a lot of the 30fps vs 60fps YouTube comparisons are done, but the method is flawed.

It's better to first show one, then the other, and have people guess which is which.
 

Misty

Guest
Yeah, but comparing like that is unfair.
Watch a 30 fps movie; you won't notice too much jitter.
Then place a 60 fps movie right next to it. Right at the boundary, you will see a significant amount of jitter when the camera is panning, etc.

This is how a lot of the 30fps vs 60fps YouTube comparisons are done, but the method is flawed.

It's better to first show one, then the other, and have people guess which is which.
This isn't about comparing the quality of YouTube videos; it's about determining which FPS is virtually indiscernible from real life.
 

GMWolf

aka fel666
This isn't about comparing the quality of YouTube videos; it's about determining which FPS is virtually indiscernible from real life.
Yes, but side-by-side comparisons are not fair, especially when it's a single video with the left half at 30 and the right half at 60. The jitter is far more noticeable that way.

For the experiment to be more valid, you would have to run the tests one after the other and ask which is smoother.

You would also have to ensure subjects only see with one eye, to remove parallax.
 

Misty

Guest
Yes, but side-by-side comparisons are not fair, especially when it's a single video with the left half at 30 and the right half at 60. The jitter is far more noticeable that way.

For the experiment to be more valid, you would have to run the tests one after the other and ask which is smoother.

You would also have to ensure subjects only see with one eye, to remove parallax.
Dude... you didn't even get the left/right thing right about my experiment.
I said the experiment would be a vertical comparison... no left/right problem... I purposely made it a vertical comparison so people wouldn't blame the result on a left/right brain thing.
Second... where is this 30/60 thing coming from? My experiment is a real pendulum inside a TV, plus a video of a pendulum that looks the same as the one in the TV. Participants have to decide when the video looks significantly more fake than the actual pendulum. And yeah, the "only one eye used" idea is a good one.
 

GMWolf

aka fel666
Dude... you didn't even get the left/right thing right about my experiment.
I said the experiment would be a vertical comparison... no left/right problem... I purposely made it a vertical comparison so people wouldn't blame the result on a left/right brain thing.
Second... where is this 30/60 thing coming from? My experiment is a real pendulum inside a TV, plus a video of a pendulum that looks the same as the one in the TV. Participants have to decide when the video looks significantly more fake than the actual pendulum. And yeah, the "only one eye used" idea is a good one.
It's not a left/right thing.
It's a side-by-side thing.
You can see the difference much more clearly when you can see both at the same time.

But what we are trying to test is the fidelity of the image on its own, not when put side by side (top/bottom or left/right) with a smoother image (60fps or real, it doesn't matter).
 

GMWolf

aka fel666
It's not about that.
It was never about that. Otherwise, all games would try to run at 300fps or more!

It's about what fps makes sense for your game.
Are you making a strategy or puzzle game? 30fps is just fine.

Action platformer or casual shooter? 60-90 fps is pretty good.

Competitive game? 60 is a minimum, but it should run easily at 120+ on any reasonable machine.

Poker or card game? No need to even have frames in the first place!

In the end, you will need to balance time resolution (fps) against image resolution and fidelity. It's a design decision to make like any other.
 

Ethanicus

Guest
While I think you should try for 60 if POSSIBLE, if it just makes the job harder for no reason, don't bother.
 