What is this obsession with framerates over 30FPS?

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
30fps is shitty on a PC monitor plain and simple, the technology in a monitor and a TV are completely different.

In a nutshell, TVs use something called interlaced video, which (without getting technical) effectively doubles the motion rate. Monitors don't use this, so 30 fps looks terrible; low fps doesn't sustain the illusion of motion.
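The interlacing point above can be sketched as rough arithmetic (a hedged sketch: interlacing doubles the number of motion samples per second, at the cost of half the vertical resolution per field, rather than literally doubling fps):

```python
# Sketch: interlaced video splits each frame into two fields (odd and
# even scan lines), so motion is sampled at twice the nominal frame
# rate, at half the vertical resolution per field.
def motion_samples_per_second(fps: float, interlaced: bool) -> float:
    """Distinct moments of motion shown per second."""
    return fps * 2 if interlaced else fps

print(motion_samples_per_second(30, interlaced=True))   # 60 fields/s on a TV
print(motion_samples_per_second(30, interlaced=False))  # 30 frames/s on a monitor
```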


Plus this is 2012: any console or desktop PC capable of playing games can run the majority of them at 60fps comfortably. There are a few exceptions (BF3 and Rage on Xbox 360), but for the most part it's easily feasible (especially on a PC, which generally has more modern hardware than the consoles).

Oh, and movies are a totally different kettle of fish altogether, so comparing them to a game is like comparing a jet engine to a normal combustion engine.
 

Neonit

New member
Dec 24, 2008
477
0
0
If you see no difference between 30 and 60 fps then I really am envious. I, for one, can VERY clearly see a difference. I'm even willing to go as far as disabling shadows in many games to get from 50 fps to 60, let alone from 30 to 60.

also, there is NO reason why it should be locked on the pc.
 

wintercoat

New member
Nov 26, 2011
1,691
0
0
TehCookie said:
wintercoat said:
TehCookie said:
There is a huge difference in the looks and smoothness of a game. If you don't think you can see a difference look at this: http://boallen.com/fps-compare.html

If you have a slower game it doesn't matter as much, but in fast paced action games it makes a world of difference. Especially when you have timing involved, more frames gives the developer more control over the timing involved in attacks/dodges/stuff.
I found the 15 fps example to be rather smooth, with no difference other than speed between the 30 and 60 fps examples... I think I may be broken.

OT: Iunno OP, but if the above is anything to go by, I'm not exactly a good yardstick.
This may seem like a dumb question, but did you get the FPS OK in the corner? Either that or maybe you should get your eyes checked out.
Yup, it said fps okay. To be honest, it has been a while since I got my glasses prescription updated. Like... 10 years. >.> Been meaning to, but I just haven't gotten around to it.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
SmashLovesTitanQuest said:
Movies and TV are different from video games. Your comparison is invalid.

/thread
Indeed. 24 fps in a video game is totally different than watching a movie. I didn't know that some people still don't understand that.
 

lokicdn

New member
Sep 10, 2010
46
0
0
Windknight said:
Ok, essentially, as I understand it, any frame-rate of about 10-20 or more is enough to provide an illusions of a moving picture. Indeed, movies and television have a framerate of 24 FPS, and no-one seems to find any problem with them being choppy or slow.

So why so much freakout at frame-rates being capped at 30 FPS, or this obsession with getting it up to 60? If you've surpassed the point needed to create the illusion of a fluid, moving picture, do you really need to push it even farther? Or is this some 'OMG GOTTA SHOW OFF MY HARDWARE POWER!' thing that's just posing and showing off?
There is a substantial difference between visual persistence and the maximum rate at which the human eye can perceive individual movement. In any game that involves reading and reacting (i.e. a shooter), faster is obviously better.
 

BrotherRool

New member
Oct 31, 2008
3,834
0
0
chadachada123 said:
My current PC runs Minecraft at between 10 and 20 FPS, and I've been pretty good with that for, well, years, so anytime people complain about a 'mere' 30 FPS, I usually facepalm. First-world problems.

As long as it's over 30fps, it's perfectly acceptable, in my eyes. When I get my new laptop next week, with an awesome graphics card and plenty of memory, then I'll come back and say whether or not 30fps vs 60fps is objectively a big deal, or only an issue for videophiles that are just obsessing over getting things to look 1% better.

So to sum up, from my personal experience so far:

Going from sub-20fps to 24fps: pretty big difference.
Going from 24fps to 30fps: a big difference.
Going from 30fps to 40fps: not a big difference.
Going from 40fps to 60fps: barely noticeable difference, and nothing to ***** about.
I believe there isn't actually a physical difference between 30FPS and 59FPS when you're playing games. I think it's because it has to sync up with the refresh rate of the screen and eye. I can't find where I read that, though.
 

Paragon Fury

The Loud Shadow
Jan 23, 2009
5,161
0
0
The big debate over 60FPS is not whether 60FPS is better, but whether the loss of gameplay and immersion elements required to reach 60FPS on consoles is worth it.

For example, I can run Battlefield 3 on High at 60FPS on my computer - a damn good experience. However, my 360 can only run the game at 30FPS with lowered graphical settings - it's capable of pushing to 60FPS, but to do so you'd have to strip out gameplay and immersion elements like destruction.

Or take the eternal debate over Call of Duty. They keep making it one of their selling points that their game runs at 60FPS on all consoles, as if that's some huge advantage. The counterpoint that many of us keep making, and that both Treyarch and Infinity Ward keep refusing to answer, is that many systems could run far more demanding games at 60FPS anyway, and that to keep that precious 60FPS, CoD keeps refusing to add anything that would make the game better, like actual physics or even minor destruction.

Bungie (the makers of Halo) put it best (I'll paraphrase, since it was a long post): Systems like the 360 are perfectly capable of running games like Halo 3 and Reach at near 45 or 60FPS, but it causes stability issues and requires sacrifices in graphics and gameplay systems. Ultimately, we'd rather have a game that runs stably at 30FPS with its gameplay intact than at 60FPS with things stripped out of it.
 

teebeeohh

New member
Jun 17, 2009
2,896
0
0
24fps in movies and TV is not an argument.
It was the cheapest way to do movies with sound back when that was novel, and for some reason it was never changed.
I just hope The Hobbit being shot at a higher frame rate puts an end to that.
Also: everything below 20 is shit, above 30 is OK, and you only really need 60 when playing something very fast.
 

WoW Killer

New member
Mar 3, 2012
965
0
0
While 30 to 60 is noticeable, it isn't that big a deal to me. Anything over 60 I don't believe there's any difference at all. Maybe my eyes are screwed. But generally the higher it is the more I prefer it, because as soon as I switch Fraps onto record I know that frame rate is dropping by a factor of three or four, and that's a difference I really can tell.
 

dillinger88

New member
Jan 6, 2010
133
0
0
It really grinds my gears when people compare movies and games when it comes to frame rates.

A camera gathers light for the entire time the shutter is open so if there is any fast movement there is some "blurring". This makes the frame transitions smoother in film so 24fps is fine. In animated films, they will apply motion blur in post.

Games, however, render a single instantaneous image per frame, so large movements look stuttery at low frame rates. Therefore you need higher frame rates for it to look smooth. Motion blur doesn't work as well here because games run in real time: the engine has to detect movement very quickly and apply the blur before the frame is rendered.
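The shutter arithmetic behind the film side of this can be sketched (assuming a conventional 180-degree film shutter; the numbers are illustrative):

```python
# Film: the camera integrates light while the shutter is open, so each
# frame already contains real motion blur. A 180-degree shutter at 24fps
# exposes for half the frame interval.
def film_exposure_time(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Seconds of light gathered per film frame."""
    return (shutter_angle_deg / 360.0) / fps

print(round(film_exposure_time(24), 4))  # 0.0208 s of motion blurred into each frame

# A rendered game frame is an instantaneous sample: zero integration
# time, so the gaps between samples read as stutter at low frame rates.
game_exposure_time = 0.0
```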

There is a difference. After playing games for so long at 60fps, I can see when something runs slower, even without a side-by-side comparison. It's just what I'm used to.
 

mishagale

New member
Sep 22, 2009
77
0
0
The thing about the ~24fps we see on TV and in the cinema is that it actually is kind of too slow. You can't do a very fast pan at 24fps because there aren't enough frames to show the changing image. In a lot of games that wouldn't be a problem, but in just about any modern FPS you need to be able to turn your "camera" very quickly, and if the view appears to stutter when you do, it could get you shot.
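The fast-pan problem can be put in numbers (the 180-degrees-per-second turn speed here is a hypothetical figure, not from the post):

```python
# How far the view jumps between consecutive frames during a camera pan.
def degrees_per_frame(pan_speed_deg_per_s: float, fps: float) -> float:
    return pan_speed_deg_per_s / fps

print(degrees_per_frame(180, 24))  # 7.5 degree jump per frame: visible judder
print(degrees_per_frame(180, 60))  # 3.0 degree jump per frame: far smoother
```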
 

RedLister

New member
Jun 14, 2011
233
0
0
My eyes are crap, so I don't always notice much of a difference between, say, 40FPS and 60FPS. I doubt my old GeForce 9500GT could run much at 60FPS anyway.
 

The Heik

King of the Nael
Oct 12, 2008
1,568
0
0
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is 0.033 seconds between each frame. At 60 FPS there is 0.0167 seconds between frames. That's already a pretty minuscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are 0.100 seconds, six times the frame time at 60 FPS. If the difference between 30 and 60 FPS is a workable amount for you, then congratulations, you're officially superhuman, but no regular member of Homo sapiens could actually utilize such a meager amount of time to any significant, measurable difference in player capability.
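The frame-interval arithmetic above, checked directly (the 0.100 s reaction time is the figure the post cites):

```python
# Frame intervals at 30 and 60 FPS versus a 0.100 s human reaction time.
def frame_interval(fps: float) -> float:
    """Seconds between consecutive frames."""
    return 1.0 / fps

reaction_time = 0.100  # fastest recorded human reaction, per the post above

print(round(frame_interval(30), 4))                  # 0.0333 s at 30 FPS
print(round(frame_interval(60), 4))                  # 0.0167 s at 60 FPS
print(round(reaction_time / frame_interval(60), 1))  # 6.0: a reaction spans ~6 frames
```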
 
Dec 14, 2009
15,526
0
0
The Heik said:
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is 0.033 seconds between each frame. At 60 FPS there is 0.0167 seconds between frames. That's already a pretty minuscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are 0.100 seconds, six times the frame time at 60 FPS. If the difference between 30 and 60 FPS is a workable amount for you, then congratulations, you're officially superhuman, but no regular member of Homo sapiens could actually utilize such a meager amount of time to any significant, measurable difference in player capability.
Which would be a valid criticism if combos were down to individual frames, which they're not.
 

Jamash

Top Todger
Jun 25, 2008
3,638
0
0
I suppose a lot of it is down to personal preference and the differences between what people are used to, what individuals can perceive, and what they think they can perceive.

If someone is adamant that they can't tell the difference between 30fps and 60fps, then that's great for them and their enjoyment of media.

Likewise, if someone is adamant that they can tell the difference between 30fps and 60fps, then obviously it is more important to them and mustn't be dismissed as mere graphics whoring.

However, there is a very noticeable and undeniable difference between 24fps, 30fps and 60fps, but you may not actually be aware of it unless you have a side by side comparison (you could play many different isolated games without being aware they run at different frame rates).

This tool is the best comparison of frame rates I've come across, as it allows you to change the frame rates of different objects (and add more objects) within the same image, giving an unrivalled comparison:

http://frames-per-second.appspot.com/

Personally, I'm fine with whatever frame rate a game throws at me, as long as it's consistent. I'll happily play a game at 30fps as long as it's 30fps at all times and while 60fps is obviously smoother and better for gameplay (especially racing games), that 60fps is worthless if it slows down at any point. I'd rather have a game capped at a consistent 30fps than a game that runs at 60fps but will suffer the occasional slow-down.
 

Vivi22

New member
Aug 22, 2010
2,300
0
0
chadachada123 said:
My current PC runs Minecraft at between 10 and 20 FPS, and I've been pretty good with that for, well, years, so anytime people complain about a 'mere' 30 FPS, I usually facepalm. First-world problems.
Play something a bit more action focused than Minecraft and you might notice why people want more than 30fps. Hell, getting below 15 or so is reaching the point where the human eye can pick out the individual frames, and action oriented games at frame rates below 30 are frustrating (dropped frames could mean death in that case). Below 20 and you're basically talking about something that's unplayable if it's got even a reasonable pace.

So to sum up, from my personal experience so far:

Going from sub-20fps to 24fps: pretty big difference.
Going from 24fps to 30fps: a big difference.
Going from 30fps to 40fps: not a big difference.
Going from 40fps to 60fps: barely noticeable difference, and nothing to ***** about.
Have you actually played games that ran between 30-60fps? I only ask because while the difference between 30 and 60 may not be as noticeable as, say, the difference between 10 and 30 (by simple virtue of the fact that the latter is the difference between seeing individual frames and not), it is absolutely noticeable nonetheless. 60fps is much smoother, results in fewer dropped frames, and things like screen tearing from fast movement become less of an issue, if not disappear completely, at frame rates higher than 30.

Honestly, it's nice that most companies at least focus on a bare minimum playable goal of 30fps and all. But I'd like to see the standard bumped up to 60. On the PC it doesn't matter as much since settings are adjustable. But on console it'd be a nice touch, and yes, it really is so noticeable that I'd like to see every game run at that speed.
 

More Fun To Compute

New member
Nov 18, 2008
4,061
0
0
The Heik said:
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is 0.033 seconds between each frame. At 60 FPS there is 0.0167 seconds between frames. That's already a pretty minuscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are 0.100 seconds, six times the frame time at 60 FPS. If the difference between 30 and 60 FPS is a workable amount for you, then congratulations, you're officially superhuman, but no regular member of Homo sapiens could actually utilize such a meager amount of time to any significant, measurable difference in player capability.
That's not how it works in terms of the whole system. A very finely tuned 60fps game has around 66ms of input latency, while a solid 30fps game has twice that, and it often goes up to over 200ms if the engine isn't tuned or the HDTV is laggy.

And in fighting games, losing every other frame of animation means losing a lot of information about what is happening: is this move X or move Y, how obvious is the difference, and so on. At 60fps, developers can convey the same amount of visual information in a shorter amount of time, making action games feel a lot faster.
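The latency figures in the first paragraph can be reproduced with simple arithmetic, assuming a hypothetical pipeline of about four frames between input sampling and display (the four-frame figure is an assumption, not from the post):

```python
# Input-to-display latency grows with frame time: the same pipeline of
# buffered frames costs twice as much wall-clock time at 30fps as at 60fps.
def input_latency_ms(fps: float, pipeline_frames: int = 4) -> float:
    """Approximate milliseconds from button press to pixels on screen."""
    return pipeline_frames / fps * 1000.0

print(round(input_latency_ms(60)))  # ~67 ms for a finely tuned 60fps game
print(round(input_latency_ms(30)))  # ~133 ms at 30fps with the same pipeline
```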
 

remketh

New member
Oct 11, 2010
8
0
0
It's true that you can't really tell the difference over 30 fps, but remember the number displayed is an average. Say you normally run at 60 fps, and for the first half of a second the game can't render quickly enough and only produces 5 frames, then manages 30 in the second half. You see "my fps dropped to 35", which in theory you still shouldn't be able to detect; what you actually detect is the shit performance in that first half-second. Having the capability to run at higher fps makes it increasingly unlikely for this to occur.
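The scenario described above, worked through: an average-fps counter smooths over exactly the stall the player feels.

```python
# One second of play: a stall in the first half, recovery in the second.
frames_first_half = 5    # frames rendered in the first 0.5 s (the stall)
frames_second_half = 30  # frames rendered in the next 0.5 s

average_fps = frames_first_half + frames_second_half  # frames over one second
print(average_fps)  # 35: the counter still looks acceptable

# The worst-case frame interval is what the player actually perceives.
worst_interval = 0.5 / frames_first_half
print(worst_interval)  # 0.1 s between frames: a very visible hitch
```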