What is this obsession with framerates over 30FPS?


More Fun To Compute

New member
Nov 18, 2008
4,061
0
0
The Heik said:
And you have a very strange and limited understanding of what reaction time means at all. Its definition is not something up for debate. Reaction time is and has always been defined along the lines of "the time elapsed between identification of stimuli and an appropriate response."

Also, you seem to have completely ignored my post where I explained why I used the .1 second threshold (a post which not only quoted you, but which you then quoted me from, so it mystifies me how you missed it). But I suppose I'll have to explain it again. I used that benchmark because it is literally the fastest reaction time known to man. That was the only reason I used that number: I was using the parameters for the best possible situation. My results were optimistic, so in truth the realistic RTs would be far worse than stated, making any bout between two foes depend more on basic reaction time, as the difference in frame rate between 30 and 60 becomes an even smaller percentage of the total.

Now if you like games to be 60 FPS, that's perfectly fine. It does add to the visual polish of the game, and there are certainly games where it is most appreciated (Skyrim with all the mods at 60 FPS is gorgeous). But that doesn't change the fact that my original point, that 60 FPS doesn't make any usable difference in terms of player capability, still holds up.
It's a waste of time trying to educate you. You think that you already know everything and keep trying to explain elementary things to me that I have already made clear I understand. When I try to explain how things work, you say I'm just trying to overcomplicate it!
 

More Fun To Compute

New member
Nov 18, 2008
4,061
0
0
Starik20X6 said:
My understanding is that the human eye runs at around 72fps, so anything higher than that is a bit of a waste. 24fps is most likely just a holdover from the days of analog film, with 24fps being enough to create fluid movement without the cost of more film becoming overwhelming. Now that we've gone almost all digital, we could conceivably upgrade everything to a faster frame-rate; I suppose it just takes more processing power. So yeah, theoretically the faster the better, but after about 72 there's not much point because you physically can't see faster than that.
The human eye does not run at any framerate. Some visual changes we are really slow to pick up or filter out completely, while other changes, like a brief flash, we can detect even if they last for less than 1/1000 of a second. And no two people are the same, because we are not mass produced in a factory.
 

legendp

New member
Jul 9, 2010
311
0
0
Windknight said:
Ok, essentially, as I understand it, any frame-rate of about 10-20 or more is enough to provide the illusion of a moving picture. Indeed, movies and television have a framerate of 24 FPS, and no-one seems to find any problem with them being choppy or slow.

So why so much freakout at frame-rates being capped at 30 FPS, or this obsession with getting it up to 60? If you've surpassed the point needed to create the illusion of a fluid, moving picture, do you really need to push it even farther? Or is this some 'OMG GOTTA SHOW OFF MY HARDWARE POWER!' thing that's just posing and showing off?
If you have ever played an FPS at 30fps on console and then at 60fps on PC, the question answers itself (the PC feels hugely smoother). It may be down to the fact that we don't interact with a movie, but we do interact with a game. Compare something like Rage (runs at 60fps on console) to Mass Effect 3 (runs at 30fps on console), both of which have demos, and you will feel the difference. The difference is even more noticeable when using a mouse. I can personally say that I can even tell the difference between 60fps and 100fps. Movies are also moving up, some to 30fps, and now The Hobbit at 48fps.

Another thing is dips in fps (when a massive explosion happens). Your framerate may drop from 30fps to 20fps, and that feels bad, but on PC it may only dip to 40fps.
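A rough sketch of the arithmetic behind that, for anyone who wants to see it (frame time is just 1000/fps; the Python below is purely my illustration, not anything from the games themselves):

[code]
# A dip costs more milliseconds per frame at low framerates than at high ones,
# because frame time (1000/fps) grows faster as fps falls.

def frame_time_ms(fps):
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for before, after in [(30, 20), (60, 40)]:
    added = frame_time_ms(after) - frame_time_ms(before)
    print(f"{before}fps -> {after}fps: {frame_time_ms(before):.1f}ms -> "
          f"{frame_time_ms(after):.1f}ms per frame (+{added:.1f}ms)")

# 30fps -> 20fps: 33.3ms -> 50.0ms per frame (+16.7ms)
# 60fps -> 40fps: 16.7ms -> 25.0ms per frame (+8.3ms)
[/code]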

I would suggest trying the Halo: Combat Evolved PC demo. Use the 30fps cap and then turn it off, and you will notice a huge difference in smoothness. (The animations in Halo CE do look dodgy at 60fps, though, because they were created for 30fps; that is why the cap is there.)

http://download.cnet.com/Halo-Combat-Evolved/3000-7441_4-10235611.html

(and use Fraps to make sure you're getting 60fps). Some people even get 120fps monitors because they do feel smoother.

Trying to describe how much better 60fps is than 30fps is like trying to describe colour to someone who is (and always has been) blind: you have to experience it. And you cannot use YouTube, because that runs at 30fps. A good example is your operating system. Operating systems run at 60fps because below that they feel laggy and the mouse cursor feels jerky (it also started with how CRTs couldn't properly display lower framerates).

Edit 1: I used to be able to play games at 20fps, but after playing them at 60fps I struggle to hit the target at 20fps. It's similar to going from 0 degrees to 10 degrees: you will think 10 degrees is very warm, but then go up to 20 degrees and now 10 degrees seems cold. Same principle with games: once you have gotten used to 60fps, 30fps feels sluggish. Admittedly, consoles use tricks to make their 30fps feel smoother (and lower frame-rates aren't as noticeable on a controller).
 

Crazy Zaul

New member
Oct 5, 2010
1,217
0
0
Playing at 30 FPS is fine if there's no comparison. I never got more than 30 in WoW on my old PC and it was fine. The problem is that once you can get up to 60 and it goes down to 30 (my max frame rate alternated between 60 and 30 with each patch - it was weird), it feels soooo slooooow. Moving across the map when flying feels slower, and running feels like running through glue. FPS is all relative.
 

Yopaz

Sarcastic overlord
Jun 3, 2009
6,092
0
0
chadachada123 said:
Yopaz said:
10-20 fps is enough? Yeah... I take it you have never actually played games at that framerate. Anything below 30 is horrible and makes games feel a lot slower and unresponsive. I don't care much as long as it's somewhere between 40 and 50 and it's stable.
I take it you were quoting me, since I noted that 10-20fps is legitimately what Minecraft runs at for me, even on tiny draw distance with fast graphics, smooth lighting off, max FPS setting and with lowered particles (that is, lowest settings aside from the second-lowest particle setting).

It's playable, and that's what I've gotten used to since I have no other option. I've put in at least a hundred hours playing like this, including on survival and multiplayer maps.

I just opened up Minecraft and took a screenshot from my new world with the stats screen open. It was chugging at between 7 and 13 fps, which seemed to be what it normally was when I was in a new or heavily-altered area.

Here it is, and I will willingly provide more examples with Mozilla closed (despite having only this tab open and comparatively little memory usage from Mozilla atm), if you so prefer:
[screenshot of the Minecraft stats screen]

My laptop just plain sucks, but I'm pretty used to it, *shrug.*
No, that was directed at the text in the OP, where he stated that there's no need for more than 10-20. On what you said, though, I would like to add that for games like Minecraft 20 fps works, since there's not that much fast-paced action. It's horrible in most games, however, and some games are pretty much impossible to enjoy with low fps.
 

chadachada123

New member
Jan 17, 2011
2,310
0
0
Yopaz said:
No, that was directed at the text in the OP, where he stated that there's no need for more than 10-20. On what you said, though, I would like to add that for games like Minecraft 20 fps works, since there's not that much fast-paced action. It's horrible in most games, however, and some games are pretty much impossible to enjoy with low fps.
Oh, my. I just went back and looked at OP's post and realized that my own post on the previous page mirrors him almost word-for-word in a couple of spots.

My apologies, you'll hear no disagreement from me about how terrible 20fps is for some genres.

Captcha: jump the gun. I've got to stop doing that.
 

Yopaz

Sarcastic overlord
Jun 3, 2009
6,092
0
0
chadachada123 said:
Yopaz said:
No, that was directed at the text in the OP, where he stated that there's no need for more than 10-20. On what you said, though, I would like to add that for games like Minecraft 20 fps works, since there's not that much fast-paced action. It's horrible in most games, however, and some games are pretty much impossible to enjoy with low fps.
Oh, my. I just went back and looked at OP's post and realized that my own post on the previous page mirrors him almost word-for-word in a couple of spots.

My apologies, you'll hear no disagreement from me about how terrible 20fps is for some genres.

Captcha: jump the gun. I've got to stop doing that.
No worries, I should have made my point more clearly than I did and mentioned that there are exceptions to the rule. I'd also like to point out that as long as the fps doesn't have rapid drops, even a low one may be acceptable.
 

Pyro Paul

New member
Dec 7, 2007
842
0
0
Lee Quitt said:
Pyro Paul said:
Dexter111 said:
You are completely wrong on all counts xD ...
Am I?

http://www.hardwarecanucks.com/charts/cpu.php?pid=69,70,71,76,77&tid=2

http://www.geeks3d.com/20110331/crysis-2-43-graphics-cards-compared/

Then why does everyone benchmark things like CPUs and GPUs using the FPS of a high-stress game?
The idea that the human brain cannot see more than 24-30 fps is actually a very bad, yet kinda common, mistake. It's actually well over 100, and there is still debate about when you start to get diminishing returns.
Actually, the human eye isn't a camera, so it doesn't see in FPS; it works through chemical reactions within the retina and brain. The human eye doesn't see 'frames', it sees 'change'.

Hell, technically the human eye can detect visual discrepancies and 'jerkiness' in film shot at 300 fps... but does that mean the human eye 'sees at 300 fps'?

No. It is because the frames are taken at such a high rate that each individual exposure prevents 'blur', and as such the human eye can detect the movement of the filmed objects and the lack of information between the frames.

With the implementation of artificial blur in games, this throws 'fps' even further out the window, because 30fps with blur actually looks smoother and more fluid than 60fps without.

FPS is only used as a benchmark to test the strength of a rig when running high-stress workloads.
 

veloper

New member
Jan 20, 2009
4,597
0
0
It saddens me to see all these made-up theories about the human eyes and brain still being thrown around, when everybody here has the means to test the real thing, up to at least 60Hz.

Any old low-budget TN screen can do at least 60Hz. You don't need a beefy PC to spin a simple 3D block at 60fps either.

The vast majority of you will be able to see the difference up to 60Hz, and even see the jumps between frames at 30Hz.

If you still keep a CRT around, most of you can even demonstrate the differences well beyond 60 frames per second. I could see the difference even between 90 and 120 fps, and so could 3 out of 3 observers with me.

So set up a proper test, and then you can all rewrite your silly theories to at least fit the empirical data.
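If you want to set that test up quickly, here's a minimal sketch of the kind of thing I mean (pygame is just my pick for the illustration; any library that can cap a framerate will do, and the filename is made up). It slides a block across the screen at whatever framerate you pass on the command line; run it once at 30 and once at 60 and judge for yourself:

[code]
# fps_test.py - slide a block across the screen at a chosen framerate.
# Usage: python fps_test.py 30    (then try: python fps_test.py 60)
import sys
import pygame

fps = int(sys.argv[1]) if len(sys.argv) > 1 else 60
SPEED = 600.0  # pixels per second, so the block covers the same distance
               # per second no matter what the framerate is

pygame.init()
screen = pygame.display.set_mode((800, 200))
clock = pygame.time.Clock()
x = 0.0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    dt = clock.tick(fps) / 1000.0   # cap the framerate; dt = seconds per frame
    x = (x + SPEED * dt) % 800      # wrap around at the right edge
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x), 80, 40, 40))
    pygame.display.flip()

pygame.quit()
[/code]

At 30 the block visibly jumps between positions, because it has to move twice the distance per frame; at 60 the same motion looks noticeably smoother.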
 

Tyler Trahan

New member
Sep 27, 2011
44
0
0
I didn't think anything of this until I got hold of a high-powered PC that was HD compatible. I started playing a game I've played before and all of a sudden I was going "Oh my god, what's going on? Everything is so... smooth..." As in, it was so smooth I had to get used to how my mouse moved. It may not be a huge technical difference, but your eye certainly notices.
 

GundamSentinel

The leading man, who else?
Aug 23, 2009
4,448
0
0
SmashLovesTitanQuest said:
Movies and TV are different from video games. Your comparison is invalid.

/thread
Two things don't have to be the same to make a valid comparison. That's what comparisons are for: to see what is the same and what is different. Your argument is invalid.

OT: Personally, I've played enough games on horrible PCs to be perfectly happy when playing a game at 15 FPS. But as I can perfectly see the difference between 30 and 60 FPS, I do understand how other people might feel differently.

Still, the visual difference between 10 and 20 FPS is far bigger than between 20 and 30, so unless you're a professional gamer I don't see the point in continued insistence on games with absurdly high framerates. What's the use of 120 FPS anyway?
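(That's just the frame-time arithmetic at work; a quick sketch, purely for illustration:)

[code]
# Each step up in fps buys less and less absolute frame-time improvement.
prev = None
for fps in (10, 20, 30, 60, 120):
    ft = 1000 / fps  # milliseconds per frame
    saved = f" (saves {prev - ft:.1f}ms per frame)" if prev is not None else ""
    print(f"{fps:>3} fps = {ft:5.1f}ms per frame{saved}")
    prev = ft

# 10 -> 20 fps saves 50ms per frame; 20 -> 30 saves only 16.7ms;
# 60 -> 120 saves a mere 8.3ms.
[/code]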
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
GundamSentinel said:
SmashLovesTitanQuest said:
Movies and TV are different from video games. Your comparison is invalid.

/thread
Two things don't have to be the same to make a valid comparison. That's what comparisons are for: to see what is the same and what is different. Your argument is invalid.
Why do you eat at all? Plants only need to photosynthesise, so why do you keep wasting the planet's resources?

That's what OP sounds like.
 

GundamSentinel

The leading man, who else?
Aug 23, 2009
4,448
0
0
DoPo said:
GundamSentinel said:
Why do you eat at all? Plants only need to photosynthesise, so why do you keep wasting the planet's resources?

That's what OP sounds like.
OP genuinely thinks that film and TV are bound to the same requirements as games. You and I and other people know that this isn't the case. The two are compared and the difference is explained to him. OP learns something, case closed.

My point is that basically saying 'OP doesn't know what he's talking about. /thread' doesn't really help answer OP's question.
 

The Heik

King of the Nael
Oct 12, 2008
1,568
0
0
Dexter111 said:
Rendering Pipeline Lag (PCs/consoles take time to process information in the frame buffer and send it to a display device) - This is also dependent on frame rate: at 60FPS, for instance, the minimum theoretical input lag is ~17ms; if it drops down to 30FPS the minimum is ~33ms; at 120FPS this delay is theoretically smaller still.

Input Lag - Mouse/Controller - this is usually negligible at somewhere around ~10ms, sometimes less, sometimes more. On Macs for instance there was an issue that caused 32ms of Mouse Lag: http://d43.me/blog/1205/the-cause-for-all-your-mac-os-x-mouse-annoyances/

Then there are engines that aren't optimized, or devs who don't care much about latency, adding on top of that.

There is an interesting article about the initial hardware lag for consoles here on Digital Foundry: http://www.eurogamer.net/articles/digitalfoundry-lag-factor-article - it can vary from 70-200ms alone (which comes rather close to your reaction time, and this is before we apply the next two factors).

Here's a final list of all the games I tested for this feature. Not all made it into the videos, so this handy table represents all of my findings. Probably the biggest surprise after GTA was the amount of lag built into LEGO Batman - 133ms on a 60FPS game. What is important to note is that these findings are very context-sensitive. Yes, COD4 appears to be more responsive than World at War, but in different selections of levels per game you could easily reverse that. In this respect, these results do have an element of randomness about them, though it is no secret that for most scenarios, COD4 does outperform its pseudo-sequel.

Game - Latency Measurement

Burnout Paradise - 67ms
BioShock (frame-locked) - 133ms
BioShock (unlocked) - as low as 67ms
Call of Duty 4: Modern Warfare - 67ms-84ms
Call of Duty: World at War - 67ms-100ms
Call of Juarez: Bound in Blood - 100ms
Forza Motorsport 2 - 67ms
Geometry Wars 2 - 67ms
Guitar Hero: Aerosmith - 67ms
Grand Theft Auto IV - 133ms-200ms
Halo 3 - 100ms-150ms
Left 4 Dead - 100ms-133ms
LEGO Batman - 133ms
Mirror's Edge - 133ms
Street Fighter IV - 67ms
Soul Calibur IV - 67ms-84ms
Unreal Tournament 3 - 100ms-133ms
X-Men Origins: Wolverine - 133ms

In-game latency, or the level of response in our controls, is one of the most crucial elements in game-making, not just in the here and now, but for the future too. It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon it.

The average videogame runs at 30FPS, and appears to have an average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that the most ultra-PC gaming set-up has a latency less than one third of that, this is good news for cloud gaming in that there's a good 80ms or so window for game video to be transmitted from client to server.

But in the meantime, while overall "pings" between console and gamer remain rather high, the bottom line seems to be that players are now used to it, to the point where developers - like Infinity Ward - centred on getting the very lowest possible latencies are using that to give their games an edge over the competition. Call of Duty's ultra-crisp response is one of the key reasons why it's a cut above its rivals, and it's a core part of a gameplay package that will once again top the charts this Christmas.
Display Lag - CRTs didn't really have any, but on LCDs things like noise reduction, image processing, scaling or correction can add a lot of lag, anywhere around 20-30ms or more depending on configuration; there are also the initial pixel response times that manufacturers usually report, to be added on top.

Online Gaming - network lag/connection delay, let's say 40-50ms in some of the best cases.

You can't simply look at this from an "academic" standpoint, saying that there is 0ms of hardware lag between what is being displayed and the user, or that there are no psychological and cognitive effects from something running smoother and not appearing to stall, because that's simply not the reality. Any part of the chain that decreases lag is preferable in the first place: higher FPS cuts latency in the rendering pipeline (to ~17ms at 60FPS from over 33ms at 30FPS), and 120FPS can cut it down further to around ~8ms, at which point it won't really be noticeable anymore under any circumstances.

And again... I've done this stuff. I have played games on a CRT monitor (with literally no lag) a lot in the past, and believe me, there's a damn clear, noticeable difference in moving your mouse around, aiming, or trying to make that jump at an extremely smooth 120FPS over even 60FPS.

Please stop spreading misinformation and read up on the issue or grab yourself a CRT, install Quake 3 or similar and try it yourself, it's always the best experience xD

As More Fun To Compute said, any significant decrease in latency in that chain will give a player a perceivable advantage over another (because he will be the first to shoot/click/attack/whatever). This isn't true for all kinds of games, but it is for those based on reaction times: as people have said, fast-paced first-person shooters and fighting games especially, or any game where you're supposed to press buttons quickly on a controller.
Dexter, I'm not saying that the possible latency issues are unimportant. They do change the outcome, but that's kind of what my point is about: between all the various possible sources of latency, the difference from 30 to 60 or from 60 to 120 FPS is like dropping a pebble into the ocean.

Let's run a few of the numbers here, assuming that we're talking about a console with an LCD TV playing an online match of CoD4:

70-200ms (console hardware latency)
20-30ms (screen latency)
40-50ms (online connection lag)
67-84ms (CoD4 latency)
225-250ms (average human reaction time)

If we add up the highs and lows of the parameters, we get between 422ms of latency (fast end) and 614ms (slow end). Now, if we add the 33ms of frame latency for 30 FPS to the fast end, we get a total of 455ms. The latency difference between 30 FPS and 60 FPS is 17ms (rounding from the actual 16.67ms), which, divided by the base amount of 455, gives:

17/455 ≈ 0.037

Not even a four percent difference between 30 and 60. That's a negligible amount of combat difference between two players, all other things being equal. The advantage gets even smaller from 60 to 120 FPS, where the difference is 8ms (again rounding from 8.33ms) divided by 439 (the fast end plus the 17ms of frame latency at 60 FPS):

8/439 ≈ 0.018

Not even a two percent difference. That's getting ridiculously tiny (and I'm not even sure 120 FPS has ever been done on a console, so it might be a moot point). And this is in the best possible situation given the parameters. In your average situation the difference may even drop into fractions of a percent, so with such a small difference, individual player skill and judgement mean the outcome is entirely based on the exact circumstances of the two opponents meeting.
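(If anyone wants to check the sums or plug in their own numbers, here's the same budget as a quick back-of-the-envelope Python sketch, using the low end of each range above; the script is just my illustration:)

[code]
# Latency budget for a console CoD4 match, low (fast) end of each range.
budget_ms = {
    "console hardware": 70,   # 70-200ms
    "screen": 20,             # 20-30ms
    "online connection": 40,  # 40-50ms
    "CoD4 game latency": 67,  # 67-84ms
    "human reaction": 225,    # 225-250ms
}
base = sum(budget_ms.values())  # 422ms before any frame latency

for fps in (30, 60):
    frame = 1000 / fps                 # one frame of render latency
    total = base + frame               # everything the player actually waits on
    saving = frame - 1000 / (fps * 2)  # what doubling the framerate would save
    print(f"{fps:>2} FPS: total {total:.0f}ms; doubling to {fps * 2} FPS "
          f"saves {saving:.1f}ms ({saving / total:.1%} of the total)")

# 30 FPS: total 455ms; doubling to 60 FPS saves 16.7ms (3.7% of the total)
# 60 FPS: total 439ms; doubling to 120 FPS saves 8.3ms (1.9% of the total)
[/code]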

For example's sake, let's say I'm playing CoD4 at 30 FPS and I face someone who's playing it at 60 FPS. He sees me 17ms before I see him, giving him that 3.7% edge. However, when I see him I decide to "drop and pop", going full prone whilst firing my weapon (a common tactic in the CoD games), which buys me a far larger margin of time than his head start, as he now has to adjust his aim. I'm therefore given more time to shoot him, and I will most likely beat him through simple tactics.

So you see, that difference doesn't really mean much, because circumstance (the aforementioned other sources of latency and the play situation) and player judgment consistently trump such a small advantage. While individuals with a really good tech setup and those 100ms god-child reflexes might be able to get some mileage out of the 17ms at 60 FPS, to 99.9% of the world's population it's just not going to matter in a practical sense.

Now, that's not to say that higher frame rates don't have their place in gaming. For visually heavy games they're pretty much a necessity to get the full richness, and less latency does add to the game's immersion. But from a purely mechanical standpoint it's not really going to add much unless you're a gifted hardcore gamer who wants to meta-game as much as possible (such as an MLGr).
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
GundamSentinel said:
DoPo said:
GundamSentinel said:
Why do you eat at all? Plants only need to photosynthesise, so why do you keep wasting the planet's resources?

That's what OP sounds like.
OP genuinely thinks that film and TV are bound to the same requirements as games. You and I and other people know that this isn't the case. The two are compared and the difference is explained to him. OP learns something, case closed.

My point is that basically saying 'OP doesn't know what he's talking about. /thread' doesn't really help answer OP's question.
While this may be true, I still think Smash is correct. OP showed arrogance, so it's fair to shut him down. And OP showed a lack of knowledge of the matter, so shutting him down over it might just prompt him to do a tiny bit of research next time[footnote]However, seeing as this is the Internet, I don't actually expect it to happen, but it's a nice thing to look forward to[/footnote]. Finally, since the thread feels like flamebait (even unintentional flamebait), Smash's response is appropriate enough.
 

GundamSentinel

The leading man, who else?
Aug 23, 2009
4,448
0
0
DoPo said:
GundamSentinel said:
While this may be true, I still think Smash is correct. OP showed arrogance, so it's fair to shut him down. And OP showed a lack of knowledge of the matter, so shutting him down over it might just prompt him to do a tiny bit of research next time. Finally, since the thread feels like flamebait (even unintentional flamebait), Smash's response is appropriate enough.
Personally, I don't read much arrogance in OP's post. Ignorance, yes, but that's why people ask questions. The Escapist forums are as good a place as any, as far as I'm concerned.

You think differently, fine. I suppose we must agree to disagree on this then. :D