What is this obsession with framerates over 30FPS?


Vegosiux

New member
May 18, 2011
4,381
0
0
Thomas Hardy said:
Also, you're neglecting the point that I'm talking about the MOST RECENT information you've seen, not your reflexes or your connection speed. You will still be working from an older photograph than at least some of the other players. It's a minuscule difference, but nothing other than a faster frame rate reduces that minuscule increase in time. Talk "margin of error" all you want, it's still there. I agree it's a minuscule amount of time and not worth worrying about unless you're a professional-level player, but it's still there.
Well, I hope those professional-level players don't blink then, because a blink lasts about 0.1 seconds. That's what I've been trying to say: the differences between frame refreshes are not just minuscule, they're negligible, because other factors cause delays long enough that a single frame doesn't make much of a difference at all. In other words, if you miss an ultimate and blame your frame rate, you're just being dishonest with yourself, unless it's really abysmal. Nobody has reaction times in single-digit milliseconds, where the difference between 30 and 60 FPS would actually matter.

It's the same principle as not measuring a football field to micrometer accuracy. The digits beyond a certain point become insignificant to any given observation.
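
To put rough numbers on that, here's a quick Python sketch. The blink and reaction-time figures are commonly cited ballpark values I'm assuming, not measurements from anywhere in this thread:

# Frame interval vs. human-scale delays (ballpark figures, not measurements).
REACTION_MS = 200   # commonly cited visual reaction time (assumption)
BLINK_MS = 100      # a blink lasts roughly 0.1 s, as mentioned above

for fps in (30, 60, 120):
    frame_ms = 1000 / fps
    print(f"{fps:3d} fps: {frame_ms:5.1f} ms/frame, "
          f"blink covers {BLINK_MS / frame_ms:4.1f} frames, "
          f"reaction covers {REACTION_MS / frame_ms:4.1f}")

Even at 30 FPS, a typical reaction spans about six whole frames, which is the sense in which the refresh interval disappears into the noise.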
 

crystalsnow

New member
Aug 25, 2009
567
0
0
Looking at and PLAYING at different framerates are two completely different things, and you completely fail to grasp that concept.

Television and movies may indeed lock at 24 fps, but that 24 fps also never changes. The motion is a constant, fluid 24 fps which means you won't ever be distracted by frame rate fluctuations.

If you watch gameplay footage however, you will notice when a game enters an area that 'lags' and has lower framerate. When PLAYING a game through said areas, this effect is amplified greatly, as you are controlling what is happening on screen. If the framerate is slowed, the input is also slowed, and the difference is incredibly noticeable.

Which brings us to the difference between 30 fps and 60 fps. When a game that averages 30 fps drops in framerate, the difference can be pretty staggering, as even 20 fps feels incredibly slow compared to 30 fps. Whereas if you experience a drop from an average of 60 fps to, say, 50 fps, the difference, while still noticeable, is much less so.

The reason for this has to do with the ratios. Even though both cases involve the framerate dropping by 10 fps (30->20 / 60->50), the 60 fps case dropped by a lower percentage, and is therefore less noticeable.
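
The frame-time arithmetic makes that asymmetry concrete. A rough Python sketch, using only the numbers from the example above:

# Equal fps drops are not equal frame-time increases.
def frame_time_ms(fps):
    return 1000 / fps

for before, after in ((30, 20), (60, 50)):
    delta = frame_time_ms(after) - frame_time_ms(before)
    pct = (before - after) / before * 100
    print(f"{before} -> {after} fps: each frame takes {delta:+.1f} ms "
          f"longer ({pct:.0f}% of the framerate lost)")

Dropping 30 -> 20 adds about 16.7 ms to every frame (a 33% loss), while 60 -> 50 adds only about 3.3 ms (a 17% loss).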

Even if you were getting 180 fps (which is pointless because of refresh rates), you would still notice drops in fps, and those drops follow the same logic stated above.

And ohgodwhydidibothertoexplainthispeoplearesostupid
 

Zack Alklazaris

New member
Oct 6, 2011
1,938
0
0
Windknight said:
Ok, essentially, as I understand it, any frame-rate of about 10-20 or more is enough to provide an illusion of a moving picture. Indeed, movies and television have a framerate of 24 FPS, and no-one seems to find any problem with them being choppy or slow.

So why so much freakout at frame-rates being capped at 30 FPS, or this obsession with getting it up to 60? If you've surpassed the point needed to create the illusion of a fluid, moving picture, do you really need to push it even further? Or is this some 'OMG GOTTA SHOW OFF MY HARDWARE POWER!' thing that's just posturing and showing off?
I guess some people find it a bragging rights performance thing? I remember back when people would piss themselves if they got Crysis to run at 50fps.

Technically anything above 30fps is more than enough, because your eyes process at around 24fps. As in, you can still see semi-choppiness at 24 frames because your eyes are going at the same rate. At 30 you really can't tell; the image will remain fluid and lifelike. Well, maybe if you blink really fast you'd see it.

Now keep in mind this is if the CAP is 30fps, as you mentioned. A game with no cap that runs at 30fps on average can drop significantly. If a bunch of bombs drop and Ninja Nazis start zipping around the screen, the frame rate could fall to 10fps, which you definitely would notice.
 

sanquin

New member
Jun 8, 2011
1,837
0
0
Lee Quitt said:
Apparently the hardcore FPSers use 120Hz monitors so that 120 fps can actually be rendered at the same time, or something to that effect. According to some of them it makes a world of difference.
If that's true, the difference is still a lot less noticeable than between 30 and 60, I figure, as I don't believe the human eye can see all the way up to 120 fps.
 

ReadyAmyFire

New member
May 4, 2012
289
0
0
As much as I didn't like to admit it at the time, I do see a difference between 30 and 60 frames. Not so much in an RTS or a console FPS, but with something like a flight sim or a PC FPS I find the extra fluidity helps greatly.
 

Joccaren

Elite Member
Mar 29, 2011
2,601
3
43
Beyond the fact that TVs and monitors and such display things differently, and thus 30 FPS on a TV isn't as bad as it is on a monitor, there are a few reasons:

1. There is a visible difference, and saying it's not important is like saying the difference between HD and SD isn't important. If you've played at 30 FPS all your life, or seen only SD screens, and then look at 60 FPS or HD screens, you'll go "Yeah, it's a little better, but not that much. You're just a graphics whore." However, play at 60 FPS or in HD for a year and swap back to 30 FPS or SD and you'll notice a massive jump. It looks worse because you're used to better. Of course, there are those who say they can't tell the difference between SD and HD, or 30 and 60 FPS. I'm not going to dispute whether or not those claims are true, but simply because you can't doesn't mean nobody can.

2. Input lag. Yeah, it's small. It's also noticeable, just like the FPS, if you're used to less lag or a higher FPS. At 30 FPS I can subconsciously feel a delay between me inputting the action and what happens on screen, and that throws me out of sync. I'm used to playing at 60 FPS and the delays carried therein. At 30 FPS, it's not what my reflexes and subconscious are used to, and that stuffs things up. The same thing happens to me if I play at 90-120 FPS in shooters and then drop down to 60, though to a lesser degree. It's what you're used to that will likely determine your stance on 30 vs 60 FPS.
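
For what it's worth, here's a rough sketch of the one-frame floor on input delay, assuming input is sampled and shown once per frame (this ignores engine pipelining, vsync, and monitor latency, which all add more):

# Worst-case one-frame input-to-display delay at various framerates.
for fps in (30, 60, 90, 120):
    print(f"{fps:3d} fps: up to {1000 / fps:5.1f} ms of one-frame delay")

Going from 60 down to 30 adds up to ~16.7 ms per frame, while 120 down to 60 adds only ~8.3 ms, which fits the "lesser degree" described above.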
 

nexus

New member
May 30, 2012
440
0
0
Lee Quitt said:
Apparently the hardcore FPSers use 120Hz monitors so that 120 fps can actually be rendered at the same time, or something to that effect. According to some of them it makes a world of difference.
I really do not see the point of going over 60Hz for gaming. It's true, anything above 60 and you won't be able to see any difference. I mean, I guess it's possible some people can markedly see a difference, but I've never heard of it.

The only reason -most- people get a 120Hz monitor is so they can do 3D gaming. You need 120Hz so the scene can be split and rendered twice, at 60Hz per eye. You can't split 60Hz into 30Hz / 30Hz, as that would be too slow and wouldn't render fast enough. Food for thought in the 30 vs 60 argument.
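
The per-eye arithmetic is simple enough to sketch, assuming frame-sequential stereo (the panel alternating left-eye and right-eye images):

# Active-shutter 3D halves the effective refresh rate per eye.
for panel_hz in (120, 60):
    print(f"{panel_hz} Hz panel -> {panel_hz // 2} Hz per eye")

A 120Hz panel gives each eye a smooth 60Hz; a 60Hz panel would leave each eye at a flickery 30Hz, which is the point above.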
 

jklinders

New member
Sep 21, 2010
945
0
0
If you are only sitting at 30 fps and the game hits a spike in processing, it will chug and lag as the framerate drops. It really is nice to have a buffer to give you some wiggle room during the heavier graphics and processing spikes. Other than that, I notice no difference.
 

Thomas Hardy

New member
Aug 24, 2010
31
0
0
Vegosiux said:
Well, I hope those professional-level players don't blink then... if you miss an ultimate and blame your frame rate, you're just being dishonest with yourself, unless it's really abysmal. Nobody has reaction times in single-digit milliseconds where the difference between 30 and 60 frame rates actually matters.

It's the same principle as not measuring a football field to micrometer accuracy. The digits beyond a certain point become insignificant to any given observation.
Really? Time and distance are measured in different fundamental units. It's not the same thing at all. That's like talking about the inches you can bench press.

Other than that, I agree with you 100%

(I even think I remember hearing that a company did a study with high-speed cameras showing that it takes around 8-12 milliseconds to click a mouse, from the time the finger starts to move until the actual "click".)

That still doesn't change the fact that THE DELAY EXISTS, vis-à-vis "I canna change the laws of physics, Cap'n".

If you want to talk pure time advantage, then someone living close enough, on a fast enough connection to get around 10 ping, will actually experience more lag from his video card than from his connection. Again, a difference in time exists; that's all I'm saying.


I have to go to bed so I'll leave you with this:

Suppose you and I are playing Warhammer 40k or some other complex table-top war game on a board in another country. Also, suppose turns are not taken sequentially. Instead, we send written instructions for our turn, and the person owning the actual board sends us a photograph and summary report back once he has received and executed instructions from either player. (Before the age of computers this stuff actually happened.)

Naturally the person living closest to the board would have the biggest advantage. Not far behind was the person who could devise and send new turn instructions the most quickly. (This has since reversed).

Suppose the person carrying out the written instructions opens either packet of instructions an hour after sending back the first photo, and the first player has a section of troops flattened by the second player's instructions through luck or guile. The photograph the first player receives in the mail shows none of this. More importantly, until a fresh photograph arrives (perhaps the next day, perhaps later) the first player will not be basing any decisions on accurate information. If that player happens to send "bad" instructions that actually put his troops in a WORSE position, well, too bad.

Just for Fun: (If 1 second = 1 day then 0.015 seconds = 21.6 minutes)
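
The scaling checks out; a Python one-liner to verify it:

# If 1 second is stretched to 1 day, how long does 0.015 s become?
SECONDS_PER_DAY = 24 * 60 * 60          # 86,400 seconds
print(0.015 * SECONDS_PER_DAY / 60)     # 21.6 minutes, as stated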


Again:

tiny, paltry, minuscule --> yes

insignificant --> no

Regardless of the time scale and regardless of its size, EVERY game played on a computer is turn-based once it becomes a list of instructions in a processor. There's no such thing as "instantaneous" any more, only really, really, REALLY stinkin' fast. And frames per second STILL cause a TINY delay that affects decision-making. Because it can affect decision-making, it is INCREDIBLY important, but only in equally incredibly rare circumstances.
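
Here's a rough sketch of what that per-frame "turn" costs in time, assuming new information only appears at each frame boundary (so an event lands somewhere inside the current frame interval at random):

# Average and worst-case delay from frame-quantized information.
for fps in (30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: average {frame_ms / 2:.1f} ms late, "
          f"worst case {frame_ms:.1f} ms late")

At 30 fps you see events ~16.7 ms late on average (up to ~33.3 ms); at 60 fps it's ~8.3 ms on average (up to ~16.7 ms). Tiny, as argued, but never zero.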

The fact that you want to believe it doesn't matter doesn't change anything. From time to time it WILL be the FINAL, nigh-unmeasurable amount of time TOO LATE: the turn you forgot to mail until you got back home, the less than 0.1 sec of cooldown you had left when you died, the droids the old man assured you you were not looking for, whatever. An absolutely TINY delay. That makes all the difference.


TL;DR:

Yeah, it almost never matters. Too bad - almost never ain't never.
 

dontlooknow

New member
Mar 6, 2008
124
0
0
shrekfan246 said:
Actually, the lower framerate was really noticeable for me in Metal Gear Solid 4 as well, because there's one section relatively early in the game where, if you lie down or crouch in a certain area inside a building, the FPS will shoot up to 60 and the entire game will suddenly move a lot more smoothly, but it goes back down once you start actually moving around again.
Yeah, I found the MGS4 framerate quite irritating; it didn't really affect the gameplay, but it did draw attention to itself. Similar to your example, I found that using the Solid Eye's IR mode made the framerate shoot through the roof; then, when it's switched off, all of a sudden the game seems to run significantly slower. In fact, I think I remember MGS3 having a similar, if less pronounced, problem. The difference between the two is that MGS4 seems to use v-sync to avoid screen tearing, at the cost of a more pronounced framerate range.

I play mostly PS3 and a bit of PC, but given the choice, I'd always go with a locked framerate at the cost of slightly poorer graphical fidelity; if you play one game at 60fps, and then the same game half an hour later at 30fps, the difference is less obvious and less irritating than putting up with a framerate that gets in a tizz whenever an explosion goes off.

An honorable mention should also go to Bioshock for the PS3 (I haven't played the 360 in about 5 years), which lets you unlock the framerate, resulting in the console chugging itself inside out if you look at anything but the floor directly below you. Is this option included for any reason apart from drawing the grubby console player's attention to the limitations of their system?
 

Vegosiux

New member
May 18, 2011
4,381
0
0
Thomas Hardy said:
Really? Time and distance are measured in different fundamental units. It's not the same thing at all. That's like talking about the inches you can bench press.
If we're getting nit-picky, time and distance are quantities. The second and the meter are units. And my point went completely over your head.

And your "analogy" is making a physicist cry somewhere, because there's no mention of how accurate the clocks you're measuring the travel time with are.
 

Thomas Hardy

New member
Aug 24, 2010
31
0
0
If you want to be nit-picky, be my guest. It's 4 am here and I'm practically smelling colors because of my rampant insomnia. Your "point" isn't one; you're ignoring mine and pretending superiority. In no way are human reflexes SUBTRACTIVE. Just because I blink doesn't mean I'm going to open my eyes at the exact instant I get a fresh screen. From time to time THAT'S GONNA MATTER, especially in game types like FPSes and MOBAs. When players are executing moves in near-real time, any delay, no matter HOW small, could be the determining factor.

Now go misconstrue me if you like. I'm gonna try to get a couple hours' sleep before work...
 

Denamic

New member
Aug 19, 2009
3,804
0
0
Showing off?
It just feels sluggish.
25-30 FPS is fine for movies with their motion blur, but when I'm playing a game, in control of the movement, anything below 40 FPS is uncomfortable.
 

Yopaz

Sarcastic overlord
Jun 3, 2009
6,092
0
0
10-20 fps is enough? Yeah... I take it you have never actually played games at that framerate. Anything below 30 is horrible and makes games feel a lot slower and less responsive. I don't care much as long as it's somewhere between 40 and 50, and stable.
 

Alexnader

$20 For Steve
May 18, 2009
526
0
0
Da Orky Man said:
DazZ. said:
Windknight said:
Ok, essentially, as I understand it, any frame-rate of about 10-20 or more is enough to provide an illusion of a moving picture.
Have you ever played any game at 20 frames? Preferably some form of FPS, as I feel those suffer the most.

It's immensely unplayable, and that's not just me; that's non-gaming friends trying to run things on their laptops.
I currently play Republic Commando on my netbook at about 15 fps with no problems. Sure, it looks somewhat choppy, but it's perfectly playable.
A friend of mine was terrible at Battlefield 3, or so we all thought. We got him to make BF3 display his FPS; he was running at 15-20. With the use of chest-high cover I could sprint up to him, get behind him, and do a one-hit-kill takedown with the knife before he could kill me with an assault rifle.

He's not the best player out there, but nobody is that bad.
 

chadachada123

New member
Jan 17, 2011
2,310
0
0
Yopaz said:
10-20 fps is enough? Yeah... I take it you have never actually played games at that framerate. Anything below 30 is horrible and makes games feel a lot slower and less responsive. I don't care much as long as it's somewhere between 40 and 50, and stable.
I take it you were quoting me, since I noted that 10-20fps is legitimately what Minecraft runs at for me, even on tiny draw distance with fast graphics, smooth lighting off, the max FPS setting, and lowered particles (that is, the lowest settings aside from the second-lowest particle setting).

It's playable, and that's what I've gotten used to since I have no other option. I've put in at least a hundred hours playing like this, including on survival and multiplayer maps.

I just opened up Minecraft and took a screenshot of my new world with the stats screen open. It was chugging along at between 7 and 13 fps, which seemed to be about what it normally is when I'm in a new or heavily-altered area.

Here it is, and I will willingly provide more examples with Mozilla closed (despite having only this tab open and comparatively little memory usage from Mozilla atm), if you so prefer:


My laptop just plain sucks, but I'm pretty used to it. *shrug*