What is this obsession with framerates over 30FPS?

Daemonate

New member
Jun 7, 2010
118
0
0
Quote: "To have a *perfect* illusion of everything that can flash, blink and move you shouldn't go below 500 fps....maybe you need as much as 4000fps..."(from link below).

The idea that you can't see more than xx fps, so why get more, is an old myth, and a very confused one. It matters most when you need very quick and accurate visual responses - say competitive online FPS play, or flight simulators. I know when I was in a national team for a couple of shooters, the difference between 85 fps and 120 fps could be the difference between defeat and victory. The improvement in accuracy was huge! So we'd play at 800x600 with all settings on Lowest, no matter how good our card was. Photographers can tell you that they can see strobe flashes of 1/1000th of a second.

The answer is a very long technical dissertation involving many aspects of video and biology.

Short answer: Cinema and computer screens are not the same thing. Even in cinema there are theoretical gains in viewability up to 48 FPS and maybe beyond, but it's true that most movies look totally fine at 24 fps or even lower. A traditional film projector, though, actually flashes each frame three times, with interleaved blackness to hide the film advance, so the flicker rate is closer to 72 Hz even though there are still only 24 unique images per second.
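As a back-of-envelope check (the 3-blade shutter is the common case; 2-blade shutters giving 48 Hz also exist):

```python
# 24 fps film through a triple-bladed shutter: each frame is flashed
# three times, with blackness between flashes hiding the film advance.
motion_rate = 24                 # unique images per second
flashes_per_frame = 3            # 3-blade shutter (a 2-blade gives 48 Hz)
flicker_rate = motion_rate * flashes_per_frame
print(flicker_rate)              # 72 flashes per second, still 24 images
```

So the flicker rate and the motion-update rate are two different numbers, which is the whole point.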

But for computers, well, if you ever had an old CRT, you know the difference between a 60 Hz and 85 Hz refresh rate was astonishing, ergonomically.

From Dan Rutter's Dan's Data site www.dansdata.com:
"If a CRT monitor screen's painted less often than about 75 times per second (refresh rate is measured in Hertz, or Hz), it'll seem to flicker, because there's not enough persistence in the phosphor to keep the screen illumination even from your point of view. You need a 75Hz or better refresh rate to eliminate the flicker.

TVs have a low frame rate - 25 frames per second for PAL, 30 frames per second for NTSC - but they get away with it because they use interlaced mode, scanning all of the odd numbered lines and then all of the even numbered ones, so a 25 frame per second refresh rate becomes a 50 "field" per second screen-painting rate.

Sure, only half of the screen's painted each time, but it's not the top half and then the bottom half - it's a Venetian-blind interleaved pattern, that means the whole thing looks pretty evenly illuminated. This, combined with the higher persistence phosphor, gives a decently flicker-free display.

And, of course, you usually watch TV from far enough away that the screen takes up less of your field of view than does your closely-viewed computer monitor. Smaller images seem to flicker less.

24 frame per second movies, on the other hand, get away with their low frame rate without looking painfully flickery because the whole frame's illuminated in one go each time. And LCD panels work the same way."
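The interlacing scheme Dan describes can be sketched in a few lines (10 scanlines here stand in for PAL's hundreds):

```python
frame_rate = 25                     # PAL: 25 full frames per second
lines = list(range(1, 11))          # 10 scanlines standing in for a full frame
odd_field = lines[0::2]             # painted first: lines 1, 3, 5, 7, 9
even_field = lines[1::2]            # painted next:  lines 2, 4, 6, 8, 10
field_rate = frame_rate * 2         # 50 screen paints per second
print(odd_field, even_field, field_rate)
```

The Venetian-blind interleave is exactly those two slices: no region of the screen goes long without being refreshed, even though full frames only arrive 25 times a second.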


LCDs with video games are more complicated, because the data stream from the video card is organised into lines of pixel states, not whole picture frames, so you start to have to talk about things like flicker rates and de-interlacing algorithms. But if you have ever played a game at 75 FPS and then tried it on your friend's PC at 15 FPS, again, you'd know the difference.

For a more in-depth (but still introductory) discussion of the matter, this article, from which I took the opening quote, is good:
http://www.100fps.com/how_many_frames_can_humans_see.htm
Basically, the more frames you can get there in a second, the better, but YMMV.
 

veloper

New member
Jan 20, 2009
4,597
0
0
Absolutionis said:
Physicist/Biomedical Engineer here

Your eyes can see at ~24-30fps. In movies, this doesn't matter too much because any frame skips are generally ignored. For anything you have control over, the response lag between your action and the screen needs to be optimized; that's why they shoot for 30fps at least.

There's also something called "Nyquist Frequency" which essentially states that optimally, if you're displaying/detecting information, you should do so at TWICE the frequency of normal operation in order to minimize aliasing (aliasing in frequency, not the graphical aliasing you sometimes see).

In simpler terms, 30fps is fine. Anything less is noticeable by much of the population.
However, 40fps becomes odd in that the fps doesn't divide as evenly into the human 30fps detection rate.
Thus, we shoot for 60fps.
Anything higher than 60fps is a waste because it's more than double the human eye's rate.
Grab a moderately old CRT monitor from the attic, or borrow someone's fancy 120Hz gaming monitor, then run a demo that lets you toggle the framerate with a keypress.

I tried this for myself. You can actually see the difference between 60Hz, 90Hz and 120Hz on a rendered rotating object.
If the program is interactive, like a shooter, then it's even more noticeable.
Theory sucks when the practice is different.
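The rotating-object demo can be reasoned about numerically. A rough sketch (the 2 revolutions per second figure is just an illustrative choice):

```python
def apparent_step_deg(rotation_hz, fps):
    """Angle a rotating object appears to jump between successive frames.
    Big jumps read as stutter; near 180 degrees the direction of spin
    becomes ambiguous (the wagon-wheel effect)."""
    step = (360.0 * rotation_hz / fps) % 360.0
    return min(step, 360.0 - step)

for fps in (60, 90, 120):
    print(fps, apparent_step_deg(2.0, fps))  # 12.0, 8.0, 6.0 degrees
```

Halving the per-frame jump from 12 to 6 degrees is well within what the eye can pick up on a sharp-edged spinning shape, which matches what the demo shows.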
 

Treblaine

New member
Jul 25, 2008
8,682
0
0
Windknight said:
Ok, essentially, as I understand it, any frame-rate of about 10-20 or more is enough to provide an illusion of a moving picture. Indeed, movies and television have a framerate of 24 FPS, and no-one seems to find any problem with them being choppy or slow.
Because with GAMES you aren't merely trying to create the illusion of a small part of a picture moving, but to give quick feedback to controls. You have taken an uninformed and simplistic view of both video games and cinematography.

In fact even 30 frames per second is inadequate for the illusion of motion with quick PANNING shots, that is, where the whole frame moves, you get "the judder". Cinephiles thought this "judder" was an artefact of films being converted from 24 frames to 60Hz television, but it is in fact inherent to the film. If you pan a camera swiftly at 24 frames per second the illusion of motion IS LOST! It only kinda works in two cases:

1) where the attention is on a character or object in the foreground moving with the camera, so its relative position in the frame is not changing much, they seem to move fluidly, but around them the screen is clearly jerking/juddering. Any game where you control the camera will be full of the equivalent of "fast-panning-shots", and blurring is no solution.

2) blur. That is the ONLY SOLUTION to low framerate and this is UNACCEPTABLE in games. Ask any cinematographer who films in 24 frames per second: they HAVE to know how to reduce the aperture for filming fast things, so that each frame showing something moving fast blurs it, so that, for example, it doesn't look like a slide show of Bruce Lee's leg suddenly going from on the floor to fully raised; both frames are blurred together.

Bottom line: 24-frames-per-second ONLY WORKS (vaguely) for film because of BLURRING of fast moving things and doesn't work at all for fast panning shots. 24fps = illusion of motion is a MYTH!! It's not that simple.
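The panning-judder point is easy to quantify. A back-of-envelope sketch (screen width and pan speed are made-up illustrative numbers):

```python
def pan_jump_px(screen_px, pan_seconds, fps):
    """How far the whole image shifts between consecutive frames when a
    pan sweeps the full screen width in `pan_seconds`. Jumps of tens of
    pixels with no motion blur are what read as 'judder'."""
    return screen_px / (pan_seconds * fps)

# A 2-second full-width pan across a 1920-px-wide image:
print(pan_jump_px(1920, 2.0, 24))   # 40 px per frame at 24 fps
print(pan_jump_px(1920, 2.0, 60))   # 16 px per frame at 60 fps
```

A 40-pixel jump every frame with hard edges is visibly a slide show; that's why film either blurs it or avoids fast pans entirely.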

PS: you have basically said all gamers are shallow fools for wanting 60fps and act like you know everything about something you clearly haven't done the bare basic research on. Bravo.
 

Samurai Silhouette

New member
Nov 16, 2009
491
0
0
I put $5k into my computer and can run all games at 60fps and some at 120. Being a former console player, I find it gives no real advantage whatsoever.
 

Thaliur

New member
Jan 3, 2008
617
0
0
Windknight said:
Ok, essentially, as I understand it, any frame-rate of about 10-20 or more is enough to provide an illusion of a moving picture.
Technically, and in recorded or prerendered movies, yes.
The problem with real-time rendering is that it does not create the motion artifacts which we are used to. Normally, when you are looking at the world with your eyes, movement is blurry, which your brain compensates for. In a real-time-rendered scene, you get a series of pictures with no blurring connecting them (and the motion blur effect employed in many modern games does not get it right either). Because of this, the only way to get a "natural" picture is an extremely high framerate.

Framerates higher than the screen refresh rate are just ridiculous though. If your system generates 120 frames per second, and your monitor refreshes 60 times per second (a VGA-connected TFT screen, for example), you are just wasting half of your system's work cycles.
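The wasted-frames point can be stated as a one-liner (this ignores tearing, which partially shows the discarded frames, and adaptive-sync displays, which didn't exist on the hardware discussed here):

```python
def displayed_fps(render_fps, refresh_hz):
    """Without adaptive sync, the screen shows at most one new frame per
    refresh; anything rendered beyond that never reaches the eye as a
    complete frame."""
    return min(render_fps, refresh_hz)

print(displayed_fps(120, 60))   # 60: half the rendered frames are wasted
print(displayed_fps(45, 60))    # 45: below refresh, nothing is discarded
```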
 

Daemonate

New member
Jun 7, 2010
118
0
0
Absolutionis said:
Physicist/Biomedical Engineer here

Your eyes can see at ~24-30fps. In movies, this doesn't matter too much because any frame skips are generally ignored. For anything you have control over, the response lag between your action and the screen needs to be optimized; that's why they shoot for 30fps at least.

There's also something called "Nyquist Frequency" which essentially states that optimally, if you're displaying/detecting information, you should do so at TWICE the frequency of normal operation in order to minimize aliasing (aliasing in frequency, not the graphical aliasing you sometimes see).

In simpler terms, 30fps is fine. Anything less is noticeable by much of the population.
However, 40fps becomes odd in that the fps doesn't divide as evenly into the human 30fps detection rate.
Thus, we shoot for 60fps.
Anything higher than 60fps is a waste because it's more than double the human eye's rate.
Sorry, but this is simply not accurate. Please see my post above.
 
Glademaster

Jun 11, 2008
5,331
0
0
The Heik said:
Daystar Clarion said:
The Heik said:
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is .033 seconds between each frame. At 60 FPS there are .0167 seconds between frames. That's already a pretty miniscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are .100 seconds, 6 times slower than the frametime at 60 FPS. If the difference between 30 and 60 FPS is a workable amount for you, then congratulations you're officially superhuman, but no regular member of the homo sapiens species could actually utilize such a meager amount of time with any significant measurable difference in player capability.
Which would be a valid criticism if combos were down to individual frames, which they're not.
Actually they are, as ultimately all interactions are based upon the frames (which are each continuations of the game). Can't pull off a combo if there are no frames to progress the action.

My point though is that having 60 FPS over 30FPS does not add enough additional data for even the fastest human brains to actually use. Knowing that a hadouken fireball is coming .0167 seconds faster is not going to measurably help in combat, because it still takes your brain at least .1 seconds (though on average it's more like .2) to be able to recognize it and react to it. Ergo, a higher frame rate does not mechanically change the game after 30-40 FPS. If it does for you, then it's probably just a mental placebo, not an actual tactical advantage.
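The frametimes quoted above are easy to recompute (note that 1/30 s is .033, not .03):

```python
def frametime_ms(fps):
    """Milliseconds between successive frames at a given frame rate."""
    return 1000.0 / fps

REACTION_MS = 100.0   # fastest human reaction time cited in this thread

for fps in (30, 60):
    print(f"{fps} fps: {frametime_ms(fps):.1f} ms per frame, "
          f"{REACTION_MS / frametime_ms(fps):.1f} frames per reaction")
```

At 60 fps a whole reaction spans about 6 frames; at 30 fps, about 3. Whether those extra frames matter is exactly what the rest of the thread argues about.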
There are places in maps you can only get to in CoD 2 if your fps is 333 or above so yes in games higher FPS does make a difference.
 

Smooth Operator

New member
Oct 5, 2010
8,162
0
0
Because some people have higher standards than others; if you are the sort of person who does not mind slow response in games then this clearly will not bother you.

But it bothers others because:
- our eyes don't sample in discrete frames at all; effectively we see an infinite framerate
- time between frames adds lag on input and output
- added lag increases overall response time and the game starts to feel disconnected
 

More Fun To Compute

New member
Nov 18, 2008
4,061
0
0
The Heik said:
More Fun To Compute said:
That's not how it works in terms of the whole system. A very finely tuned 60fps game has around 66ms of input latency while solid 30fps has twice that although it often goes up to over 200ms if the engine is not tuned and the hdtv is laggy or whatever.

And also in fighting games losing every other frame of animation is losing a lot of data about what is happening. Is this move x or move y, how obvious is the difference and so on. With 60fps developers can convey the same amount of visual information to the player in a shorter amount of time making action games feel a lot faster.
On your first point, you're arguing on an individual basis, which is hardly an objective point of reference. Of course a well tuned machine is going to function better than one that isn't. However, if both a 30 FPS system and a 60 FPS system are well tuned, the difference is negligible at best.

On your second point, like I mentioned before, .0167 seconds is not enough for the human brain to work with. By the time the information has been processed and reacted upon, 6 or more frames have already passed by, and that's at the very best speed humanity can offer (a trait that exists in less than .1% of the population). To say that a single frame lends so much information to a game is arguing against a fundamental physical limit of the human body. There is no one on Earth, past or present, who could properly use such a miniscule difference to any significant result.

And besides, if a game (ostensibly a form of entertainment) requires that players be the pinnacle of human mental capability to to be able to play properly, I'd chalk that up more to unfeasible game design rather than a lack of information.

So let me say again, no one needs faster than 40 frames per second for their games. 60 FPS does add to the visual smoothness, but in terms of actual actions and reaction, no one is going to be able to use what little is gained from the increase in FPS.
I don't think you understand what I wrote but I'm not sure how to make it clearer.

Just to restate it though: the times we are talking about between pushing a button and seeing the results on screen are not .0167 seconds but .066 seconds. Double that to .132 and we are no longer talking about something that would only throw "superhumans" off their game if the playing field is not level, although maybe someone just mashing buttons because it's fun wouldn't care or notice. But in reality that .132 second delay is the absolute best case for a 30fps game. Some titles that are ostensibly "30fps" games can have more than .2 seconds of input lag, which puts them well into the category of any casual player with average human wiring knowing that something is off.

And please don't start with 40fps. What consumer hardware do you know that has a refresh rate of 40hz or even 20 or 80?

Edit: I just want to restate this to make it even more clear. The timing number you have, you have to multiply by a factor of minimum 3 (very rare), normal 4 (regular, which is the factor I used) or more than 5 (depressingly common). This is because it takes a game multiple frames to process your input and then display an image. Games are a feedback loop which is half shitty game engines and hardware that add lag, and half our slow meat brains processing what is happening and reacting.
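The multiply-by-the-pipeline point can be sketched directly; the 3-5 frame pipeline figures are the post's own estimates, not measurements:

```python
def input_lag_ms(fps, pipeline_frames):
    """Press-to-photons latency when the engine takes `pipeline_frames`
    frames to process input, render, and scan the image out."""
    return 1000.0 * pipeline_frames / fps

print(input_lag_ms(60, 4))   # ~66.7 ms: the finely tuned 60fps case above
print(input_lag_ms(30, 4))   # ~133 ms: the best-case 30fps figure
print(input_lag_ms(30, 6))   # 200 ms: laggy-engine/HDTV territory
```

The single-frametime difference (16.7 ms) gets multiplied through the whole pipeline, which is why the felt gap between 30 and 60 fps is far larger than one frame.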
 

The Heik

King of the Nael
Oct 12, 2008
1,568
0
0
Glademaster said:
The Heik said:
Daystar Clarion said:
The Heik said:
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is .03 seconds between each frame. At 60 FPS there are .0167 seconds between frames. That's already a pretty miniscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are .100 seconds, 6 times slower than the frametime at 60 FPS. If the the difference between 30 and 60 FPS is a workable amount for you, then congratulations you're officially superhuman, but no regular member of the homo sapiens species could actually utilize such a meager amount of time with any significant measurable difference in player capability.
Which would be a valid criticism if combos were down to individual frames, which they're not.
Actually they are, as ultimately all interactions are based upon the frames (which are each continuations of the game). Can't pull off a combo if there are no frames to progress the action.

My point though is that having 60 FPS over 30FPS does not add enough additional data data for even the fastest human brains to actually use. Knowing that a hadouken fireball is coming .0167 seconds faster is not going to measurably help in combat, because it still takes your brain at least .1 seconds (though on average it's more like .2) to be able to recognize it and react to it. Ergo, a higher frame rate does not mechanically change the game after 30-40 FPS. If it does for you, then it's probably just a mental placebo, not an actual tactical advantage.
There are places in maps you can only get to in CoD 2 if your fps is 333 or above so yes in games higher FPS does make a difference.
What are you on about?

First, how the heck does a certain frame rate suddenly make it so a player can access new parts of the map? Does going that fast suddenly make the player character able to phase through walls or something? Seriously, you are being incredibly vague about what is happening and how.

Second, I think you're pulling a strawman argument here. My post was about how 60 FPS doesn't increase one's capability to react properly, due to the human best of a .1 second reaction time putting a base limit on the player's ability to respond to new threats, so an additional .0167 seconds would not noticeably increase combat capability. You seem to be arguing that a higher frame rate somehow gives access to previously inaccessible parts of the game.

Not sure how you came to that conclusion from my post.
 

The Heik

King of the Nael
Oct 12, 2008
1,568
0
0
More Fun To Compute said:
I don't think you understand what I wrote but I'm not sure how to make it clearer.

Just to restate it though, the times we are talking about between pushing a button and seeing the results on screen are not .0167 seconds but .066 seconds. Double that to .132 and we are not talking about something that can only throw "super humans" off their game if the playing field is not level although maybe someone just mashing buttons because it's fun wouldn't care or notice. But in reality that .132 second delay is the absolute best case for a 30fps game. Some titles that are ostensibly "30fps" games can have more than .2 seconds of input lag which puts them well in the category of any casual player with average human wiring knowing that something is off.

And please don't start with 40fps. What consumer hardware do you know that has a refresh rate of 40hz or even 20 or 80?

Edit: I just want to restate this to make it even more clear. The timing number you have, you have to multiply it by a factor of minimum 3 (very rare), normal 4 (regular which is the factor I used) or more than 5 (depressingly common). This is because it takes a game multiple frames to process your input then display and image. Games are a feedback loop which is half shitty game engines and hardware that adds lag and half our slow meat brains processing what is happening and reacting.
*sigh*

Ok, let me make it clear for you. I am not talking about the technology here. I am not talking about the individual systems characteristics, or about how shitty certain technology is.

I am talking about how the human brain simply can't compute fast enough for a 60 FPS system to make any personally noticeable difference in combat capability over the basic 30 FPS model, if all other parameters are equal. It doesn't matter if it's 60 FPS or 60,000 FPS; there will always be at least a .1 second lag for the signals to reach the brain, for the brain to process the information, and then send a signal out to appropriately react. Remember, we humans see the world at the speed of light, and yet enough slow-motion capture shows how long it takes us to react to something in comparison to when it happens.

Once the FPS exceeds the brain's ability to properly process and respond to something, it honestly doesn't matter how fast the frames go by. You'll still react with the same speed. All a high frame rate does is make the picture look smoother. That's it. There is no mechanical difference, and you will still play with the same effective level of combat capability.

So please, stop harping on about specific tech platforms. It has nothing to do with what I am arguing, ergo it is completely and totally irrelevant to my original point.
 
Glademaster

Jun 11, 2008
5,331
0
0
The Heik said:
Glademaster said:
The Heik said:
Daystar Clarion said:
The Heik said:
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is .03 seconds between each frame. At 60 FPS there are .0167 seconds between frames. That's already a pretty miniscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are .100 seconds, 6 times slower than the frametime at 60 FPS. If the the difference between 30 and 60 FPS is a workable amount for you, then congratulations you're officially superhuman, but no regular member of the homo sapiens species could actually utilize such a meager amount of time with any significant measurable difference in player capability.
Which would be a valid criticism if combos were down to individual frames, which they're not.
Actually they are, as ultimately all interactions are based upon the frames (which are each continuations of the game). Can't pull off a combo if there are no frames to progress the action.

My point though is that having 60 FPS over 30FPS does not add enough additional data data for even the fastest human brains to actually use. Knowing that a hadouken fireball is coming .0167 seconds faster is not going to measurably help in combat, because it still takes your brain at least .1 seconds (though on average it's more like .2) to be able to recognize it and react to it. Ergo, a higher frame rate does not mechanically change the game after 30-40 FPS. If it does for you, then it's probably just a mental placebo, not an actual tactical advantage.
There are places in maps you can only get to in CoD 2 if your fps is 333 or above so yes in games higher FPS does make a difference.
What are you on about?

First, what the heck does a certain level of frame rate suddenly make it so a player can access new parts of the map? Does going that fast suddenly make the player character be able to phase through walls or something? Seriously, you are being incredibly vague with what is happening and how.

Second, I think you're pulling a strawman argument here. My post was about how 60 FPS doesn't increase one's capability to react properly, due to the human best of a .1 second reaction time putting a base limit on the player's ability to respond to new threats, so an additional .0167 seconds would not noticeably increase combat capability. You seem to be arguing about somehow a higher frame rate suddenly gives access to previously inaccessible parts of the game.

Not sure how you came to that conclusion from my post.
You're saying FPS does not measurably change gameplay. That is not a straw man. I am saying that it does, and the reason it does is that it affects your jump mechanics in the game: with higher fps you can fit your jump onto areas you normally can't get to. Also you can see a big difference between someone using a semi-auto weapon at 30, 60 and the usual server limit of 125, especially if people can roll their mouse properly.

So no, it is not a straw man, and as my source, I've played a lot of scrims with my old clans and zombie maps.

EDIT: Also automatic weapons shoot faster with higher frames.
 

Twilight_guy

Sight, Sound, and Mind
Nov 24, 2008
7,131
0
0
From what I've heard, in order to create the illusion of motion you need around 24 FPS. I suppose the human brain might be able to perceive a difference from 30 to 60 FPS, maybe smoother animation, like going from purple to fuchsia. At some point though changes in FPS become imperceptible, like going from 0x2299FF to 0x2299FE. People talking about FPS in the hundreds are probably arguing over something they physically cannot see.

As for me, I've played console and PC games all my life and I've never seen anything different enough for me to consciously go 'Yep, that's different'. I only notice when it drops below 24 FPS and I have a slide show. Of course then again I can barely notice the difference between SD and HD so maybe I'm just a weirdo. It means I can buy cheap stuff and be just as happy though!
 

Poetic Nova

Pulvis Et Umbra Sumus
Jan 24, 2012
1,974
0
0
I don't see much of a difference, but you're hearing this from a guy who used to play games at 15 fps.
 

The Heik

King of the Nael
Oct 12, 2008
1,568
0
0
Glademaster said:
The Heik said:
Glademaster said:
The Heik said:
Daystar Clarion said:
The Heik said:
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is .03 seconds between each frame. At 60 FPS there are .0167 seconds between frames. That's already a pretty miniscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are .100 seconds, 6 times slower than the frametime at 60 FPS. If the the difference between 30 and 60 FPS is a workable amount for you, then congratulations you're officially superhuman, but no regular member of the homo sapiens species could actually utilize such a meager amount of time with any significant measurable difference in player capability.
Which would be a valid criticism if combos were down to individual frames, which they're not.
Actually they are, as ultimately all interactions are based upon the frames (which are each continuations of the game). Can't pull off a combo if there are no frames to progress the action.

My point though is that having 60 FPS over 30FPS does not add enough additional data data for even the fastest human brains to actually use. Knowing that a hadouken fireball is coming .0167 seconds faster is not going to measurably help in combat, because it still takes your brain at least .1 seconds (though on average it's more like .2) to be able to recognize it and react to it. Ergo, a higher frame rate does not mechanically change the game after 30-40 FPS. If it does for you, then it's probably just a mental placebo, not an actual tactical advantage.
There are places in maps you can only get to in CoD 2 if your fps is 333 or above so yes in games higher FPS does make a difference.
What are you on about?

First, what the heck does a certain level of frame rate suddenly make it so a player can access new parts of the map? Does going that fast suddenly make the player character be able to phase through walls or something? Seriously, you are being incredibly vague with what is happening and how.

Second, I think you're pulling a strawman argument here. My post was about how 60 FPS doesn't increase one's capability to react properly, due to the human best of a .1 second reaction time putting a base limit on the player's ability to respond to new threats, so an additional .0167 seconds would not noticeably increase combat capability. You seem to be arguing about somehow a higher frame rate suddenly gives access to previously inaccessible parts of the game.

Not sure how you came to that conclusion from my post.
You're saying FPS does not measurably change gameplay. That is not a straw man. I am saying that is does and the reason it does it that it affects your jump mechanics in the game as with the higher fps you can fit your jump frame on to areas you normally can't get to. Also you you can see a big difference between someone using a semi auto weapon at 30 60 and the usual server limit of 125 especially if people can roll their mouse properly.

So no it is not a straw man and source I've played a lot of scrims with my old clans and zombie maps.
Ok, you seem to misunderstand the term gameplay here. Gameplay is how someone interacts with a game world. My argument is not about that. It is about how after 30 FPS (approximately the point where an individual would interpret a bunch of frames as a single continuous image), any additional FPS wouldn't add to reactionary capability. This is a principle equally applicable to film or even real life. In fact real life is probably the best example of what I'm arguing, as the human eye sees things at the speed of light (which is for all intents and purposes infinite FPS in this case) and yet there are still things in this universe that can occur faster than our minds can actively register all in one go. Reaction time is a fundamental limit of the human brain, and once what is happening exceeds that limit, it doesn't really matter how much it exceeds by.

And now that you've given me a bit more information on the CoD 2 FPS issue you've mentioned, I can figure out what it actually is. The situation you speak of is almost certainly a glitch caused by the higher frame rate changing how the game calculates where the player can land. CoD 2 was never meant to run at 333+ FPS, so the level design didn't account for it. It's a technological quirk, not proper game mechanics, so it is hardly relevant to this thread's discussion of 30 FPS as opposed to 60 FPS.
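For what it's worth, a physics step that runs once per rendered frame really can make jump height depend on frame rate. A toy sketch with semi-implicit Euler integration (the numbers and the integrator are illustrative assumptions, not CoD 2's actual code):

```python
def jump_peak(fps, v0=5.0, g=9.8):
    """Peak height of a jump when gravity is applied once per rendered
    frame (semi-implicit Euler). Smaller timesteps lose less height to
    discretisation, so a higher frame rate jumps slightly higher."""
    dt = 1.0 / fps
    y, v, peak = 0.0, v0, 0.0
    while v > 0 or y > 0:
        v -= g * dt          # gravity applied once per frame
        y += v * dt          # position advanced with the new velocity
        peak = max(peak, y)
    return peak

print(jump_peak(30), jump_peak(333))  # the 333 fps jump clears more height
```

The difference is only a few centimetres in this toy model, but in a game tuned so a ledge is just barely out of reach, that's exactly the kind of margin an fps-dependent physics step can flip.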
 

More Fun To Compute

New member
Nov 18, 2008
4,061
0
0
The Heik said:
*sigh*

Ok, let me make it clear for you. I am not talking about the technology here. I am not talking about the individual systems characteristics, or about how shitty certain technology is.

I am talking about how the human brain simply can't compute fast enough for a 60 FPS system to make any personally noticeable difference in combat capability over the basic 30 FPS model, if all other parameters are equal. It doesn't matter if it's 60 FPS or 60,000 FPS, there will always be at least a .1 second lag for the signals to reach the brain, for the brain to process the information, and then send a signal out to appropriately react. Remember, we human see the world at the speed of light, and yet enough slow motion capture shows how long it takes us to react to something in comparison to when it happens.

Once the FPS exceed the brain ability to properly process and respond to something, it honestly doesn't matter how fast the frames go by. You'll still react with the same speed. All a high frame rate does it make the picture look smoother. That's it. There is no mechanical difference, and you will still play with the same effective level of combat capability.

So please, stop harping on about specific tech platforms. It has nothing to do with what I am arguing, ergo it is completely and totally irrelevant to my original point.
You are just wilfully trying to misunderstand now.

There is a limit to how fast the nervous system can process information. There is a limit to how fast the technology can process information. In a game they work together in a feedback loop. When people make games they have some control over how fast the game code can process information and one of the best and most reliable ways to do that is to switch from 30fps to 60fps.

Let's say I take on average .11 s to respond and I'm playing spot-the-sausage at 120fps, which adds a further .025 s to the time it takes to react, because I believe that higher frame rates are gud.

You are playing against me with your average .1 s reaction time, but you are running at an unstable 30fps because you have science on your side and know that a mere rise in framerate makes no difference, when in reality it adds .15 s to your time to react.

My average time to spot the sausage is .135 and yours is .25 so I win the sausage spotting tournament despite having worse reactions.
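Checking that arithmetic:

```python
# Player A: slower reactions, but a clean 120 fps pipeline.
player_a = 0.110 + 0.025    # reaction + input lag = 0.135 s
# Player B: faster reactions, but an unstable 30 fps pipeline.
player_b = 0.100 + 0.150    # reaction + input lag = 0.250 s
print(player_a, player_b)   # A sees and reacts first despite worse reflexes
```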
 

Noswad

New member
Mar 21, 2011
214
0
0
It's because all PC gamers are secretly pigeons and therefore need the extra FPS so the games do not look like slide shows.

No, but in all seriousness, the difference it makes does not even register in my "buy it on console or buy steroids for my PC" argument.
 
Jun 11, 2008
5,331
0
0
The Heik said:
Glademaster said:
The Heik said:
Glademaster said:
The Heik said:
Daystar Clarion said:
The Heik said:
Daystar Clarion said:
Yeah, some of the combos are insanely tight in that game, you'd never pull them off at 30FPS.
I call bullshit on this statement.

At 30 FPS, there is .033 seconds between each frame. At 60 FPS there is .0167 seconds between frames. That's already a pretty minuscule difference, but it's made all the more irrelevant mechanically by the fact that the world's fastest human reaction times are .1 seconds, 6 times slower than the frametime at 60 FPS. If the difference between 30 and 60 FPS is a workable amount for you, then congratulations, you're officially superhuman, but no regular member of the homo sapiens species could actually utilize such a meager amount of time with any significant measurable difference in player capability.
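The frame-time figures quoted here are just reciprocals of the frame rate; a quick sketch (my own illustration, not from the thread):

```python
# Frame interval in seconds for a given frame rate.
def frame_interval(fps):
    return 1.0 / fps

t30 = frame_interval(30)    # ~0.0333 s between frames
t60 = frame_interval(60)    # ~0.0167 s between frames

# Going from 30 to 60 fps shaves roughly one 60 fps frame off the
# worst-case wait before new information can appear on screen:
print(round(t30 - t60, 4))  # 0.0167

# A .1 s reaction time spans about 6 frames at 60 fps:
print(round(0.1 / t60))     # 6
```

This is where the "6 times slower" figure comes from: .1 s of reaction time covers about six 60 fps frames.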
Which would be a valid criticism if combos were down to individual frames, which they're not.
Actually they are, as ultimately all interactions are based upon the frames (each of which is a continuation of the game state). You can't pull off a combo if there are no frames to progress the action.

My point though is that having 60 FPS over 30 FPS does not add enough additional data for even the fastest human brains to actually use. Knowing that a hadouken fireball is coming .0167 seconds sooner is not going to measurably help in combat, because it still takes your brain at least .1 seconds (though on average it's more like .2) to recognize it and react to it. Ergo, a higher frame rate does not mechanically change the game past 30-40 FPS. If it does for you, then it's probably just a mental placebo, not an actual tactical advantage.
There are places in maps you can only get to in CoD 2 if your fps is 333 or above so yes in games higher FPS does make a difference.
What are you on about?

First, how the heck does a certain frame rate suddenly let a player access new parts of the map? Does going that fast let the player character phase through walls or something? Seriously, you are being incredibly vague about what is happening and how.

Second, I think you're pulling a strawman argument here. My post was about how 60 FPS doesn't increase one's capability to react properly, because the human best-case reaction time of .1 seconds puts a base limit on the player's ability to respond to new threats, so an additional .0167 seconds would not noticeably increase combat capability. You seem to be arguing that a higher frame rate somehow gives access to previously inaccessible parts of the game.

Not sure how you came to that conclusion from my post.
You're saying FPS does not measurably change gameplay. That is not a straw man. I am saying that it does, and the reason is that it affects your jump mechanics in the game: with a higher fps you can fit your jump frames onto areas you normally can't reach. Also, you can see a big difference between someone using a semi-auto weapon at 30, 60, and the usual server limit of 125 fps, especially if people can roll their mouse properly.

So no, it is not a straw man, and as for a source: I've played a lot of scrims with my old clans and zombie maps.
Ok, you seem to misunderstand the term gameplay here. Gameplay is how someone interacts with a game world. My argument is not about that. It is about how after 30 FPS (approximately the point where an individual interprets a succession of frames as a single continuous image), any additional FPS wouldn't add to reactionary capability. This is a principle equally applicable to film or even real life. In fact real life is probably the best example of what I'm arguing, as the human eye sees things at the speed of light (which is for all intents and purposes infinite FPS in this case), and yet there are still things in this universe that can occur faster than our minds can actively register all in one go. Reaction time is a fundamental limit of the human brain, and once what is happening exceeds that limit, it doesn't really matter by how much it exceeds it.

And now that you've given me a bit more information on the CoD 2 FPS issue you've mentioned, I can figure out what it actually is. The situation you speak of is almost certainly a glitch caused by the higher frame rate changing how the game calculates where the player can land. CoD 2 was never meant to run at 333+ FPS, so the level design didn't account for it. It's a technological quirk, not a proper game mechanic, so it is hardly relevant to any part of this thread's discussion of 30 FPS as opposed to 60 FPS.
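The kind of frame-rate-dependent physics glitch described here is easy to reproduce in a toy simulation. The sketch below is my own illustration, not CoD 2's actual movement code (which is not public), and the direction of the error depends on the integrator; it uses explicit Euler stepping, where the jump apex genuinely varies with the timestep, i.e. with the frame rate:

```python
# Toy jump simulation stepped once per frame with explicit Euler.
# Because position is advanced with the previous frame's velocity,
# the apex height depends on the timestep, so players running at
# different frame rates reach (slightly) different heights.
def jump_apex(fps, v0=4.0, g=9.8):
    dt = 1.0 / fps
    y, v, apex = 0.0, v0, 0.0
    for _ in range(100000):
        y += v * dt          # advance position with current velocity
        v -= g * dt          # then apply gravity to the velocity
        apex = max(apex, y)
        if y <= 0.0:         # landed
            break
    return apex

print(jump_apex(30), jump_apex(333))
```

With explicit Euler the apex overshoots the analytic value (v0²/2g) by roughly v0·dt/2, so the two frame rates land on measurably different heights. A real engine avoids this class of bug by running physics on a fixed timestep decoupled from the render rate; CoD 2 evidently tied at least some movement math to the frame rate, which is consistent with the 333 fps jump spots described above.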
Ok, I'm going to give this one more whack. The FPS in CoD directly changes how you affect the game world, by letting you shoot faster and giving a more accurate and higher jump, glitch or not. Personally, it is hard for me to notice a change in FPS when it goes up: if I had been playing at 30 fps for a while and then it went up to 60 fps, I would not notice it as much, not having watched various FPS comparison animations online. The reverse is not true. 60 FPS is a lot more fluid and gives you a clearer representation of the game world, allowing you and the game to react to each other more quickly.

So yes, frame rate does affect how accurately gameplay is portrayed, in terms of how quickly the game relates my input to the world, although online ping affects this too. So given that this affects the accuracy of how things occur in the game, I would say yes, it does affect a person's reaction time relative to the game world, regardless of the individual's perception of the fluidity of the frame difference.

The more sharp and quick movement there is, the bigger the difference FPS makes. More importantly, fast paced games feel a hell of a lot different, and smoother, at higher frame rates.