Idea: What if console users were given a choice in framerate?


Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
CrystalShadow said:
I cannot remember a single PC game where the difference between minimum settings and maximum really ever amount to much more than 20 fps, maybe 30 if you're lucky...
When was the last time you played a AAA title? Because I cannot remember the last game where that wasn't the case, outside of very badly done console ports that have basically no meaningful settings. Alternatively, it's possible you just have a very weak PC, so the difference is low because it can't run the game well either way.

CrystalShadow said:
People say they prefer higher framerates, but if they saw the consequences for what games end up looking like as a result, would they really think it worth it in the end?
Almost every PC gamer I know already makes this choice, so yeah, we would. But really, there is no reason we cannot have both in 2015; the tech is certainly there.

MonsterCrit said:
Nope just your bias towards bigger numbers.
Yes, I admit I am biased towards a better gaming experience.

Only to those who deem it as such. Again, a comic designed around being black and white will look better than one that was just a desaturated colour comic. You seem to be unaware that many so-called 60fps games aren't truly 60; they're 30fps games with doubled frames, i.e. they aren't showing more incremental movement frames, they're just holding each frame on the screen twice as long. :p
Vegetables are objectively healthier than McDonald's burgers even if someone prefers to eat at McDonald's. Someone's bad choices do not prevent some things from being better than others.

Also, lol, that's absolute nonsense. 60 fps games are drawing 60 different frames. In some games there are animations locked to 30 fps, and pretty much every time this happens it's jarring and people complain en masse about it. What you are talking about here is the interlaced display method, which went out of fashion in the 90s, and I'm not aware of even a single game this generation that uses it (there were a few last generation, but only on consoles).

the joke is that without the fps counter on the screen, most of the fps elite couldn't tell the fps of their games.
Tell the exact framerate - perhaps not. Tell a clear difference between 30 and 60 - existing tests already prove people can.

Here's a quick test for you, in a videogame: http://www.30vs60fps.com/

Not in the way you think, though. And remember what I said: the ROI on that decreases. The difference in effect between 100 and 200 would basically be appreciable by maybe half a percent of the human population, and even for them the difference is slight. In fact the joke is, sometimes having more frames makes the movement look more jerky. See, since your eyes are discarding frames there's no real way of telling which frames your eyes are dropping, so if it's dropping the wrong one... every so often you can get what are perceived as weird hitches. The truth is 100fps and above have been rather extensively studied... in the area of film production.
ROI is decreasing, but that dropoff is way above the 60 fps line. With the 90-100 estimate I think you may be correct, but the discussion here is 30 vs 60, where the difference is very pronounced.

Btw, eyes do not "drop" frames. Humans don't see in frames; they see constant fluid motion. It's about how quickly your brain can process what you see. More frames never make the motion look more jerky unless there are bugs where the animations mess up at high framerates (like physics being tied to framerate in Skyrim).

In fact I'd almost be willing to bet you couldn't tell the difference between 90fps and 120fps. I'm willing to bet that you and most others placed under double or reverse blind conditions wouldn't know the difference.
Well, we know from military tests that people do tell the difference. Also, stop shifting the goalposts; the thread is about 30 fps vs 60 fps, not 90 fps vs 120 fps.


spartan231490 said:
Unsurprisingly, you missed the point. Yes, it's "objectively" better, but it doesn't matter because you can't subjectively tell a substantial difference. Sure, if you're one of the .0001% of gamers who plays reaction based games at a professional level, that 60 fps is gonna matter. For the rest of us, it just doesn't. If you got out of your own way and let go of your confirmation bias you wouldn't have any problems whatsoever playing at the same level and same experience at 30fps.
Yes, I can, and so can you, and so can everyone with correctly functioning vision. It's more like the 0.0001% who have vision problems who cannot tell the difference. I've played various games at framerates from as low as 12 to as high as 144. I know the differences, and there are plenty.
 

ultravio1et

New member
Nov 12, 2015
3
0
0
This was available in BioShock.

There is a button to disable V-Sync for a more stable frame-rate. Shame they didn't have that feature while playing 4-player GoldenEye :)
 

spartan231490

New member
Jan 14, 2010
5,186
0
0
Strazdas said:
spartan231490 said:
Unsurprisingly, you missed the point. Yes, it's "objectively" better, but it doesn't matter because you can't subjectively tell a substantial difference. Sure, if you're one of the .0001% of gamers who plays reaction based games at a professional level, that 60 fps is gonna matter. For the rest of us, it just doesn't. If you got out of your own way and let go of your confirmation bias you wouldn't have any problems whatsoever playing at the same level and same experience at 30fps.
Yes, I can, and so can you, and so can everyone with correctly functioning vision. It's more like the 0.0001% who have vision problems who cannot tell the difference. I've played various games at framerates from as low as 12 to as high as 144. I know the differences, and there are plenty.
I play games at both 30 and 60 fps every day. If they're not side by side, most people can't really tell the difference. It's just confirmation bias. Even for those that can, that tiny difference isn't going to affect their experience, unless, as stated, they're playing games at a ridiculously high level.
 

Rack

New member
Jan 18, 2008
1,379
0
0
It really isn't as easy as you might think. The current gen of consoles have pretty weak CPUs, so getting to 60fps isn't going to be as simple as dropping shadow quality, AA, resolution or what have you. It would be a similar effect to the one CrystalShadow mentioned, where dropping the graphics presets doesn't affect the framerate much. If a game is CPU-limited, most of the options will have a limited impact on the framerate, and console games are usually built with the limitations of the system in mind.
 

BeerTent

Resident Furry Pimp
May 8, 2011
1,167
0
0
Lightspeaker said:
BeerTent said:
What's not plenty fine is going around and saying, "you don't care about this thing that clearly doesn't fucking matter unless it's below 20? Get out, scrub!" Of all of the dumb shit I've seen today, some responses in here rank up there. Most people want to pop a disc in, double click the exec, and have fun. Not all of us are competing in our singleplayer, or even multiplayer games, where the push for a higher framerate actually matters.

Fellow master race... You... *Sigh*
You claim that FPS doesn't matter 'unless it's below 20' and then outright state the exact circumstances in which it DOES matter. Directly contradicting your point.

Then act all exasperated with people who are stating that it does, in fact, matter.

............really?


And will people seriously stop saying "oh its just about making the game prettier". No its not. Textures just make the game prettier. Shaders just make the game prettier. Anti-aliasing just makes the game prettier. Framerates directly affect your game performance. It DOES matter. Factually it matters and it impacts your gameplay experience. Whether you CARE that it matters is an entire other thing. But just because you don't care doesn't mean it doesn't matter. Lots of people don't care about microtransactions in full-price games and exploitative DLC practices, but that doesn't mean they don't matter and that they're not an issue worth addressing.
Okay, let me reiterate because your bold typeface missed a key part of that sentence... FOX NEWS.

Competitive multiplayer, is where it might matter. On PCs where everyone is a special snowflake, having an additional 30 frames will lead to a more responsive player and quicker reaction times than the player at 60fps. I'm a little out of it, but last I checked, I haven't seen a whole lot of consoles in a lot of those competitive gaming scenes. I've yet to hear of a clan forming for a console game, and setting up weekly practice games with other clans. I've heard of a Halo League, but was that for console, or PC? Probably PC.

If every single player is the exact same carbon copy of each-other, locked at exactly 30 frames a second I fail to see how it's going to matter. Especially when there's no shiny gold sticker star for any of those twitchy shooters that Consoles... Can't really produce because they've got two joysticks and no mouse. FPS players adapt differently, going for a more timing based approach to putting bullets in other things.

Finally, I don't play multiplayer games competitively. Unless you do, this FPS debate is practically fucking moot. My machine cost me $400 and FO4 can run (minus bugs and hitches) at 60 and look alright. Having 60 more frames than me is really giving you a biiig advantage against that dog AI.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
spartan231490 said:
I play games at both 30 and 60 fps every day. If they're not side by side, most people can't really tell the difference. It's just confirmation bias. Even for those that can, that tiny difference isn't going to affect their experience, unless, as stated, they're playing games at a ridiculously high level.
Yes, they can. The difference is massive enough to have actual gameplay benefits. You're just talking a bunch of nonsense and hoping it will stick. Framerate problems have always plagued videogames, but they were usually excused by poor hardware. We don't have that excuse anymore; there is no reason whatsoever to ever play a game at 30 fps in 2015.

BeerTent said:
Competitive multiplayer, is where it might matter. On PCs where everyone is a special snowflake, having an additional 30 frames will lead to a more responsive player and quicker reaction times than the player at 60fps. I'm a little out of it, but last I checked, I haven't seen a whole lot of consoles in a lot of those competitive gaming scenes. I've yet to hear of a clan forming for a console game, and setting up weekly practice games with other clans. I've heard of a Halo League, but was that for console, or PC? Probably PC.
So I take it you've never heard of Halo, or Destiny, or Titanfall, or Call of Duty, or a whole number of other multiplayer games on consoles?

If every single player is the exact same carbon copy of each-other, locked at exactly 30 frames a second I fail to see how it's going to matter.
If everyone is starving it doesn't matter, since everyone is equal, right? Modern equality at its finest: let's drag everyone down to the worst level we can, for equality!

Unless you do, this FPS debate is practically fucking moot.
The FPS debate is moot, but not for that reason. It's moot because higher FPS is objectively better and no amount of whining about it is going to change that. It has an effect in EVERY game; yes, even in singleplayer quick-time-event games like Until Dawn and the Telltale games.

Having 60 more frames than me is really giving you a biiig advantage against that dog AI.
It's not about advantage; it's about gameplay. The game plays smoother and is more responsive, thus it is far nicer to play; it "feels right". Even frames above what your monitor can display give you a responsiveness increase and are not moot, though ones you can see are of course better.
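
To put a rough number on that responsiveness claim, here is a minimal back-of-the-envelope sketch (Python, purely illustrative: it assumes a perfectly steady framerate and ignores display and engine latency):

```python
# At a steady framerate, each frame persists for 1000/fps milliseconds, and on
# average an input arrives halfway through a frame, so it waits about half a
# frame time before the next frame can even begin to reflect it.

def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    ft = frame_time_ms(fps)
    print(f"{fps} fps: {ft:.1f} ms per frame, ~{ft / 2:.1f} ms average wait for the next frame")

# Prints roughly:
# 30 fps -> 33.3 ms per frame (~16.7 ms wait)
# 60 fps -> 16.7 ms per frame (~8.3 ms wait)
# 120 fps -> 8.3 ms per frame (~4.2 ms wait)
```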
 

BeerTent

Resident Furry Pimp
May 8, 2011
1,167
0
0
Strazdas said:
[Children are starving in Ethiopia, therefore we need a higher framerate. Equality!]
Okay, because you're Strazdas, and I like your posts I'm gonna keep takin' ya seriously. I shouldn't! But I am.

Also, I think there's a fuckup in communication on my part.

When I talk Competitive multiplayer, I'm talking THIS! [http://www.godisageek.com/wp-content/uploads/ESL-One-Katowice-.jpg]

I'm not talking about this. [https://haloreachinfo.files.wordpress.com/2010/04/matchmaking.jpg] Or this, [http://cdn-static.gamekult.com/gamekult-com/images/photos/30/50/17/19/payday-2-screenshot-ME3050171967_2.jpg] Or going to the local EBgames to win one of these [http://www.jeancoutu.com/Global/Cartes-prepayees/ebgames-bouquet.png] once every 2 years.

This is why I say, we need that higher frame rate in competitive multiplayer. Because you need just about every edge you can get. Even the tiniest ones. But maybe this is an antiquated PC player view, where we all played DoD in Mom's basement, hoping to win STA.

Next up, going with OP's suggestion of having options for graphics vs options for framerate. It's just not going to fly on consoles. Those machines are no more powerful than mine and that's goddamn pitiful. When you're building for the general audience, you want your game to be as pretty as humanly possible, but when you're building for the cheapest budget computer, corners need to be cut. Not to mention most console players don't want more options. They want to just press X for new game and go. I can't stand consoles because of the controller, and I couldn't remap my keys. Why spend more money and time when we can just have one uniform decision with regard to graphics?

Which brings me to multiplayer. We can all agree that the person running at 10fps in any game is going to get his ass kicked by other players. What if we played Halo, and because you went to the options and set your framerate to 60, you started rolling all of the 30 FPS players? We're separating the community in two. Those who want visuals may not want to play with those who have a slightly better reaction time.

Now, I'm okay with getting people on consoles up to 60, fuck, make it 120 frames. But having it as an option in the game is a dumb-ass idea. That's where I'm having trouble with it. But with the overall power of the machines in question, and the graphics demands of some games, console players get quite a big handicap to handle in the different control scheme and lower framerate. The higher framerate isn't as much of a pressing matter for consoles as it is for PC. We should also factor in that consoles are carbon copies. Their games also need to be carbon copies. PCs are completely unique, so we need those settings to tweak as we see fit.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Wasted said:
inu-kun said:
So, "I'd like shittier graphics to get a frame rate that I won't see the difference without a side by side comparison and doesn't matter anyways since I'm over a meter away from the TV"?

Care to explain the logic in it?
If you cannot see a difference between 30/60/120+ fps then you should consider seeing a neurologist, something could honestly be wrong with the way your brain processes visual stimuli. Whether or not you care between high and low frame rates is a separate matter. A silly one, but still ultimately down to personal choice.

"30fps is more cinematic!"
No, you misunderstand him and people like me. It's not that we can't see a difference. It's that we can and we literally do not give a shit. If the game is fun then 30 fps or 60 fps isn't going to tip the scale. It is a bonus to have a higher frame rate, but it's a small one to the point where we're not going to drop on the ground spinning and crying in some sort of curly-the-stooge-esque tantrum over it. We aren't going to flee to the internet and complain about how the motion wasn't as entirely fluid as it could have been.

At some point, you've got to understand that some people are graphiophiles and some people aren't. I also almost entirely ignore video game sound and music as a feature, and that would be problematic for audiophiles to hear. If the game drops below 30 fps then we can start to talk about damaging the game, but at 30 fps and above the benefits diminish significantly: the jump from 15 fps to 30 fps is immense, whereas the jump from 30 fps to 60 fps is big but not so much if you view them separately and aren't a snob about this kind of thing.

Keep in mind, I have a powerhouse PC and I have consoles, and not once have I ever decided to buy a game on PC just because of frame rates. It's almost always going to be because of mouse/keyboard control or the ability to mod the game. Not graphics. Or, you know, because of Steam sales.

If you said, "Do you want 30 fps or 60 fps" we would say "60 fps". It just doesn't matter as much to us as other things. For example, I may sacrifice framerate in order to have better graphics in other areas. I just won't sacrifice below 30 fps.
 

DarklordKyo

New member
Nov 22, 2009
1,797
0
0
I'd choose 60 FPS over pretty visuals if possible (in fact, I minimized the graphical options in Shadow of Mordor to get 1080p 60). I don't mind 30, 30's perfectly playable, but 60 is definitely the way to go whenever possible.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Strazdas said:
the joke is that without the fps counter on the screen, most of the fps elite couldn't tell the fps of their games.
Tell the exact framerate - perhaps not. Tell a clear difference between 30 and 60 - existing tests already prove people can.

Here's a quick test for you, in a videogame: http://www.30vs60fps.com/
It's pretty easy to tell things apart when they're side by side or if you've just looked at an identical example next to another. It's another thing to tell the difference when you've only got your one example and no fps counter to tell you otherwise.

However, while there was a difference between 30 and 60 (60 seemed almost like it was turning slower), that can't have been a good example because the difference was so minute and not inherently "better". There's got to be a better example than that game and spinning.

Studies have shown that overall we do tend to enjoy games with higher FPS. So it is better to have one over the other. It's just that the difference between 30 and 60 is nothing compared to the difference between 15 and 30.
 

spartan231490

New member
Jan 14, 2010
5,186
0
0
Strazdas said:
spartan231490 said:
I play games at both 30 and 60 fps every day. If they're not side by side, most people can't really tell the difference. It's just confirmation bias. Even for those that can, that tiny difference isn't going to affect their experience, unless, as stated, they're playing games at a ridiculously high level.
Yes, they can. The difference is massive enough to have actual gameplay benefits. You're just talking a bunch of nonsense and hoping it will stick. Framerate problems have always plagued videogames, but they were usually excused by poor hardware. We don't have that excuse anymore; there is no reason whatsoever to ever play a game at 30 fps in 2015.
Yeah, hold on to that dream. There are no significant downsides to playing at 30fps. Sure, if you get too much lower than that you'll run into dropped frames and all kinds of issues, but a stable 30 isn't really going to be an issue. But you go ahead and keep pretending that it's a major issue, I no longer care.
 

Lightspeaker

New member
Dec 31, 2011
934
0
0
BeerTent said:
Okay, let me reiterate because your bold typeface missed a key part of that sentence... FOX NEWS.

Competitive multiplayer, is where it might matter.
Ah, insults. Great way for you to start. Actually I deliberately DIDN'T bold that part because... well... it's utterly irrelevant.

All PvP multiplayer is inherently competitive. It's tautological to specify. Even vs AI you're challenging yourself alongside other people against the system; less competitive than PvP, but even so (MMO raids are a great example here). PvP in its very existence goes much, much further. To somewhat draw parallels with an earlier point I made: you personally might not care that it's competitive or play it that way, but PvP is inherently about matching your abilities at a game against someone else.


On PCs where everyone is a special snowflake, having an additional 30 frames will lead to a more responsive player and quicker reaction times than the player at 60fps.
Exactly. Which is the absolute start and end of my point; that is an absolute fact, so I'm rather confused why you're arguing it. Not sure where you're getting the 'special snowflake' thing from, but it is an outright fact that improved frames lead to improved response time. Where's the argument again? Although you can remove the "PC" part; it's inherent to hardware setups generally. Reducing any kind of delay is only ever a positive.


I'm a little out of it, but last I checked, I haven't seen a whole lot of consoles in a lot of those competitive gaming scenes. I've yet to hear of a clan forming for a console game, and setting up weekly practice games with other clans. I've heard of a Halo League, but was that for console, or PC? Probably PC.
Then you're misinformed. Halo, as an FPS, hasn't existed on PC since Halo 2 was a flop (most likely due to its Vista-only status). That was eleven years ago. There are plenty of competitively played titles on consoles, though there are few titles as big as the likes of CSGO, DOTA2, LoL or even SC2. I seem to recall watching an Evolve tournament played on the Xbox One. Quite crucially, many fighting games (perhaps the only type of game outside of FPS for which frame rate and reduced response time are EVEN MORE IMPORTANT) are played on various consoles.

A great example is SSBM, which, to play competitively properly, requires a GameCube, a Nintendo GameCube controller and an absolutely ancient CRT TV. I'm not a big Smash fan, but a while back I considered dabbling until I started looking into it and realised the startup requirements to do it properly would be a pain. Why a CRT TV over a modern flatscreen? Apparently because CRT TVs have reduced input lag compared to LCD screens, and to play competitively you actually need to be damn near frame-perfect. It's that precise.


If every single player is the exact same carbon copy of each-other, locked at exactly 30 frames a second I fail to see how it's going to matter. Especially when there's no shiny gold sticker star for any of those twitchy shooters that Consoles... Can't really produce because they've got two joysticks and no mouse. FPS players adapt differently, going for a more timing based approach to putting bullets in other things.
What's your point here? Being able to put up with a poor framerate is not the same as claiming it has no effect. It DOES have an effect. This is fact. Anything beyond that is merely your personal feelings on the matter. But again, just because you don't care doesn't mean it's not an issue.


Finally, I don't play multiplayer games competitively. Unless you do, this FPS debate is practically fucking moot. My machine cost me $400 and FO4 can run (minus bugs and hitches) at 60 and look alright. Having 60 more frames than me is really giving you a biiig advantage against that dog AI.
Multiplayer is inherently competitive, as above. Just because you don't care that it is and don't play it that way doesn't invalidate that. I really don't care in the slightest how much your machine cost and whether or not it can run Fallout 4 at any particular framerate. The simple fact is that lower framerate reduces your capabilities.

Again, I'm not trying to force you to give a damn about this. I'm merely pointing out that it's totally irrelevant what you think of it. Because the fact is that it matters to game performance and your abilities in playing a game which requires any kind of reactions at all; this is a hard, cold fact. It may not affect the way you choose to play to enough of an extent for you to care, and it may not be important for any one particular game, but for many it is. You can choose to ignore it or you can choose to support moves to make 60FPS standard. What you cannot do, because it is factually incorrect, is state that it has no effect at all.

Let's just try for an analogy here. Football pitches are very carefully maintained because a poor quality pitch can result in all kinds of problems. Sure, you CAN technically play football in a damn marsh if you really want to, everyone struggling about and kicking the ball through water and mud. But you're never going to be as good there as you can be on a well maintained pitch. And to argue that there's no difference is just factually incorrect.

Or if you want a more computer-based example, here's another that you probably don't care about: mouse settings. Mouse settings are absolutely crucial to any kind of competitive game which involves reaction times. Skipping over anything more complicated, on a basic level you need mouse acceleration off, you need an appropriate level of sensitivity set in the game, and you need to marry that with an appropriate DPI setting for your mouse, which, for an absolutely optimal setup, should be a native DPI rather than a scaled one. Any person getting into CSGO in any kind of serious "I want to learn this game" way will need to sit down for a few hours and tinker with their mouse settings to get it set up right, and keep trying it over and over until they're comfortable with it and are sure it's not skipping pixels. This is a process that will improve your capabilities in every game you play if you take the time to set it up right. Are you likely to give a damn about ANY of that when all you're doing is running around Fallout 4 blasting everything in sight? No. Does it give you a very real, factual advantage to do so? Yes. You might not care about it, but it is factually beneficial to the player who does it.
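
As a concrete illustration of how in-game sensitivity and mouse DPI "marry", here is a rough sketch in Python. It assumes the Source-engine default of 0.022 degrees of view rotation per mouse count; other engines use different constants, and the function name is just illustrative:

```python
# How far the mouse must travel on the pad for a full 360-degree turn,
# given the mouse DPI and the in-game sensitivity. Assumes the Source-engine
# default of 0.022 degrees of view rotation per mouse count (m_yaw).

def cm_per_360(dpi: float, sensitivity: float, yaw_deg_per_count: float = 0.022) -> float:
    counts_per_360 = 360.0 / (yaw_deg_per_count * sensitivity)  # mouse counts per full turn
    inches = counts_per_360 / dpi                               # counts -> inches of mouse travel
    return inches * 2.54                                        # inches -> centimetres

# Different DPI/sensitivity pairs give the same effective sensitivity;
# staying at the sensor's native DPI and tuning in-game avoids scaled DPI.
print(cm_per_360(dpi=400, sensitivity=2.0))   # ~52.0 cm per full turn
print(cm_per_360(dpi=800, sensitivity=1.0))   # ~52.0 cm per full turn
```

The point is that the physical distance per turn is what a player actually trains on, so the same feel can be kept while moving to the sensor's native DPI and adjusting in-game sensitivity to compensate.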


spartan231490 said:
Yeah, hold on to that dream. There are no significant downsides to playing at 30fps. Sure, if you get too much lower than that you'll run into dropped frames and all kinds of issues, but a stable 30 isn't really going to be an issue.
It is factually inferior. So...yeah, there are significant downsides. Anyone who plays a competitive game in any kind of serious way (i.e. to the point of actually learning and trying to improve at it) is going to find improved framerate beneficial.

Try asking JW or kennyS or GuardiaN if they'd mind trying to AWP at half the framerate because there's "no significant downside". They'd probably just laugh.
 

DanteLives

New member
Sep 1, 2011
267
0
0
Kingjackl said:
I recall hearing The Last of Us Remastered did give you a choice between 60 and 30fps (the original game only had 30). Most remasters of last-gen games tend to sit at 60, with a few exceptions that are generally regarded as worse for it.
Infamous Second Son and First Light have this option too.
 

babinro

New member
Sep 24, 2010
2,518
0
0
Giving players customization options is the REAL next generation of gaming.

I agree with the OP that this would be a great choice to give the player. Of course, I'm of the mindset that every single game released should give you the kind of options you have in a Civilization or EA Sports title.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
BeerTent said:
When I talk Competitive multiplayer, I'm talking THIS! [http://www.godisageek.com/wp-content/uploads/ESL-One-Katowice-.jpg]

I'm not talking about this. [https://haloreachinfo.files.wordpress.com/2010/04/matchmaking.jpg] Or this, [http://cdn-static.gamekult.com/gamekult-com/images/photos/30/50/17/19/payday-2-screenshot-ME3050171967_2.jpg] Or going to the local EBgames to win one of these [http://www.jeancoutu.com/Global/Cartes-prepayees/ebgames-bouquet.png] once every 2 years.

This is why I say, we need that higher frame rate in competitive multiplayer. Because you need just about every edge you can get. Even the tiniest ones. But maybe this is an antiquated PC player view, where we all played DoD in Mom's basement, hoping to win STA.

Next up, going with OP's suggestion of having options for graphics vs options for framerate. It's just not going to fly on consoles. Those machines are no more powerful than mine and that's goddamn pitiful. When you're building for the general audience, you want your game to be as pretty as humanly possible, but when you're building for the cheapest budget computer, corners need to be cut. Not to mention most console players don't want more options. They want to just press X for new game and go. I can't stand consoles because of the controller, and I couldn't remap my keys. Why spend more money and time when we can just have one uniform decision with regard to graphics?

Which brings me to multiplayer. We can all agree that the person running at 10fps in any game is going to get his ass kicked by other players. What if we played Halo, and because you went to the options and set your framerate to 60, you started rolling all of the 30 FPS players? We're separating the community in two. Those who want visuals may not want to play with those who have a slightly better reaction time.
While there is no way console tournaments can match Katowice numbers (it was, after all, the largest tournament ever), there certainly exist tournaments for console gamers.

For example this [http://www.majorleaguegaming.com/blog/xbox-one-default-console-online-2k-tournaments] or this [http://www.majorleaguegaming.com/news/console-update-for-call-of-duty-ghosts-tournament-at-the-mlg-championship-in-anaheim-june-20-22]; even Red Bull [http://www.redbull.com/en/esports/stories/1331686214942/ps4-and-xbox-one-committing-to-next-gen-esports] got in on it. Console tournaments just happen to be much smaller because they are isolated to console players only, since when they mix it ends badly for console gamers [http://www.dailydot.com/esports/titanfall-exertus-free-refills-pc-console/].

So yes, there is definitely a competitive scene on consoles. It's not as big because the player count is smaller, but it exists.

You are correct that it would be a tradeoff of graphics for framerate, but there are many people who would welcome such a tradeoff, and it being optional would mean there is no loss for those who want no tradeoff. But perhaps you are right; maybe console players want to be limited and have no choices (they have, after all, bought a console).

If they don't want to play with people who choose a higher framerate, tough shit; that's their choice. Like I said, we shouldn't bring everyone down just because one person is down. Bring that person up instead.

You are working under the false assumption that a higher framerate benefits only competitive gamers. In reality, however, it benefits everyone, even those claiming they can't see the difference, because the difference is not seen, it is felt. The game controls more smoothly even if your TV can only display 30Hz visuals.

Lightknight said:
It's pretty easy to tell things apart when they're side by side or if you've just looked at an identical example next to another. It's another thing to tell the difference when you've only got your one example and no fps counter to tell you otherwise.

However, while there was a difference between 30 and 60 (60 seemed almost like it was turning slower), that can't have been a good example because the difference was so minute and not inherently "better". There's got to be a better example than that game and spinning.

Studies have shown that overall we do tend to enjoy games with higher FPS. So it is better to have one over the other. It's just that the difference between 30 and 60 is nothing compared to the difference between 15 and 30.
The claim was that it's not possible to tell the difference in a game, which that test disproves. There are more interesting tests, like ones that do blind testing. In fact, the overwhelming majority not only sees but prefers 120 Hz [http://techreport.com/news/25051/blind-test-suggests-gamers-overwhelmingly-prefer-120hz-refresh-rates]. Here's the key part:
The results were pretty conclusive: 86% preferred the 120Hz setup. Impressively, 88% of the subjects were able to correctly identify whether the monitor was refreshing at 60 or 120Hz.
And the difference between 30 and 60 is far more jarring than between 60 and 120.

The reason you may think the 60 was turning slower is because it was turning smoother: there was less distance between frame 1 and frame 2 because there were twice as many frames. I think you said you game on PC. I'm sure you have plenty of games where you can easily test the frame limits, and you can pick any game you want without having to blame me for picking a "game where it doesn't matter".

There was never a discussion here about 15 vs 30. Of course that difference is larger. The point is that the difference between 30 and 60 is still large.

spartan231490 said:
Yeah, hold on to that dream. There are no significant downsides to playing at 30fps. Sure, if you get too much lower than that you'll run into dropped frames and all kinds of issues, but a stable 30 isn't really going to be an issue. But you go ahead and keep pretending that it's a major issue, I no longer care.
If there are no downsides, why does the overwhelming majority prefer a higher framerate? Why is there a significant, measurable upside to a higher framerate? Why do some people get nauseous at 30 fps? Oh, turns out that you just keep repeating falsehoods again.
 

medv4380

The Crazy One
Feb 26, 2010
672
4
23
A Fork said:
medv4380 said:
I'm confused whether high fps is good or not. If we are getting more mud at 30 fps, wouldn't it be better to keep it above 60 as often as possible? I thought the benefit of having high fps above 60 is that the frame on the monitor on every refresh is less delayed. So if I can run at 200 fps but cap it at 60, this is better than capping it at 30, or should I not cap it at all? Or are you saying that increasing the settings causes the mud?
Sorry for the delay in responding. Got a serious bug that knocked me down for a bit.

Understanding the cause of the observed improvements addresses your confusion: the improvement comes from reducing the conflict between the two threads.

The first solution is the best solution, and that is to design the sample thread to always complete before the graphics thread is done. This, when done right, reduces the conflict to only edge cases. Then you can either remove the possibility of the edge case (a player pulling 100 zombies) or just live with the edge case as a funny thing that crashes the game. Accounting for all edge cases requires a lot of QA that just isn't practical in complex games.

The first solution is what consoles do best. Because the hardware is static, it's possible to figure out an optimal setting so the sampling thread stays under the graphics thread 95% of the time. However, when you have a variable piece of hardware, like a hard drive on the PS3, you can still miss something when you test with much better hardware than might actually be out in the wild.

PCs have a hard time with this solution because the hardware is highly variable. Do you optimize for the 2.3 GHz Intel, which would satisfy most PC gamers, or do you target 3 GHz to satisfy the upper third of PC gamers? Do you code for 2 cores, which would satisfy nearly 100% of the market, or 4 cores, which is the upper half of the market? And that's just the sampling thread and the CPU. This ensures that if you cut too cleanly you're going to have the upper half of the PC market chanting that consoles are holding them back, because it couldn't possibly be those 400 dollar PC builds they encouraged their friends to make.

Cut to satisfy 100% of the market and the game will look worse than it does on console. Cutting also creates another issue: adding more power won't make it look better.

So the second solution becomes to just throw more hardware at it until it works, which is where the PC thrives.

The 3rd solution is the real solution, but isn't practical. And that is to implement Real-Time Computing where processes have a guaranteed time of execution. RTC systems are overly expensive, and are more for the safety systems in your car. This is also why this issue exists because it's expensive, impractical, and a huge performance hit to have hardware, and software work in 100% consistent predictable way.