Idea: What if console users were given a choice in framerate?


Yopaz

Sarcastic overlord
Jun 3, 2009
6,092
0
0
BarryMcCociner said:
It's like encountering a creationist who says "What?! You actually believe we came from monkeys?"
Encountering someone with a fundamental misunderstanding of evolution who insists we came from monkeys is equally bad.

OT: Given the choice I would prefer higher fps. I notice terrible framerates in console games quite often, and it actually affects the gameplay. Gameplay is more important than graphics, so I would love to get 60 fps and maybe tone down textures and resolution slightly. Much can be done by reducing shading and water effects, since that shit is demanding.
 

Windcaler

New member
Nov 7, 2010
1,332
0
0
Keeping in mind that I'm primarily a PC gamer (I had both an Xbox 360 and a PS3 last generation, and this generation I have my PC, 3DS, and PS4 so far), I have a different point of view. From my perspective, playing a game at 30 fps or lower makes it nearly unplayable. The PS3 version of The Last of Us was like that for me. The game felt sluggish and unresponsive, which kills my desire to play (and that was a real shame, because The Last of Us was great). It wasn't until the PS4 version that I could actually play the game, because that one ran at 60 fps. Yeah, it affected my gameplay experience.

So from my perspective, yes. Yes, I do want an option to lower graphical fidelity and increase the framerate to a solid 60+. I know other people don't care, and that's fine, but I care. I care because I dropped $60 on a game that I couldn't stand to play (actually more than that, since other games routinely run at 30 fps). I would have been better off spending that $60 on something useful, like food.
 

spartan231490

New member
Jan 14, 2010
5,186
0
0
inu-kun said:
So, "I'd like shittier graphics to get a frame rate that I won't see the difference without a side by side comparison and doesn't matter anyways since I'm over a meter away from the TV"?

Care to explain the logic in it?
There is none, but that's the case for the whole 60 framerate argument. If you want that, great, but it's not really that superior, and there's nothing wrong with being happy at 30 so stop talking about it.
 

Poetic Nova

Pulvis Et Umbra Sumus
Jan 24, 2012
1,974
0
0
60 FPS is great to have, but not a necessity. As long as it is steady, 30 FPS is adequate. Except for fighting games, of course; those are crippled when not at a steady 60 FPS.
 

bluegate

Elite Member
Legacy
Dec 28, 2010
2,424
1,033
118
Never bothered about frame rates since the NES days and I'm not planning on starting now.

As for implementing such a system on consoles, I personally don't even want to start imagining the ramifications for developers tasked with making a game run at both 30 fps and 60 fps on the same system.
 

Skops

New member
Mar 9, 2010
820
0
0
I'm for stable fps, or perhaps a slightly better standard of 40-45 fps, giving a buffer zone so games can drop toward 30 fps during those heavy segments. My biggest complaint with the 30 fps model is that most games have several points where the fps drops down to 20-22, and it becomes very obvious when it does and looks awful.

Some games don't even reach 30 fps despite that being the target. Borderlands 2 on the PS3, for example, RARELY ever gets to 30 fps and more often than not sits around 22-26, and even lower during parts where a shit ton of elements are going off. It was bad enough that I've forever sworn off Borderlands on console.
 

medv4380

The Crazy One
Feb 26, 2010
672
4
23
BarryMcCociner said:
only the most stubborn would still claim there's no difference and that 30 frames isn't anything but mediocre.
There is no meaningful difference, and the last few decades of the PC vs. console argument have proven it time and time again.

The honest truth is that the matter has always been an issue with something other than drawn frame rate. Constant time calculations are what happen to be important. This can be proven by simply having a game render at 100 fps on a 60 hertz monitor. It's impossible for the monitor to display more than 60 fps, but the people watching will still claim the 100 fps is smoother, and that's because the time calculations are more consistent, not because they have a sixth sense for frames that are impossible to display.

Consoles, on the other hand, benefit from a known hardware configuration, which means it's easier to optimize the process so that it works consistently without going the PC route of throwing more power than is needed at the problem. This also makes your request impractical: since the game should be designed for the specific hardware, tinkering with the settings would make it much worse. The only reason you get the option on PC is that no single configuration would satisfy all existing PC setups.
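A minimal sketch of the claim above, with made-up jitter numbers rather than anything from the post: simulate a 60 Hz display and compare how stale the newest finished frame is at each refresh when a game renders at roughly 60 fps with small hiccups versus uncapped at roughly 100 fps.

```python
# Illustration only: how old is the newest finished frame at each 60 Hz
# refresh? Frame times and jitter here are invented for the sketch.
import random

REFRESH_HZ = 60
DURATION_S = 5.0

def frame_times(fps, jitter):
    """Completion timestamps for frames rendered at roughly `fps`."""
    t, times = 0.0, []
    while t < DURATION_S:
        t += (1.0 / fps) * random.uniform(1 - jitter, 1 + jitter)
        times.append(t)
    return times

def staleness_at_refresh(times):
    """Age (in ms) of the newest completed frame at each vblank."""
    ages, i = [], 0
    t = 1.0 / REFRESH_HZ
    while t < DURATION_S:
        while i + 1 < len(times) and times[i + 1] <= t:
            i += 1
        if times[i] <= t:
            ages.append((t - times[i]) * 1000)
        t += 1.0 / REFRESH_HZ
    return ages

random.seed(0)
for fps in (60, 100):
    ages = staleness_at_refresh(frame_times(fps, jitter=0.2))
    print(f"{fps:>3} fps: mean staleness {sum(ages)/len(ages):5.1f} ms, "
          f"worst {max(ages):5.1f} ms")
```

With the seeded values it tends to report lower and more even staleness for the uncapped case, which is one way to read the "more consistent time calculations" point; it is an illustration, not a measurement of any real engine.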
 

CaitSeith

Formely Gone Gonzo
Legacy
Jun 30, 2014
5,374
381
88
BarryMcCociner said:
And suddenly people realize, your fancy coat of paint isn't as important as your horsepower.
Almost every game industry study says that most people prefer the coat of paint.
 

BeerTent

Resident Furry Pimp
May 8, 2011
1,167
0
0
PC master race dude here. I'm willing to sacrifice visuals for fps. When I was younger, and even today in a very select few games, I will play on low settings to get the framerate as high as possible. But now, it's pretty graphics as long as I can get 40 fps or more.

40 is functional to me. 30 is plenty fine.

What's not plenty fine is going around and saying, "You don't care about this thing that clearly doesn't fucking matter unless it's below 20? Get out, scrub!" Of all the dumb shit I've seen today, some responses in here rank right up there. Most people want to pop a disc in, double-click the exe, and have fun. Not all of us are competing in our singleplayer, or even multiplayer, games where the push for a higher framerate actually matters.

Fellow master race... You... *Sigh*
 

loa

New member
Jan 28, 2012
1,716
0
0
Not gonna happen.

Some twit decreed that console games shalt not have any worthwhile customization whatsoever and that's what we're running with now.
Can't even get mandatory multiple language options.
You bought some game with a crappy dub in your language on a console and want to change it? Fuck you.
Meanwhile, on Steam you can just select and download all available language files for any game.

Changing graphics settings is several orders of magnitude beyond that.
I wish it were a thing; technically it's absolutely doable, but like I said, not gonna happen.
Just look at that $150 Microsoft controller that sells itself on rebindable fucking buttons, something that should be a basic feature. Yeah...
 
A Fork

Nov 9, 2015
330
87
33
medv4380 said:
BarryMcCociner said:
only the most stubborn would still claim there's no difference and that 30 frames isn't anything but mediocre.
There is no meaningful difference, and the last few decades of the PC vs. console argument have proven it time and time again.

The honest truth is that the matter has always been an issue with something other than drawn frame rate. Constant time calculations are what happen to be important. This can be proven by simply having a game render at 100 fps on a 60 hertz monitor. It's impossible for the monitor to display more than 60 fps, but the people watching will still claim the 100 fps is smoother, and that's because the time calculations are more consistent, not because they have a sixth sense for frames that are impossible to display.

Consoles, on the other hand, benefit from a known hardware configuration, which means it's easier to optimize the process so that it works consistently without going the PC route of throwing more power than is needed at the problem. This also makes your request impractical: since the game should be designed for the specific hardware, tinkering with the settings would make it much worse. The only reason you get the option on PC is that no single configuration would satisfy all existing PC setups.
Something about this sounds incorrect, or not specific enough.

I don't know anything about computer architecture, but when you say constant time calculation, are you talking about minimizing frame latency? Are you saying that consoles are better at reducing frame latency? Basically like v-sync right?

I think frame latency and frame rate are different things. If you are playing a game at 5 fps, even if the latency is exactly 0, it will still look like a slideshow and it will be extremely disorientating.

Even if latency was your primary concern, the more frames that can be rendered means less latency as you described. If you lower the settings, the console can render more frames, and you would get lower latency.
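A back-of-the-envelope way to see that link, with nothing assumed beyond the arithmetic: each frame's duration puts a floor on how out of date the image being reacted to can be.

```python
# Rough arithmetic only: one frame's duration is a lower bound on how stale
# the displayed image can be, ignoring display and input-processing lag.
for fps in (20, 30, 60, 120, 200):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
```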
 

Lightspeaker

New member
Dec 31, 2011
934
0
0
BeerTent said:
What's not plenty fine is going around and saying, "You don't care about this thing that clearly doesn't fucking matter unless it's below 20? Get out, scrub!" Of all the dumb shit I've seen today, some responses in here rank right up there. Most people want to pop a disc in, double-click the exe, and have fun. Not all of us are competing in our singleplayer, or even multiplayer, games where the push for a higher framerate actually matters.

Fellow master race... You... *Sigh*
You claim that FPS doesn't matter 'unless it's below 20' and then outright state the exact circumstances in which it DOES matter, directly contradicting your point.

Then act all exasperated with people who are stating that it does, in fact, matter.

............really?


And will people seriously stop saying "oh, it's just about making the game prettier". No, it's not. Textures just make the game prettier. Shaders just make the game prettier. Anti-aliasing just makes the game prettier. Framerate directly affects your game's performance. It DOES matter. Factually it matters, and it impacts your gameplay experience. Whether you CARE that it matters is another thing entirely. But just because you don't care doesn't mean it doesn't matter. Lots of people don't care about microtransactions in full-price games and exploitative DLC practices, but that doesn't mean they don't matter or that they aren't issues worth addressing.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
MonsterCrit said:
Careful. Your bias is showing. That's like saying a comic done in black and white is shit or crap because it's not colour. Again, there are many games that are designed around 30 fps, though you probably don't notice it because, again... if it's designed around it, everything is tuned to deliver the optimal aesthetic experience at that frame rate.


Yup, the epeens keep getting longer. The joke is, any FPS above 90 is pretty much irrelevant; the ROI above 60 fps drops at a rate of n^2. Anything above 100 is just a number for all the difference it makes to the experience. And anyone who'd say otherwise is likely the sort of person who swears their car goes faster after they wax it.
My bias for a better gaming experience is showing. How terrible. A comic done in black and white can be an artistic choice of visual style; this is not true with framerate. Framerate is an objective, measurable gameplay quality. Designing your game around a low framerate is designing your game around bad gameplay. The ONLY way to tune things so that framerate does not matter is to make the game a static image. Since games are not static images, this "tuning" is a nonsense excuse for being a shit developer.

Considering that some people still cannot get out of 30 fps hell, I think we still have some time until the benefits above 100 become a real consideration. That being said, we do know for a fact that humans can tell the difference up to 205 fps, and some animals, like dogs, easily see in the hundreds. In fact, 144Hz TVs are loved by dogs because the 60Hz ones hurt their eyes due to the low framerate, and the 144Hz ones are better suited to their eyes. So if you have a dog, go for 144Hz!

spartan231490 said:
There is none, but that's the case for the whole 60 framerate argument. If you want that, great, but it's not really that superior, and there's nothing wrong with being happy at 30 so stop talking about it.
60 fps is objectively superior to 30 fps.

medv4380 said:
There is no meaningful difference, and the last few decades of the PC vs. console argument have proven it time and time again.
There is a massive difference that is objectively measurable, and this is not an argument; it's just some people stubbornly trying to defend their poor purchasing choices, and HailCorporate showing its shilling.

A Fork said:
I don't know anything about computer architecture, but when you say constant time calculation, are you talking about minimizing frame latency? Are you saying that consoles are better at reducing frame latency? Basically like v-sync right?
V-sync increases consistency at the price of latency. V-sync increases latency due to buffering, usually by either 2 or 3 frames (depending on which v-sync implementation you use). It's designed to smooth out the "bumps" in the framerate by caching a few frames ahead, so a drop that exists for less time than the buffer will not be visible. As a result, it also prevents screen tearing because it syncs with the monitor, though some game engines still have screen tearing, sadly.
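A rough sketch of that buffering cost, with assumed numbers rather than the behaviour of any particular engine: every finished frame queued ahead of the newest one adds one refresh interval of delay on a fixed 60 Hz display.

```python
# Ballpark added latency from v-sync buffering on a fixed-refresh display.
# "queued" = finished frames waiting ahead of the newest one; real engines
# and driver settings vary, so treat these numbers as an assumption.
REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ
for queued in (1, 2, 3):  # e.g. double / triple buffering
    print(f"{queued} queued frame(s): ~{queued * frame_ms:.1f} ms added latency")
```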
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
Thing is, I highly doubt this is feasible.

I cannot remember a single PC game where the difference between minimum and maximum settings ever really amounts to much more than 20 fps, maybe 30 if you're lucky...

And that's including things where 'minimum' looks like a game that's about 20 years older than 'maximum' with the amount of effects and stuff you lose as a result...

You don't realise how much you might have to lose to make that plausible...

That's assuming nothing has been tied directly to an assumed framerate (which, on a console, is not a safe assumption).


Meh. I don't care. I've been known to choose 20 fps over 40 after looking at what the difference means in terms of graphical quality.

People say they prefer higher framerates, but if they saw the consequences for what games end up looking like as a result, would they really think it worth it in the end?
 

MonsterCrit

New member
Feb 17, 2015
594
0
0
Strazdas said:
MonsterCrit said:
Careful. Your bias is showing. That's like saying a comic done in black and white is shit or crap because it's not colour. Again, there are many games that are designed around 30 fps, though you probably don't notice it because, again... if it's designed around it, everything is tuned to deliver the optimal aesthetic experience at that frame rate.


Yup, the epeens keep getting longer. The joke is, any FPS above 90 is pretty much irrelevant; the ROI above 60 fps drops at a rate of n^2. Anything above 100 is just a number for all the difference it makes to the experience. And anyone who'd say otherwise is likely the sort of person who swears their car goes faster after they wax it.
My bias for a better gaming experience is showing. How terrible.
Nope, just your bias towards bigger numbers.
A comic done in black and white can be an artistic choice of visual style; this is not true with framerate. Framerate is an objective, measurable gameplay quality.
Only to those who deem it as such. Again, a comic designed around being black and white will look better than one that was just a desaturated colour comic. You seem to be unaware that many so-called 60 fps games aren't truly 60; they're 30 fps games with doubled frames, i.e. they aren't showing more incremental movement frames, they're just holding each frame on the screen twice as long. :p

Designing your game around a low framerate is designing your game around bad gameplay. The ONLY way to tune things so that framerate does not matter is to make the game a static image. Since games are not static images, this "tuning" is a nonsense excuse for being a shit developer.
There's that bias again. I already explained that even the so-called 60 fps games are just 30 fps games that leave each frame on screen for two cycles, and the joke is that without the fps counter on the screen, most of the fps elite couldn't tell the fps of their games.


Considering that some people still cannot get out of 30 fps hell, I think we still have some time until the benefits above 100 become a real consideration. That being said, we do know for a fact that humans can tell the difference up to 205 fps
Not in the way you think, though. And remember what I said: the ROI on that decreases. The difference in effect between 100 and 200 would basically be appreciable by maybe half a percent of the human population, and even for them the difference is slight. In fact the joke is, sometimes having more frames makes the movement look more jerky. See, since your eyes are discarding frames, there's no real way of telling which frames your eyes are dropping, so if they're dropping the wrong ones... every so often you can get what are perceived as weird hitches. The truth is, 100 fps and above have been rather extensively studied... in the area of film production.


and some animals, like dogs, easily see in the hundreds. In fact, 144Hz TVs are loved by dogs because the 60Hz ones hurt their eyes due to the low framerate, and the 144Hz ones are better suited to their eyes. So if you have a dog, go for 144Hz!
Yeah, but make sure it's black and white. Remember, dogs are dichromatic, so it makes little sense giving them a full-colour television, and don't bother with 1080p; the resolution of their vision isn't quite as sharp. Look, the point is, you've already stated your bias quite clearly. You reflexively assume 30 fps is bad because it's 30 fps... like how some people hold the King James Bible to be the direct word of God because it is the King James Bible. Your logical reasoning on this is a closed circuit.

In fact, I'd almost be willing to bet you couldn't tell the difference between 90 fps and 120 fps. I'm willing to bet that you and most others, placed under double-blind or reverse-blind conditions, wouldn't know the difference.

spartan231490 said:
There is none, but that's the case for the whole 60 framerate argument. If you want that, great, but it's not really that superior, and there's nothing wrong with being happy at 30 so stop talking about it.
60 fps is objectively superior to 30 fps.

medv4380 said:
There is no meaningful difference, and the last few decades of the PC vs. console argument have proven it time and time again.
There is a massive difference that is objectively measurable, and this is not an argument; it's just some people stubbornly trying to defend their poor purchasing choices, and HailCorporate showing its shilling.

A Fork said:
I don't know anything about computer architecture, but when you say constant time calculation, are you talking about minimizing frame latency? Are you saying that consoles are better at reducing frame latency? Basically like v-sync right?
V-sync increases consistency at the price of latency. V-sync increases latency due to buffering, usually by either 2 or 3 frames (depending on which v-sync implementation you use). It's designed to smooth out the "bumps" in the framerate by caching a few frames ahead, so a drop that exists for less time than the buffer will not be visible. As a result, it also prevents screen tearing because it syncs with the monitor, though some game engines still have screen tearing, sadly.
 

medv4380

The Crazy One
Feb 26, 2010
672
4
23
A Fork said:
medv4380 said:
BarryMcCociner said:
snip
Something about this sounds incorrect, or not specific enough.

I don't know anything about computer architecture, but when you say constant time calculation, are you talking about minimizing frame latency? Are you saying that consoles are better at reducing frame latency? Basically like v-sync right?

I think frame latency and frame rate are different things. If you are playing a game at 5 fps, even if the latency is exactly 0, it will still look like a slideshow and it will be extremely disorientating.

Even if latency was your primary concern, the more frames that can be rendered means less latency as you described. If you lower the settings, the console can render more frames, and you would get lower latency.
I'm talking about the time-dependent calculations vs. the actual drawing of a frame. You can write them separately, but in the old-school days they were in the same loop.

Time-dependent variables are things like the polling of buttons, collision detection, and much of the basic physics of moving. This can easily be done with only 5 samples per second, but for simplicity's sake it still gets called frames per second. If each sample generates 6 frames of animation, then you get 30 frames per second.

Now, when your 30 frames per second drops and isn't able to generate those 6 frames, you get dropped frames so that the sampling can still be done. In the old method they'd be in the same loop, so the slowdown of one clearly slows down the other. However, even when you thread them separately they still slow each other down, because one thread has to communicate with the other and the shared memory will lock one or the other in order to pass data. A small drop isn't very noticeable, but if you drop from 30 to 20, your sampling will have dropped from 5 to 4, and to compensate, TIME will have to slow down in-game in order to process the movement correctly. Otherwise you end up with very old-school games where frame rate can affect the physics. This is what makes the frame rate feel like you're moving through mud.
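A toy version of the loop being described, with illustrative names and numbers rather than the poster's actual code: the simulation advances on a fixed timestep while rendering runs as fast as it can, and the catch-up cap is exactly the point where in-game time starts to slow down.

```python
# Sketch of a decoupled game loop: fixed-timestep simulation, rendering as
# fast as possible. Names and numbers are illustrative, not from any engine.
import time

SIM_DT = 1.0 / 30          # fixed simulation step (the "sampling" rate)
MAX_STEPS_PER_FRAME = 5    # catch-up cap; past this, in-game time slows down

def update(dt):
    pass                   # poll input, collision detection, physics at fixed dt

def render(alpha):
    pass                   # draw the scene, interpolating by alpha if desired

def run(seconds=2.0):
    accumulator = 0.0
    previous = time.perf_counter()
    end = previous + seconds
    while time.perf_counter() < end:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        steps = 0
        while accumulator >= SIM_DT and steps < MAX_STEPS_PER_FRAME:
            update(SIM_DT)         # simulation always advances by exactly SIM_DT
            accumulator -= SIM_DT
            steps += 1
        if steps == MAX_STEPS_PER_FRAME:
            accumulator = 0.0      # stop catching up: in-game time falls behind

        render(accumulator / SIM_DT)   # fraction of the way into the next step

run()
```

The MAX_STEPS_PER_FRAME cap is the hypothetical knob here: without it the loop spirals when rendering falls far behind, and with it the game lets simulated time lag behind wall-clock time, which is the "moving through mud" feeling described above.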

The other issue is when the sampling thread slows down but the graphics thread can keep up. This doesn't make you feel like you're moving through mud, but rather that things are just jumping through frames. Unfortunately this can also be caused in online games by network latency, which can be thought of as just another thread in the mix to mess things up.

The evidence is all in the original post, where people claimed that they could see the difference at 100 fps. It was actually done as a study, so the respondents didn't know which frame rate they were looking at; they just consistently said 100 fps was better and smoother than the lower frame rates. This was long ago, when the best monitor you could hope for was 60 hertz, so it was a physical impossibility for them to see the difference, and the people doing the study got laughed at because they claimed that the frame rate was visible without knowing it was impossible to see with the hardware. Sure, there is a difference, but it has nothing to do with the frames themselves and everything to do with a background process that can affect the frames.


If you have a 60 hertz monitor you can test it yourself. Get a game that reports it's running much faster than your refresh rate but still doesn't give you screen tearing, and see how smooth it is compared to 60 fps, or even 30 if you can crank up the settings to slow it down. You'll have a sense of moving through mud at 60 fps vs. 90 vs. 120 fps, even though there is no physical way for your monitor to display half the frames.
 

Kingjackl

New member
Nov 18, 2009
1,041
0
0
I recall hearing that The Last of Us Remastered did give you a choice between 60 and 30 fps (the original game only had 30). Most remasters of last-gen games tend to sit at 60, with a few exceptions that are generally regarded as worse for it.
 

spartan231490

New member
Jan 14, 2010
5,186
0
0
Strazdas said:
spartan231490 said:
There is none, but that's the case for the whole 60 framerate argument. If you want that, great, but it's not really that superior, and there's nothing wrong with being happy at 30 so stop talking about it.
60 fps is objectively superior to 30 fps.
Unsurprisingly, you missed the point. Yes, it's "objectively" better, but it doesn't matter because you can't subjectively tell a substantial difference. Sure, if you're one of the 0.0001% of gamers who play reaction-based games at a professional level, that 60 fps is gonna matter. For the rest of us, it just doesn't. If you got out of your own way and let go of your confirmation bias, you wouldn't have any problems whatsoever playing at the same level and having the same experience at 30 fps.
 
A Fork

Nov 9, 2015
330
87
33
medv4380 said:
I'm talking about the time-dependent calculations vs. the actual drawing of a frame. You can write them separately, but in the old-school days they were in the same loop.

Time-dependent variables are things like the polling of buttons, collision detection, and much of the basic physics of moving. This can easily be done with only 5 samples per second, but for simplicity's sake it still gets called frames per second. If each sample generates 6 frames of animation, then you get 30 frames per second.

Now, when your 30 frames per second drops and isn't able to generate those 6 frames, you get dropped frames so that the sampling can still be done. In the old method they'd be in the same loop, so the slowdown of one clearly slows down the other. However, even when you thread them separately they still slow each other down, because one thread has to communicate with the other and the shared memory will lock one or the other in order to pass data. A small drop isn't very noticeable, but if you drop from 30 to 20, your sampling will have dropped from 5 to 4, and to compensate, TIME will have to slow down in-game in order to process the movement correctly. Otherwise you end up with very old-school games where frame rate can affect the physics. This is what makes the frame rate feel like you're moving through mud.

The other issue is when the sampling thread slows down but the graphics thread can keep up. This doesn't make you feel like you're moving through mud, but rather that things are just jumping through frames. Unfortunately this can also be caused in online games by network latency, which can be thought of as just another thread in the mix to mess things up.
Thanks for the detailed explanation. So by mud, you mean that if I throw a grenade at a bunch of physics-enabled objects, it turns into a slideshow and takes forever, or slows down, I guess.

And the other case is I throw a grenade at a barrel, the framerate lowers for whatever reason and the barrel's movement "jumps" or doesn't look as smooth because of delta time.

medv4380 said:
The evidence is all in the original post, where people claimed that they could see the difference at 100 fps. It was actually done as a study, so the respondents didn't know which frame rate they were looking at; they just consistently said 100 fps was better and smoother than the lower frame rates. This was long ago, when the best monitor you could hope for was 60 hertz, so it was a physical impossibility for them to see the difference, and the people doing the study got laughed at because they claimed that the frame rate was visible without knowing it was impossible to see with the hardware. Sure, there is a difference, but it has nothing to do with the frames themselves and everything to do with a background process that can affect the frames.


If you have a 60 hertz monitor you can test it yourself. Get a game that reports it's running much faster than your refresh rate but still doesn't give you screen tearing, and see how smooth it is compared to 60 fps, or even 30 if you can crank up the settings to slow it down. You'll have a sense of moving through mud at 60 fps vs. 90 vs. 120 fps, even though there is no physical way for your monitor to display half the frames.
I'm confused about whether high fps is good or not. If we are getting more mud at 30 fps, wouldn't it be better to keep it above 60 as often as possible? I thought the benefit of having fps above 60 is that the frame on the monitor at every refresh is less delayed. So if I can run at 200 fps but cap it at 60, is this better than capping it at 30, or should I not cap it at all? Or are you saying that increasing the settings causes the mud?