CrystalShadow said:
I cannot remember a single PC game where the difference between minimum settings and maximum really ever amounted to much more than 20 fps, maybe 30 if you're lucky...

When was the last time you played a AAA title? I cannot remember the last game where that wasn't the case, outside of very badly done console ports that have basically no meaningful settings. Alternatively, it's possible you just have a very weak PC, so the difference is small because it can't run the game well either way.
CrystalShadow said:
People say they prefer higher framerates, but if they saw the consequences for what games end up looking like as a result, would they really think it worth it in the end?

Almost every PC gamer I know already makes this choice, so yes, we would. But really, there is no reason we cannot have both in 2015; the tech is certainly there.
MonsterCrit said:
Nope, just your bias towards bigger numbers.

Yes, I admit I am biased towards the better gaming experience.
MonsterCrit said:
Only to those who deem it as such. Again, a comic designed around being black and white will look better than one that was just a desaturated colour comic. You seem to be unaware that many so-called 60fps games aren't truly 60; they're 30fps games with doubled frames, i.e. they aren't showing more incremental movement frames, they're just holding each frame on the screen twice as long.

Vegetables are objectively healthier than McDonald's burgers even if someone prefers to eat at McDonald's. Someone's bad choices do not stop one thing from being better than another.
Also, lol, that's absolute nonsense. 60 fps games draw 60 distinct frames. In some games certain animations are locked to 30 fps, and pretty much every time this happens it's jarring and people complain en masse about it. What you are describing is the interlaced display method, which went out of fashion in the 90s; I'm not aware of a single game this generation that uses it (there were a few last generation, but only on consoles).
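To make the distinction concrete, here's a minimal sketch of the two loops being argued about. It's purely illustrative: update() and present() are hypothetical stand-ins for an engine's simulation and display steps, not any real game's code.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical stand-ins for an engine's simulation and display steps.
static int worldState = 0;
void update()  { ++worldState; }                                  // advance the simulation
void present() { std::printf("showing state %d\n", worldState); } // put a frame on screen

// True 60 fps: a new simulation state behind every displayed frame.
void true60(int refreshes) {
    for (int i = 0; i < refreshes; ++i) {
        update();
        present();
        std::this_thread::sleep_for(std::chrono::microseconds(16667)); // ~16.7 ms
    }
}

// Frame doubling: the screen still refreshes 60 times a second, but the
// simulation only advances 30 times, so every image is shown twice.
void doubled30(int refreshes) {
    for (int i = 0; i < refreshes; ++i) {
        if (i % 2 == 0) update();  // new state only every other refresh
        present();                 // otherwise the same image again
        std::this_thread::sleep_for(std::chrono::microseconds(16667));
    }
}

int main() {
    true60(6);    // prints six distinct states
    doubled30(6); // prints three distinct states, each twice
}
```

The output shows the point directly: a doubled 30fps game presents half as many distinct movement states per second, which is exactly why it doesn't feel like real 60.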
MonsterCrit said:
The joke is that without the fps counter on the screen, most of the fps elite couldn't tell the fps of their games.

Tell the exact framerate? Perhaps not. Tell a clear difference between 30 and 60? Existing tests already prove people can.
Here's a quick test for you, in a video game: http://www.30vs60fps.com/
MonsterCrit said:
Not in the way you think, though. And remember what I said: the ROI on that decreases. The difference in effect between 100 and 200 would basically be appreciable by maybe half a percent of the human population, and even for them the difference is slight. In fact the joke is, sometimes having more frames makes the movement look more jerky. See, since your eyes are discarding frames, there's no real way of telling which frames your eyes are dropping, so if it's dropping the wrong one... every so often you can get what are perceived as weird hitches. The truth is 100fps and above have been rather extensively studied... in the area of film production.

ROI does decrease, but that drop-off sits well above the 60 fps line. On the 90-100 estimate I think you may be correct, but the discussion here is 30 vs 60, where the difference is very pronounced.
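The diminishing returns fall straight out of the frame-time arithmetic, which is worth spelling out (plain numbers, no engine specifics assumed):

```cpp
#include <cstdio>

// Frame time = 1000 ms / fps. The absolute gain per frame shrinks fast,
// which is why 30 -> 60 is dramatic while 100 -> 200 is barely anything.
int main() {
    const double steps[] = {30.0, 60.0, 100.0, 200.0};
    for (int i = 0; i + 1 < 4; ++i) {
        double slower = 1000.0 / steps[i];
        double faster = 1000.0 / steps[i + 1];
        std::printf("%3.0f -> %3.0f fps: %5.1f ms -> %5.1f ms (gain %.1f ms/frame)\n",
                    steps[i], steps[i + 1], slower, faster, slower - faster);
    }
}
// 30 -> 60 gains ~16.7 ms per frame; 100 -> 200 gains only 5 ms.
```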
By the way, eyes do not "drop" frames. Humans don't see in frames; they see continuous, fluid motion. It's about how quickly your brain can process what you see. More frames never make motion look more jerky unless there are bugs where the game misbehaves at high framerates (like physics being tied to framerate in Skyrim).
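For reference, that Skyrim-style bug comes from advancing physics by whatever the last frame's duration happened to be. The standard cure is the well-known fixed-timestep accumulator pattern, roughly like this generic sketch (stepPhysics() and render() are placeholder names, not Bethesda's code):

```cpp
#include <chrono>

constexpr double kPhysicsStep = 1.0 / 60.0; // seconds per physics tick

// Placeholder hooks; a real engine would do actual work here.
void stepPhysics(double dt) { /* advance the simulation by exactly dt */ }
void render()               { /* draw the most recent state */ }

int main() {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    double accumulator = 0.0;

    // Bounded here so the demo terminates; a real loop runs until quit.
    for (int frame = 0; frame < 600; ++frame) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Consume whole physics steps; leftover time carries over, so the
        // simulation always advances in constant 1/60 s increments no
        // matter how fast or slow the game renders.
        while (accumulator >= kPhysicsStep) {
            stepPhysics(kPhysicsStep);
            accumulator -= kPhysicsStep;
        }
        render(); // render as often as the hardware allows
    }
}
```

With physics decoupled like this, rendering at 144 fps cannot speed up or destabilise the simulation, which is exactly the failure mode Skyrim exhibited.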
MonsterCrit said:
In fact I'd almost be willing to bet you couldn't tell the difference between 90fps and 120fps. I'm willing to bet that you and most others placed under double or reverse blind conditions wouldn't know the difference.

Well, we know from military tests that people can tell the difference. Also, stop shifting the goalposts: the thread is about 30 fps vs 60 fps, not 90 fps vs 120 fps.
spartan231490 said:
Unsurprisingly, you missed the point. Yes, it's "objectively" better, but it doesn't matter because you can't subjectively tell a substantial difference. Sure, if you're one of the .0001% of gamers who play reaction-based games at a professional level, that 60 fps is gonna matter. For the rest of us, it just doesn't. If you got out of your own way and let go of your confirmation bias, you wouldn't have any problems whatsoever playing at the same level with the same experience at 30fps.

Yes, I can, and so can you, and so can everyone with correctly functioning vision. If anything, it's the 0.0001% with vision problems who cannot tell the difference. I've played games at framerates from as low as 12 to as high as 144; I know the differences, and there are plenty.