Er, it doesn't take that much to bring the mightiest GPUs of this generation to their proverbial knees. Extremely high resolutions and new anti-aliasing methods, as well as lighting/shadowing techniques like ambient occlusion, are more than capable of dragging a single Titan's frame rate down to around 50 per second. Here's a benchmark for Crysis 3: Ukomba said: Not that games will scale to match the potential of the higher graphics cards. Game makers tend to produce what the consoles can run, so your better graphics cards will usually end up being overkill. And unless there's a breakthrough in production methods, game companies can't afford to take advantage of Titan-level graphical capability.
Arguably from the first to the sixth generation of consoles; it was painfully clear in some generations, like the first and the fifth. So... most of the time? DarkhoIlow said: Never again? Care to tell me exactly when consoles surpassed PC in terms of graphical fidelity? Yeah.. exactly.
We just have to keep repeating ourselves in the hopes that the lower breeds will eventually join us in PC enlightenment. Ed130 said: I think the phrase 'no shit' springs to mind here.
So PC gamers are the equivalent of hot-rod enthusiasts, racing aficionados and the Top Gear team? Andy Chalk said: Ed130 said: Do you really think the Titan will depreciate that much in 3 years, Mr Chalk? No idea, "a third" just sounded good. But three years is quite a bit of time for a GPU to depreciate, especially one at the very top end of the price range.
As for the "no shit" part of the story, sure, it goes without saying that a bleeding-edge PC will smoke any console on the market, but I find it interesting because it reminds me a bit of auto racing: The innovation takes place at the high end, but it's the mass-market, consumer-level stuff that ultimately enjoys the benefit.
While this is just Nvidia getting ahead of a bunch of "optimized for AMD" multiplatform games that will be coming over the next several years, the way integrated graphics are advancing, in three years they will likely be on par with the hardware of the Xbone and PS4. Check out the benchmarks and specs on AMD's upcoming Kaveri APU; even that is beginning to approach what's in the next-gen consoles. Andy Chalk said: Ed130 said: Do you really think the Titan will depreciate that much in 3 years, Mr Chalk? No idea, "a third" just sounded good. But three years is quite a bit of time for a GPU to depreciate, especially one at the very top end of the price range.
Still Life said: Hazy992 said: EDIT: Better put in a disclaimer that you don't need a Titan or upgrade yearly to play PC games. It's an extreme example.
I don't really see a Titan as a gaming GPU, really - It's overkill to buy that *just* for gaming, unless you're a hardware enthusiast/overclocker. I feel like the mid-range cards are usually more than adequate for most customers.
It however makes for one hell of a CGI card! Thoralata said: Hey NVidia! You might want to remember that the kind of tech you produce, nobody actually needs. Mid-range cards will run absolutely everything just fine. Your audience for Titan is a tiny number of hardcore technology freaks who are dumb enough to spend $400 on a graphics card.
Still Life said:I don't really see a Titan as a gaming GPU, really - It's overkill to buy that *just* for gaming, unless you're a hardware enthusiast/overclocker. I feel like the mid-range cards are usually more than adequate for most customers.
But... but I could run Mount and Blade with max unit limits disabled... battles with 3000 fully animated soldiers all moving and fighting at the same time... the sheer scale of it... hundreds of arrows in the air all the time, spear walls that function like actual spear walls. The ability to see 1000 mounted Khergits all lined up coming at me... Thoralata said: Hey NVidia! You might want to remember that the kind of tech you produce, nobody actually needs.
Right, because a 4K resolution is really necessary right now ¬_¬ Raiyan 1.0 said: Hazy992 said: EDIT: Better put in a disclaimer that you don't need a Titan or upgrade yearly to play PC games. It's an extreme example. Actually, benchmark tests of CoH 2 on a 4K monitor have trouble keeping a steady 60FPS at full settings with 2 Titans in SLI. Still Life said: I don't really see a Titan as a gaming GPU, really - It's overkill to buy that *just* for gaming, unless you're a hardware enthusiast/overclocker. I feel like the mid-range cards are usually more than adequate for most customers.
Uff, word. I just about balked at the thought of 1000W. Johnson McGee said: My 550W PC already turns my room into a pressure cooker in the summer, so I don't find the prospect of a 1000W build very appealing. It would make a nice winter space heater though.
Considering consoles and PCs are both going to be using x86 now, I would say the optimization gap will be far lower. Adam Jensen said: We know. And we also know that PC games will never be as optimized as console games and they will always require more raw power to run games at console settings. And that's mostly because of bloated operating systems that are not designed specifically for gaming. Windows still has a shitty bloated kernel and that won't change as long as Microsoft has practically a monopoly on desktop operating systems.
And then they wonder why their electricity bills are so high..... mattaui said: Huh, I've had a 1000W PSU in my box for the last three or four years because it was just a bit more than the 750 I was looking at, and I wanted to make sure I didn't have to worry about a new PSU for a while.
And you don't have to. If you buy medium-to-high-end hardware now, it will easily last 4-5 years without anything changed. PoolCleaningRobot said: Realistically, no matter how much I may want to upgrade computer hardware every year, I can't afford it.
Keyword: WAS. RikuoAmero said: "Microsoft simply can't afford to spend that kind of money"
Did someone actually say that? Say that MICROSOFT can't afford to spend money? The company whose founder was the richest man on Earth for several years running?
But Crysis was pretty damn optimized. I bought it and played it back in 2008 on a laptop on high settings. No other game[footnote]talking about AAA only here[/footnote] released from that date forward could I play on high on that machine without framerate problems. If anything, I found it to not be very demanding. Ulquiorra4sama said: Crysis springs to mind due to the fact no one actually bought that game, but rather downloaded it and used it to test their new drive cores.
But CoH 2 on full settings with a 4K monitor is not something 99.999% of gamers are ever going to even attempt. And even then, the FPS can spike down to 30 and still keep the game playable. Raiyan 1.0 said: Actually, benchmark tests of CoH 2 on a 4K monitor have trouble keeping a steady 60FPS at full settings with 2 Titans in SLI.
And you have to remember that the mid-range cards that now run absolutely everything were the tech for hardcore technology freaks 3-4 years ago, but they have since become standard. And so the Titan will become standard one day, and we will have a "colossus" card that we consider to be for freaks. Thoralata said: Hey NVidia! You might want to remember that the kind of tech you produce, nobody actually needs. Mid-range cards will run absolutely everything just fine. Your audience for Titan is a tiny number of hardcore technology freaks who are dumb enough to spend $400 on a graphics card.
Your link doesn't work, but I tracked the image down and it's in here: http://www.pcgameshardware.de/Grafikkarten-Grafikkarte-97980/Tests/Test-Geforce-GTX-Titan-1056659/5/ romxxii said: Here's a benchmark for Crysis 3:
And here I was hoping it was in real time. Saulkar said: This animation took 3 hours to render on a Titan
T'was my point, bro :/ Strazdas said: And you don't have to. If you buy medium-to-high-end hardware now, it will easily last 4-5 years without anything changed. PoolCleaningRobot said: Realistically, no matter how much I may want to upgrade computer hardware every year, I can't afford it.
I always found it funny how people called the 240 shit, when I happen to get by with an 8600 OK. Well, I do need a new one, but it's over 5 years old now and the new one is in the plans anyway. Thing is, even in this poor state, I still outperform consoles, and I plan to use my next one for 5 more years. With a PC you can upgrade at any time and still be "at the top". With consoles, let's say you bought an Xbox this spring; it would become obsolete in less than a year.... PoolCleaningRobot said: T'was my point, bro :/ Strazdas said: And you don't have to. If you buy medium-to-high-end hardware now, it will easily last 4-5 years without anything changed. PoolCleaningRobot said: Realistically, no matter how much I may want to upgrade computer hardware every year, I can't afford it.
Nvidia is basically saying, "why buy a console? On a PC, you get to upgrade every time we make a new super-duper GPU." Who is this supposed to appeal to? Like another in this thread, I have an Nvidia 240 and while it is shite, it's shite that can easily run any game in high def, even if I have to do it on the lowest settings (though usually I don't). Most PC gamers don't upgrade like that, so it's not really a selling point to say "you can buy more shit when your shit gets old".
Hahah, I wish the few hours I had my gaming PC on made that big a difference in my electricity bill, but alas it does not. Strazdas said: A 1000W PC? Why would you need a 1000W PSU anyway? Pretty much every part has been lowering its power needs in the last couple of years. 600W is more than enough unless you're building a monster, and if you are, gaming is the reason.
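For what it's worth, the electricity-bill worry above is easy to sanity-check with back-of-the-envelope math. The sketch below uses purely illustrative assumptions (an average draw well below the PSU's rated wattage, 4 hours of gaming a day, and a $0.12/kWh rate; your numbers will differ):

```python
# Rough sketch: what a high-wattage gaming rig adds to a monthly electricity
# bill. All figures are illustrative assumptions, not measurements.

def monthly_cost(draw_watts, hours_per_day, price_per_kwh, days=30):
    """Estimated monthly cost of running a PC at a given average power draw."""
    kwh = draw_watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

# A PSU rarely runs at its rated load; assume ~600W average while gaming.
print(monthly_cost(600, 4, 0.12))   # roughly $8.64/month at 4 hours/day
print(monthly_cost(1000, 4, 0.12))  # worst case, full 1000W draw: ~$14.40
```

So even a worst-case 1000W draw for a few hours a day adds on the order of ten dollars a month, which fits the "alas it does not" observation.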