Ultratwinkie said:
CosmicCommander said:
Better question: who is willing to pay £500 for some tech that only marginally performs better (in real terms) over its apparently "outdated" predecessors?
Nvidia and AMD are getting nervous: graphics on the PC are plateauing, and people are moving over to consoles, depriving them of a big money milker. So they just treat every card as if it's revolutionary, even though no real person can notice or care about the difference.
PS: Who the fuck does SLI or that Crossfire thing, anyway? Why would you need 2 cards, unless your monitor was the size of a shed?
No. People are running AWAY from consoles now. It's getting too costly, and it's holding back technological development. Consoles are like children: if they hold onto a pole and PC gaming can't pry them off, then they won't go anywhere. So the oldest tech now outperforms consoles, and makes the $500 cards USELESS because they never get used to their full potential.
Actually, game publishers use that as an excuse to release shoddy ports of their console games on PC, hoping that the extra horsepower makes up for their slipshod coding and porting.
I built my rig back in late '08, and it plays well-ported games very well. Mass Effect 2, Batman: Arkham Asylum, Far Cry 2, Dragon Age: Origins, Just Cause 2, Assassin's Creed 2, Rainbow Six Vegas 2, Fallout 3 & New Vegas, and Left 4 Dead 2 all run beautifully.
However, it has trouble with stuff from Volition (Red Faction: Guerrilla and Saints Row 2, both of which are notoriously terrible port jobs). GTA IV has trouble, but works. But the worst offender is the Call of Duty franchise. The most recent one, Black Ops, uses the same engine as Modern Warfare. MW runs like a dream; BO runs like absolute shit. Both games look and run beautifully on the 360. But the PC specs for the game have skyrocketed, and the amount of extra power the PC requires does not match the extra visual fidelity. Don't get me wrong, I'm pretty sure BO looked a little better, but it was hard to tell with the game's frame rate hiccuping every 20 seconds or so. Regardless, the increase in required power is substantial; the increase in visuals is not.
Call of Duty: Modern Warfare
System Requirements
Required (minimum) Specs
CPU: Intel(R) Pentium(R) 4 2.4 GHz or AMD(R) Athlon(TM) 64 2800+ processor, or any supported 1.8 GHz dual-core processor or better
RAM: 512MB RAM (768MB for Windows Vista)
Hard Drive: 8GB of free space
Video card: NVIDIA(R) GeForce(TM) 6600 or better, or ATI(R) Radeon(R) 9800 Pro or better
Recommended Specs
CPU: 2.4 GHz dual core or better
RAM: 1 GB for XP; 2 GB for Vista
Hard Drive: 8GB of free space
Video card: Shader 3.0 support recommended. NVIDIA GeForce 7800 or better, or ATI Radeon X1800 or better
OR get an Xbox 360.
Call of Duty: Black Ops
MINIMUM System Requirements:
- OS: Windows XP, Vista, 7
- CPU: Dual-core Intel 3 GHz or AMD 6500+
- RAM: 1 GB
- HDD: 12 GB of free space
- VIDEO: Shader 3.0 or better; 256 MB NVIDIA GeForce 8600GT DirectX 9.0c or better / ATI X1900 or better
- SOUND: DirectX 9.0c-compatible
- DirectX: 9.0c
Recommended system requirements:
- Intel Processor - Quad-core Intel 2.6 GHz
- AMD Processor - Phenom II 955
- Nvidia Graphics Card - DirectX GeForce GTX 260
- ATI Graphics Card - ATI Radeon HD 5800 with 512 MB VRAM
- DirectX 10
- RAM Memory - 2 GB
- Hard Disk Space - 12 GB
- DirectX - 9
OR get an Xbox 360.
It's the same engine running on the same console, but the PC requirements really jump. And that's bullshit. I can understand a slight increase, but this is just terrible. The extra visual fidelity is lost because I have to turn down all of the options to get the game running somewhat decently. I can play MW with everything maxed (and Crysis with everything on High in DX9 mode, so my rig is still no slouch), so MW ends up looking a hell of a lot better. That's crap, and it's why I'm not buying anything else from Activision until they clean up their act, because their current treatment of the PC community can be summed up as 'willful neglect'.