Every new console generation, graphics take a jump ahead of PCs. The hardware is similar, but console developers (especially during a platform's second cycle of titles) can optimise for one fixed feature set. Then PCs catch up, and about halfway through the generation the consoles are left in the dust.
Put it this way: Red Dead Redemption could have been released in 2005. Had it been programmed then, it would've worked as a launch title, and it would've looked better than anything else out there at the time. Then, in 2007, we got Crysis - still the best-looking game on the market, though the mass market is drawing level with it: only around 2008-09 did it become practical to build a PC that could actually run it.
However: launch titles are generally not as good as what comes later. Compare Call of Duty 2 to Call of Duty 4 - same platform, HUGE difference in graphics. As people work with a system, they learn how to make better-looking games. It's that simple.
So, in summary, consoles and PCs play leapfrog in what they're capable of, even if the games actually released don't always show it.
And as an afterthought, graphics are plateauing. Age of Empires 3 looks just about as good as StarCraft 2, despite the two being released FIVE YEARS apart. (RTSs plateaued in 2005 on PC.) And there's no real reason to make graphics better than Crysis', is there? (FPSs plateaued in 2007 on PC.)
The graphics arms race isn't over, but it's in the final few battles. I predict maybe one more console generation, in 2012 or so, unless someone finds a new dimension of graphical fidelity to work on. Crucially, screenshots from that generation won't wow gamers much; they'll look almost the same as current-gen screenshots, just with the rough edges removed. Then we'll all settle down and make games based on variety, not on looking better. And I will be happy.