icame said:
People always talk about the superiority of PC hardware in the whole PC vs Console debate. Personally I game on both so I don't really have a side, but I realized something while playing Uncharted the other day. That game is comparable in graphics quality to many of the PC's top contenders for that title (not saying it's better at all, just that it's comparable) despite running on hardware that's at least 5 years old. It got me thinking about whether PC hardware isn't being used to its potential because developers don't have to work within any fixed boundaries, so they never discover exactly what the limit of that GFX card/CPU/whatever really is.
Anyone else have an opinion on this?
I think the short answer is two things. A. PC hardware probably isn't ever used to its full potential, and might never be, but we can only hope we keep making greater hardware and keep finding better ways to utilise it.
B. For the same game, the gap between PC and console should be fairly small. If you have a PC with great graphics hardware, you should be able to turn all the settings down and go 'ah, almost PS3/360-like', then turn them up and go 'ah, very realistic'. Games are either ported or built to work around the consoles' lack of raw power, but remember, these consoles were designed to basically do that one thing as well as they could at the time they were released. So you'd hope they could keep up with PCs for about 2 years on full graphics, 2 more years on average graphics, then 2 more years on acceptable graphics, and here we are.