I'm curious how my PC's capabilities compare to those of the current-gen consoles.

Baron Teapot

New member
Jun 13, 2013
42
0
0
RicoADF said:
EXos said:
Not really. Otherwise they would change this on PC.
If GDDR5 were better to run a whole system on, they would have changed over on PC. But GDDR5 is great at handling a big packet of data for a single purpose; multiple different applications using the same chip will build up latency.
I believe he was actually referring to how consoles don't have extra software and other stuff running on them using up processing power, RAM, etc. A console is built with one role in mind: gaming. While they do other things like watch movies, they generally don't do them while gaming, and anything that runs alongside a game has been designed around (e.g. downloading updates/games in the background on the PS4 is handled by a separate chip built specifically for that purpose, which keeps the load off the main system). As a result a game can utilise the whole potential of the hardware in question, whereas games on a PC have to compete with other software (including a lot of redundant stuff running in Windows).

That said, the hardware quoted by the OP will easily survive for quite a while. The only upgrades I can see coming are maybe some more RAM in a few years, and then a few years later possibly a new video card if you want to keep graphics at high/ultra; otherwise it should last you most of, if not all of, this gen.
Hah! I'm exaggerating, I know, but this doesn't seem to apply to the Xbox One, whose purpose was explained away as some sort of dedicated television supplement.

It may be low-hanging fruit, but it frustrates me to see passionless morons wasting the time and energy of talented engineers: after months of work on a console, their reward was to have Don Mattrick and the Microsoft PR team ruin it with a series of idiotic blunders. I just want to say that I'm glad he's no longer in charge of anything relevant to gaming. Bored housewives and unemployed stoners tend to be far less discerning, in my opinion, which is fine.

Zipa said:
Baron Teapot said:
Well, Win 7 users might get access to DX12, though I guess Microsoft are likely to continue to hold it to ransom by only releasing it with Windows 8.1.1 (or whatever they call the upcoming update) or the upcoming 9. Still, hopefully OpenGL or Mantle can finally replace it now that the likes of Intel, Nvidia and AMD are putting their heads together.
OpenGL 4 already allows you to do this. There's a piece of software called TessMark that will benchmark your computer's ability to use tessellation shaders, which will give you some inkling as to how well games that use that technology will run on your system.
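If you just want a quick sanity check before downloading TessMark, something like this little Python sketch (it assumes the third-party glfw and PyOpenGL packages plus a working graphics driver) will tell you whether your driver even exposes OpenGL 4.0, which is where tessellation shaders entered core OpenGL:

# Minimal probe: create a hidden window, then ask the driver which OpenGL
# version it exposes for a default context. Tessellation shaders are core
# OpenGL from version 4.0 onwards.
import glfw
from OpenGL.GL import glGetString, GL_VERSION

if not glfw.init():
    raise RuntimeError("GLFW failed to initialise")
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)          # keep the probe window hidden
window = glfw.create_window(64, 64, "gl-probe", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("could not create an OpenGL context")
glfw.make_context_current(window)

version = glGetString(GL_VERSION).decode()          # e.g. "4.5.0 NVIDIA 417.35"
major, minor = [int(p) for p in version.split()[0].split(".")[:2]]
print(f"Driver reports OpenGL {version}")
print("Core tessellation shaders available" if (major, minor) >= (4, 0)
      else "No core tessellation support reported")
glfw.terminate()

It's only a capability check, of course; TessMark is still the thing to run if you want actual performance numbers.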

Buying a PC is, sadly, not very cheap, but compared to one of the recently released games consoles, you'd likely do well to spend the money on a decent gaming machine that will see you through the next several years. I'm actually a little worried that consoles might have had their day; this is the first time I've felt largely indifferent to a console release, and I can't tell whether that's because I've changed since the original Xbox was released, or whether the numerous restrictions (especially the lack of backwards compatibility) that the new consoles impose have started to seem so toxic and unappealing that I want nothing whatsoever to do with them.

It's odd.
 

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
Baron Teapot said:
RicoADF said:
EXos said:
Not really. Otherwise they would change this on PC.
If GDDR5 were better to run a whole system on, they would have changed over on PC. But GDDR5 is great at handling a big packet of data for a single purpose; multiple different applications using the same chip will build up latency.
I believe he was actually referring to how consoles don't have extra software and other stuff running on them using up processing power, RAM, etc. A console is built with one role in mind: gaming. While they do other things like watch movies, they generally don't do them while gaming, and anything that runs alongside a game has been designed around (e.g. downloading updates/games in the background on the PS4 is handled by a separate chip built specifically for that purpose, which keeps the load off the main system). As a result a game can utilise the whole potential of the hardware in question, whereas games on a PC have to compete with other software (including a lot of redundant stuff running in Windows).

That said, the hardware quoted by the OP will easily survive for quite a while. The only upgrades I can see coming are maybe some more RAM in a few years, and then a few years later possibly a new video card if you want to keep graphics at high/ultra; otherwise it should last you most of, if not all of, this gen.
Hah! I'm exaggerating, I know, but this doesn't seem to apply to the Xbox One, whose purpose was explained away as some sort of dedicated television supplement.

It may be low-hanging fruit, but it frustrates me to see passionless morons wasting the time and energy of talented engineers: after months of work on a console, their reward was to have Don Mattrick and the Microsoft PR team ruin it with a series of idiotic blunders. I just want to say that I'm glad he's no longer in charge of anything relevant to gaming. Bored housewives and unemployed stoners tend to be far less discerning, in my opinion, which is fine.

Zipa said:
Baron Teapot said:
Well, Win 7 users might get access to DX12, though I guess Microsoft are likely to continue to hold it to ransom by only releasing it with Windows 8.1.1 (or whatever they call the upcoming update) or the upcoming 9. Still, hopefully OpenGL or Mantle can finally replace it now that the likes of Intel, Nvidia and AMD are putting their heads together.
OpenGL 4 already allows you to do this. There's a piece of software called TessMark that will benchmark your computer's ability to use tessellation shaders, which will give you some inkling as to how well games that use that technology will run on your system.

Buying a PC is, sadly, not very cheap, but compared to one of the recently released games consoles, you'd likely do well to spend the money on a decent gaming machine that will see you through the next several years. I'm actually a little worried that consoles might have had their day; this is the first time I've felt largely indifferent to a console release, and I can't tell whether that's because I've changed since the original Xbox was released, or whether the numerous restrictions (especially the lack of backwards compatibility) that the new consoles impose have started to seem so toxic and unappealing that I want nothing whatsoever to do with them.

It's odd.
OpenGL can do it, yes, but the companies are trying to make it a lot faster than it currently is so it's a viable replacement for DX if (or when) Microsoft finally go completely off their rocker and try to screw PC gamers with it.
I think even Valve are involved with it as well.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Mr Ink 5000 said:
OneCatch said:
Your graphics card itself is better than the ones in the consoles (the PS4's is roughly equivalent to a 7850).
I was curious about this, so the AMD 7850 is the benchmark to beat when I upgrade? (Obviously I'll go a little higher, just for a bit of raw power vs. shitty porting.)
The 7850 is the low end of the range. The card itself is supposed to be a modified 7870.

http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs

I think it's a little weaker than the 7870 but I'm not sure yet. It could be a bit better.
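For a rough idea of where the PS4's GPU sits between those two cards, here's a back-of-the-envelope Python sketch using the commonly reported GCN figures (treat the CU counts and clocks as assumptions, not official spec sheets):

# Theoretical single-precision throughput for a GCN-era GPU:
# compute units * 64 shaders per CU * 2 FLOPs per clock * clock (GHz) = GFLOPS.
def gcn_gflops(compute_units, clock_ghz):
    return compute_units * 64 * 2 * clock_ghz

for name, cus, clock in [("Radeon HD 7850", 16, 0.86),
                         ("PS4 GPU (reported)", 18, 0.80),
                         ("Radeon HD 7870", 20, 1.00)]:
    print(f"{name:<20} ~{gcn_gflops(cus, clock):4.0f} GFLOPS")

# That lands the PS4 at roughly 1843 GFLOPS: a bit above the 7850 (~1761)
# and well below the 7870 (~2560), which matches the "modified 7870" story.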

We also need to see what using GDDR5 as regular RAM really impacts. Traditional setups had a lot of latency, but the custom design of the board eliminates the latency of the GDDR5 while maintaining the good parts of video RAM. What this means in terms of optimisation is yet to be seen; it could be a minor benefit, or it could end up being something interesting. Textures should be better than ever, that's for sure. But how efficiently can the GPU use the GDDR5 as video RAM? We won't know for some time. Sony could have pulled a giant f-you on the traditional PC scheme, or things could play out exactly as we expect and it ends up being just a little better with optimisations.
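As a crude illustration of why that unified GDDR5 pool is interesting, here's a little sketch comparing how long it would take to move a made-up texture budget at the commonly quoted peak bandwidths; it ignores latency, contention and real-world efficiency entirely:

# Time to move a hypothetical texture budget at peak memory bandwidth.
def transfer_ms(data_gb, bandwidth_gb_per_s):
    return data_gb / bandwidth_gb_per_s * 1000.0

texture_budget_gb = 1.5                       # made-up per-scene streaming load
print(f"PS4 unified GDDR5 (~176 GB/s):       {transfer_ms(texture_budget_gb, 176.0):5.1f} ms")
print(f"Dual-channel DDR3-1600 (~25.6 GB/s): {transfer_ms(texture_budget_gb, 25.6):5.1f} ms")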

As for CPUs, unless you do hours of video editing, the need for a powerful one is quickly becoming obsolete. In today's market the CPU is basically a glorified switchboard operator that offloads all the grunt work to the other components while handling the background processes itself, so it's not going to make a big difference. The final output should be comparable to a much more powerful machine than the specs suggest, though that may no longer be the case. We'll have to see.

Either way, in the coming 5-7 years the PC mentioned above will be phased out and become old. The PS4 will too, but the difference is that developers will still be trying to cram games into its rusty skeleton and make them work, whereas on PC they will only program for whatever a standard machine looks like at that time. That's the same reason modern games only require PCs to have 2GB of RAM and such, despite our consoles being so far behind.
 

RicoADF

Welcome back Commander
Jun 2, 2009
3,147
0
0
Baron Teapot said:
Hah! I'm exaggerating, I know, but this doesn't seem to apply to the Xbox One, whose purpose was explained away as some sort of dedicated television supplement.

It may be low-hanging fruit, but it frustrates me to see passionless morons wasting the time and energy of talented engineers: after months of work on a console, their reward was to have Don Mattrick and the Microsoft PR team ruin it with a series of idiotic blunders. I just want to say that I'm glad he's no longer in charge of anything relevant to gaming. Bored housewives and unemployed stoners tend to be far less discerning, in my opinion, which is fine.
You're right, the XBO has too much extra crap wasting the system's resources, which is why I went PS4; Sony seems to have remembered where a console's strength lies, and so far I've been quite happy with the result. I'm glad the idiot is gone too, but unfortunately the damage is done: the system was gimped by all the junk installed on it, and the bad PR has ruined it. I was going to get both the PS4 and XBO on release; now the XBO will wait until it's $200, if that, since I don't trust Microsoft not to pull another 180 on the online and Kinect requirements.
 
Mr Ink 5000

Dec 16, 2009
1,774
0
0
Lightknight said:
Mr Ink 5000 said:
OneCatch said:
Your graphics card itself is better than the ones in the consoles (the PS4's is roughly equivalent to a 7850).
I was curious about this, so the AMD 7850 is the benchmark to beat when I upgrade? (Obviously I'll go a little higher, just for a bit of raw power vs. shitty porting.)
The 7850 is the low end of the range. The card itself is supposed to be a modified 7870.

http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs

I think it's a little weaker than the 7870 but I'm not sure yet. It could be a bit better.

We also need to see what using GDDR5 as regular RAM really impacts. Traditional setups had a lot of latency, but the custom design of the board eliminates the latency of the GDDR5 while maintaining the good parts of video RAM. What this means in terms of optimisation is yet to be seen; it could be a minor benefit, or it could end up being something interesting. Textures should be better than ever, that's for sure. But how efficiently can the GPU use the GDDR5 as video RAM? We won't know for some time. Sony could have pulled a giant f-you on the traditional PC scheme, or things could play out exactly as we expect and it ends up being just a little better with optimisations.

As for CPUs, unless you do hours of video editing, the need for a powerful one is quickly becoming obsolete. In today's market the CPU is basically a glorified switchboard operator that offloads all the grunt work to the other components while handling the background processes itself, so it's not going to make a big difference. The final output should be comparable to a much more powerful machine than the specs suggest, though that may no longer be the case. We'll have to see.

Either way, in the coming 5-7 years the PC mentioned above will be phased out and become old. The PS4 will too, but the difference is that developers will still be trying to cram games into its rusty skeleton and make them work, whereas on PC they will only program for whatever a standard machine looks like at that time. That's the same reason modern games only require PCs to have 2GB of RAM and such, despite our consoles being so far behind.
Interesting read. I think I will be patient and see how it pans out, re: optimisation vs. pricing etc. I don't mind having to drop my settings, as long as the games are smooth and at 1080p.

I like the switchboard operator comparison. Since I upgraded, I don't think my AMD FX4650 has hit 50% while gaming. Hopefully I'll get a long life out of it.
 

The Lunatic

Princess
Jun 3, 2010
2,291
0
0
Zac Jovanovic said:
It'll be a while until developers utilize the new console hardware to a point where it can be a match for your PC, but it's possible and it's probably going to happen in a few years.
Not quite.

This generation isn't quite so innovative in terms of hardware.

There are no 'Reality Synthesizers', 'Cell Processors' or anything like that.

The apple has fallen pretty close to the tree this time; most console hardware of this generation is a minor modification of PC hardware.

The OS isn't quite so weighed down as a PC's might be, but, ultimately, the hardware is roughly analogous to a PC of a similar price.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Mr Ink 5000 said:
Interesting read. I think I will be patient and see how it pans out, re: optimisation vs. pricing etc. I don't mind having to drop my settings, as long as the games are smooth and at 1080p.
I am perfectly happy with 1080p, but it looks like things are moving to ultra HD pretty fast. As long as you're happy with it, it won't matter. But 4K (now apparently referring to anything 2160p or higher, for some ridiculous reason) is starting to show up in TVs and in the news, while even 1440p is gaining traction in the worldwide market.

Article on resolutions and screens. [http://www.digitaltrends.com/mobile/smartphone-pixel-screen-tech-guide/#!AaGeP]
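To put those labels in perspective, the raw pixel counts work out like this (just arithmetic):

# Pixel counts behind the common resolution labels, relative to 1080p.
base = 1920 * 1080
for name, w, h in [("1080p", 1920, 1080),
                   ("1440p", 2560, 1440),
                   ("2160p / 4K UHD", 3840, 2160)]:
    pixels = w * h
    print(f"{name:<15} {pixels / 1e6:5.2f} megapixels ({pixels / base:.1f}x the pixels of 1080p)")

# 1440p is ~1.8x the pixels of 1080p, and 2160p is a full 4x, which is why the
# GPU requirements climb so steeply once you leave 1080p behind.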

I like the switchboard operator comparison. Since I upgraded, I don't think my AMD FX4650 has hit 50% while gaming. Hopefully I'll get a long life out of it.
Right, and I still remember a time when it was the CPU doing most of the work. Now, with 64-bit environments allowing more RAM to be used, we'll start seeing even more of that processing offloaded, and unless you start hitting a wall in your video RAM, your CPU shouldn't be getting hit all that much. There are still plenty of games I can play without breaching 10% of my CPU, but that's with some really powerful GPUs, and I will admit that my processor is a newer i7. This is all thanks to CPU performance hitting a wall and RAM and GPU power being so much cheaper to utilise. Suddenly we have CPUs that are there for the OS, background processes, and the occasional bit of stepping in to augment video and RAM processing.
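If you'd rather measure it than take my word for it, a small Python sketch using the third-party psutil package will log per-core CPU usage once a second while a game is running:

# Log average and per-core CPU usage every second for ~30 seconds.
# Run it in the background while a game is up and see how much headroom is left.
import time
import psutil

psutil.cpu_percent(percpu=True)        # prime the counters; the first call returns zeros
for _ in range(30):
    time.sleep(1)
    per_core = psutil.cpu_percent(percpu=True)
    print(f"avg {sum(per_core) / len(per_core):5.1f}%  per-core {per_core}")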