RhombusHatesYou said:
Wieke said:
It reportedly includes a three-core custom IBM PowerPC CPU, an ATI R700 GPU, and 512 MB of RAM.
Please let this just be a rumour; that is some woefully atrocious hardware. Only 512 MB and an old graphics processor. (Seriously, the R700 was two generations ago.) Personally I'm hoping it contains some of that sexy new AMD Fusion hardware, but it's probably too early for that.
The problem with current Fusion APUs is that they're aimed at low- to medium-end general-purpose and multimedia systems, which is to say the netbook, low-end laptop, office desktop and HTPC markets. Excellent for those markets (and profitable, too), but not really something you'd use for HD gaming. Also, there's no way AMD would let any of their Fusion tech be put on an IBM custom CPU.
I do have to wonder why Nintendo would use a custom CPU when off-the-shelf parts served them so well for the Wii, though.
To my knowledge, the Wii CPU is a 'custom' CPU. It's 'custom' in the sense that it's a variant of a mainstream IBM CPU, but custom nonetheless, in that it isn't shared by any other device.
Really, 'custom' is highly unlikely to mean something altogether new built from the ground up.
klasbo said:
512MB of RAM?
The R700 cards all have 512MB or higher (I think; I haven't seen a 4xxx card with less), and having less "normal" RAM than video RAM makes no practical sense... RAM prices are at an all-time low nowadays, so 4/8GB makes more sense.
You'd also want to have the ability to run DX11 (just to open up possibilities for developers working on other current-gen systems) as well as 64-bit processes (we're moving forward, people, don't fall behind!). There is no reason to make something "new" when it is out-spec'd by 3-4 year old systems. It's not like the "Wii mk.2" is coming out tomorrow, after all.
Wake me when they put an Intel i7-2600 in it. And give it mouse/keyboard support.
OK, seriously, DX11 is an API. The hardware is built to the API specification, but that doesn't mean the hardware capabilities are identical to the specification.
Case in point: ATI X1000-series graphics chips can do some funky stuff with looping the output of the pixel shader into the input of the vertex shader (in effect allowing them to do vertex-shader operations with a pixel shader). This isn't part of the DirectX 9 specification at all, so almost no PC games use the ability, since it would require ATI-specific extensions in the game code.
By contrast, you can almost guarantee that anyone who built a console based on an X1000 chip would use such an ability.
Or another example: hardware tessellation is defined as a mandatory feature of DirectX 11. Yet while pre-DirectX 11 NVIDIA chips could not do tessellation, ALL ATI chips since the 2000 series have had the capability anyway, and again it went unused because it wasn't part of the core DirectX spec. Hence, any PC game using it would have needed ATI-specific code to do so.
Again, any console built around such a chip would have no reason to ignore such an ability.
In short, PC hardware often has capabilities beyond the DirectX API it was developed against, and those capabilities rarely get used, because developers targeting DirectX tend to stick to the lowest common denominator.
Console developers, by contrast, have a fixed hardware design, and will gladly use anything and everything the hardware is capable of, because they don't have to worry about their code running on anything else.
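To make that concrete, here's a rough C++ sketch (my own illustration, not something from anyone in this thread) of how a typical multiplatform PC title gates its tessellation path: it asks Direct3D 11 what feature level the device exposes and only turns on the hull/domain-shader path at 11_0 or above, even if the silicon underneath (say, an HD 2000/3000/4000 part) has a tessellator sitting idle below that level. The feature-level query is the real D3D11 API; the gating policy around it is just an assumed example.

#include <cstdio>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Feature levels we're willing to accept, highest preference first.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,   // tessellation (hull/domain shaders) only guaranteed here
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,
    };

    ID3D11Device*     device  = nullptr;
    D3D_FEATURE_LEVEL granted = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &granted, nullptr);

    if (FAILED(hr)) {
        std::printf("No Direct3D 11 capable device found.\n");
        return 1;
    }

    // A PC engine keys its render paths off the API contract, not the silicon:
    // an ATI HD 2000-4000 part has a hardware tessellator, but below feature
    // level 11_0 the API never exposes it, so the path stays switched off.
    if (granted >= D3D_FEATURE_LEVEL_11_0)
        std::printf("Feature level 11_0+: enabling tessellation path.\n");
    else
        std::printf("Below 11_0: tessellation path disabled, whatever the hardware can do.\n");

    device->Release();
    return 0;
}

A console title has no equivalent of this check: the chip is the same in every box, so the "extra" capability just becomes part of the platform and gets used.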
Having said that, it still seems stupid to use an R700 (HD 4000 series) chip for a product intended to be released in 2012. The only explanation I can think of is the lead time involved in creating a hardware product.
Though I find it suspicious that the lead time would be four years, considering what the hardware in the 360 and PS3 is. (PS3 hardware, for instance, was no more than a single generation out of date at launch; if these specs are true, by 2012 the new Nintendo system will be about four generations behind.)