Oh, for God's sake...
Look, the Eurogamer article is complete hokum. The Digital Foundry article from which it's taken is hokum.
The photos and analysis of the GPU were done by forum-goers at NeoGAF. Eurogamer/DF are just picking the findings up and reporting them second-hand. There's a whole NeoGAF thread up and running on the topic. I've been following it, and actually posted a thread here linking directly to it. And quite a few users are pretty annoyed at how DF/Eurogamer have completely misrepresented the facts thus far:
a) The Wii U GPU is not directly comparable to any off-the-shelf GPU on the market. DF base their article on the assumption that it's built on the R700 architecture. Before these photos, people were assuming the GPU would at least share enough core architecture to make it comparable to other GPUs, such as the R700. We now know that's not the case at all. This GPU is an almost entirely custom job by Nintendo and AMD. There's very little off-the-shelf about it. Everything in the GPU has been tinkered with or redesigned in some way.
b) While direct comparisons are impossible, people are suggesting a baseline comparison could be made to the Radeon 4600/4800 series. However, that's only half the picture. The chip has been so heavily modified that many of the people actually analysing it believe it may be functionally closer to a 5550 or even a 6760 in terms of output. Again, because the thing is so damn unconventional, no-one really knows, so assumptions can't be made.
c) There is still a whole load of stuff on the GPU die that hasn't been accounted for yet, which DF/Eurogamer completely ignore. Taken from the updated OP:
Digital Foundry claims that the GPU has a 320:16:8 core config. Even including the ROPs, ARM, DSP, Video codec and command processor this only accounts for 18 out of the 40 logical blocks on there, leaving the majority of the GPU logic unexplained.
DF are trying to offer analysis of the Wii U GPU while knowing what less than half of the logical blocks do. It doesn't matter what sort of tech you're reporting on, that's just bad journalism. If you're trying to make pronouncements on how capable a piece of tech is, make sure you fucking well know what all its parts do. Currently, large parts of the die are still a total bloody mystery.
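Just to put numbers on that (my own back-of-the-envelope tally in Python, assuming DF's 320:16:8 reading and the thread's count of 40 logical blocks are both right):

# Rough tally of how much of the die DF's reading actually accounts for,
# assuming the thread's count of 40 logical blocks is accurate.
identified_blocks = 18   # shaders, TMUs, ROPs, ARM core, DSP, video codec, command processor
total_blocks = 40
unexplained = total_blocks - identified_blocks
print(f"{unexplained} of {total_blocks} blocks unexplained "
      f"({unexplained / total_blocks:.0%} of the GPU logic)")
# prints: 22 of 40 blocks unexplained (55% of the GPU logic)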
d) The Wii U currently draws less than half the power of the 360 and the PS3: around 33W against their 70W+. However the GPU is working, it's doing so on half the leccy of the HD twins. This is where things get really confusing. We know the Wii U is capable of visuals the other two are not, as seen with Trine 2: Director's Cut and Frozenbyte's comments, but it's managing to output those visuals on half the power draw of either the PS3 or the 360. The GPU itself is apparently drawing somewhere around 15W, possibly more if the other components have lower-than-standard power consumption.
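To put rough numbers on it (a quick sketch using the wattages being thrown around in the thread; none of these are official figures, and the 15W GPU estimate in particular is guesswork):

# Quick power comparison using the figures quoted in the thread (not official).
wii_u_total_w = 33    # reported at-the-wall draw for the Wii U
hd_twins_w = 70       # conservative figure for 360/PS3; both can go well above this
wii_u_gpu_w = 15      # the thread's rough estimate for the GPU alone
print(f"Wii U total draw is about {wii_u_total_w / hd_twins_w:.0%} of the HD twins'")
print(f"GPU alone is roughly {wii_u_gpu_w / wii_u_total_w:.0%} of the console's draw")
# prints: Wii U total draw is about 47% of the HD twins'
# prints: GPU alone is roughly 45% of the console's draw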
e) Rather than sticking the old Wii GPU on the side of the motherboard to provide backwards compatibility, Nintendo seem to have integrated the Wii's GPU workings directly into the Wii U GPU itself. Essentially, the Wii U GPU has assimilated the Wii GPU into its own workings. What this means is anyone's guess, but many think the BC components can actually be used to improve performance in Wii U games.
f) There's a shitload of eDRAM nestled on the die, which people are still trying to work out the configuration and function of. Last I checked, the consensus was that the eDRAM is laid out in a 32MB/4MB/1MB configuration. Quite whether that's the case, and how it all works in practice, no-one is yet certain.
The GPU currently has the actual tech analysts at NeoGAF bamboozled, perplexed and confuzzled. The only people in the thread complaining that the system is horribly underpowered are those who haven't done any actual analysis. The tech analysts all seem to agree that this is one slick piece of custom kit. It may not have the raw power of whatever high-end GPUs are in the Nextbox and PS4, but this thing has been tailored, customised and optimised beyond what you'd expect from a console GPU. This isn't an off-the-shelf card, and you can't compare it as one. While a lot of the individual components and parts may be similar to other cards, the whole damn thing is so unconventional in the way it's put together that you can't just assume it's got the same performance as Radeon Model Such-and-such.
The actual thread can be found here, for anyone who's interested. It's still being updated as we speak. Don't give the Eurogamer article the time of day. They trolled you. They took a bunch of info out of context and presented it counter to what the people actually doing the analysis were saying.
Here's a quote from Jim Morrison, one of the guys at Chipworks who actually did the hi-res photos of the GPU.
Jim Morrison said:
Been reading some of the comments on your thread and have a few of my own to use as you wish.
1. This GPU is custom.
2. If it was based on ATI/AMD or a Radeon-like design, the chip would carry die marks to reflect that. Everybody has to recognize the licensing. It has none. Only the Renesas name, which is a former unit of NEC.
3. This chip is fabricated in a 40 nm advanced CMOS process at TSMC and is not low tech.
4. For reference's sake, the Apple A6 is fabricated in a 32 nm CMOS process and is also designed from scratch. Its manufacturing cost, in volumes of 100k or more, is about $26-$30 a pop. Over 16 months that degrades to about $15 each.
a. The Wii U only represents something like 30M units per annum, vs the iPhone, which is more like 100M units per annum. Put things in perspective.
5. This Wii U GPU costs about $20-$40 more than that, making it a very expensive piece of kit. Combine that with the IBM CPU and the Flash chip, all on the same package, and the whole thing is closer to $100 a piece when you add it all up.
6. The Wii U main processor package is a very impressive piece of hardware when it's all said and done.
Trust me on this. It may not have water cooling and heat sinks the size of a brownie, but it's one slick piece of silicon. eDRAM is not cheap to make. That's why not everybody does it. 'Cause it's so damn expensive.
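For what it's worth, here's how Jim's cost figures stack up if you just run the arithmetic. This is my own rough sketch, treating his ranges as given, reading "more than that" as relative to the A6's $26-$30 launch cost, and backing the CPU/Flash/packaging share out of his ~$100 ballpark:

# Rough arithmetic on Jim Morrison's cost figures (his ranges, my assumptions).
a6_cost_low, a6_cost_high = 26, 30     # Apple A6 at 100k+ volumes
premium_low, premium_high = 20, 40     # "costs about $20-$40 more than that"
gpu_low, gpu_high = a6_cost_low + premium_low, a6_cost_high + premium_high
package_total = 100                    # his ballpark for the GPU + IBM CPU + Flash package
print(f"Implied GPU cost: ${gpu_low}-${gpu_high}")
print(f"Implied CPU + Flash + packaging share: ${package_total - gpu_high}-${package_total - gpu_low}")
# prints: Implied GPU cost: $46-$70
# prints: Implied CPU + Flash + packaging share: $30-$54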