(This GPU was pure speculation for Wii U, nothing more)
So yesterday, the website "Gaming Blend" posted a story claiming the Wii U's memory bandwidth and GPU are more powerful than most people thought.
Here is their original story: Link to page
Now, while I don't disagree that the Wii U has a ton of untapped potential in its GPU and CPU when they are used the way the system was originally designed for, some of the information in that article was not correct and needs to be commented on, since it deals directly with a topic I started on this website a while ago.
The article talks about theoretical memory bandwidth for the Wii U, not factual bandwidth by any means. While it is true that the Wii U's 32MB of eDRAM could give it a real advantage for certain processing effects, artificial intelligence, anti-aliasing or an easier path to 1080p, this eDRAM is not going to magically give the entire Wii U box over 500GB per second of memory bandwidth. Let's be reasonable, please.
On two separate occasions talented Wii U developer Shin'en has made statements on both the Wii U's memory bandwidth and the 32MB of eDRAM:
In regards to the Wii U eDRAM:
"Wii U eDRAM usage is
comparable to the eDRAM in the XBOX360, but on Wii U you have enough eDRAM to
use it for 1080p rendering. In
comparison, on XBOX360 you usually had to render in sub 720p resolutions or in
mutliple passes. Even if you don’t use
MSAA (MultiSample
Anti-Aliasing) you already need around 16Mb just for a 1080p
framebuffer (with double buffering). You simply don’t have that with XBOX360
eDRAM. As far as I know Microsoft corrected that issue and put also 32MB of
Fast Ram into their new console.We use the
eDRAM in the Wii U for the actual framebuffers, intermediate framebuffer
captures, as a fast scratch memory for some CPU intense work and for other GPU
memory writes.
Using eDRAM properly is a
simple way to get extra performance without any other optimizations."
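As a quick sanity check on that 16MB figure, here is a back-of-the-envelope calculation (just a sketch that counts 32-bit color buffers only; depth/stencil buffers and MSAA would add more on top):

```python
# Rough framebuffer size check (assumes 4 bytes per pixel, color buffers only).
def color_buffers_mb(width, height, buffers=2, bytes_per_pixel=4):
    """Size in MB of `buffers` color targets at the given resolution."""
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(f"1080p double-buffered: {color_buffers_mb(1920, 1080):.1f} MB")  # ~15.8 MB -> the ~16MB quoted above
print(f"720p double-buffered:  {color_buffers_mb(1280, 720):.1f} MB")   # ~7.0 MB
# The Wii U's 32MB of eDRAM comfortably holds the 1080p case; the Xbox 360's 10MB of eDRAM cannot.
```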
In regards to the Wii U's memory bandwidth:
“The 1GB application RAM is used for all the games resources. Audio, textures, geometry, etc. Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency. Bandwidth is mostly an issue for the GPU if you make scattered reads around the memory. This is never a good idea for good performance.”

“I can’t detail the Wii U GPU but remember it’s a GPGPU. So you are lifted from most limits you had on previous consoles. I think that if you have problems making a great looking game on Wii U then it’s not a problem of the hardware.”
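Shin'en's point about scattered reads is easy to demonstrate even on an ordinary PC. The snippet below is a crude, hypothetical illustration (nothing Wii U specific): summing a large array in order versus gathering it through a random permutation, where the cache-unfriendly access pattern is what hurts:

```python
import time
import numpy as np

n = 20_000_000
data = np.arange(n, dtype=np.float32)
scattered = np.random.permutation(n)  # random read order defeats caching and prefetching

t0 = time.perf_counter()
data.sum()                            # sequential reads through memory
t1 = time.perf_counter()
data[scattered].sum()                 # scattered reads across the whole array
t2 = time.perf_counter()

print(f"sequential: {t1 - t0:.3f}s   scattered: {t2 - t1:.3f}s")
```

The scattered version also pays for an extra copy during the gather, but the dominant cost is the access pattern, which is exactly the kind of thing Shin'en says matters more than the theoretical bandwidth number.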
Now we come to the part about the "rumored" E6760 GPU being used as a base for the Wii U, attributed to a supposed software engineer, Francisco Javier Ogushi Dominguez. If you try to research him, you find that he is a forum poster on hdwarriors.com who has been posting links back to this website and other Nintendo sites as his references, and he has no credibility or proof of who he really is. See here for one of his posts: Link
The point is that no one started a rumor about the E6760 being the base for the Wii U GPU. It was only my own speculation, built from multiple points of reasoning that I gathered and posted over a year and a half ago in hopes of narrowing down what the performance of the Wii U's GPU was going to be like. For some reason, forum posters and some websites turned that speculation into a "rumor."
Here is my original speculation article in full: Speculation: Wii U GPU Based on AMD Radeon E6760
(The Wii U GPU pictured above is a beast in efficiency, not raw power)
The Wii U GPU does have a lot in common with the E6760 in power consumption and efficiency, but its design was based on a GPU that existed before the E6760 was even around. However, there is no doubt that many characteristics of more modern GPU technology from the E6760 and other similar GPUs were "Frankensteined" into what became the final Wii U GPU.
What makes the Wii U GPU and CPU combination different from the PS4 or Xbox One is the architecture. The fact that the Wii U is designed around PPC and not x86 gives the system the ability to produce impressive results in a small package. This is a smart design choice for a console, since x86 designs rely to a much greater degree on raw power, and less on efficiency, to get better results. The Wii U has modern graphical features and shaders on par with DirectX 11, which, in combination with the out-of-order PPC CPU, gives the system a very efficient way to produce modern next-gen effects in its games with a fraction of the overall power needed by the other new consoles.
(Latest version of CryEngine 3 running on Wii U showing Modern Next Gen effects beyond PS3/360)
It also gives developers who design exclusively for a system like the Wii U (not happening much right now) more possibilities for growth and further efficiency the more games they make for it. However, porting x86 code over to PPC, while possible, is not something most game developers are going to spend extra development time on for the Wii U in its current sales situation. So even though it has the ability to display modern-looking games well beyond the last generation of consoles, current porting performance from older consoles is killed by developers reusing code designed to run perfectly on the Xbox 360, which is based on 2005 hardware. Even if developers want to use the extra features of the Wii U to get more out of a game, they are simply not given the resources, time or money to do so. This leaves ports that sometimes have better overall image quality but slower frame rates on Wii U, because the CPU is being used for tasks the eDRAM, GPU and DSP were designed to handle. That bogs down performance, not badly enough to leave too much of a difference overall, but enough to make it seem like the Wii U is even less powerful than the Xbox 360 when it happens. The same has been true over the last 9 years with PC ports of Xbox 360 games running really poorly, because Xbox 360 code was simply shifted over to the x86 platform without optimization, leading to many bugs and continued patches and driver updates, in some cases for years after release.
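To make the "simply shifting code over" problem a bit more concrete, here is one small, hypothetical illustration. The PowerPC consoles (Xbox 360, Wii U) are big-endian while x86 PCs are little-endian, so any code or data format that silently assumes one byte order has to be found and reworked during a port, on top of retuning work that was balanced for the 360's CPU:

```python
import struct

# The same four bytes read as a 32-bit integer under each convention.
raw = bytes([0x00, 0x00, 0x01, 0x00])
print(struct.unpack(">I", raw)[0])  # 256    -- big-endian, as a PPC console would store it
print(struct.unpack("<I", raw)[0])  # 65536  -- little-endian, as x86 reads the same bytes
# A port that skips auditing assumptions like this ships with subtle bugs,
# which is part of why unoptimized straight ports behave so poorly.
```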
For the PS4 and Xbox One, there are pros and cons to a console designed this way. The good thing is that this generation there will be no difference between PC and these consoles in terms of game development. However, where these two consoles will fall behind quickly is in performance relative to the PC, since you can already build an entry-level gaming PC that will easily outperform the Xbox One and come out a bit ahead of the PS4 overall thanks to the better Intel CPU. Take a look at the new GTX 750 Ti PC build at Nvidia's site for just over $500: Link
(The small form factor GTX 750 Ti outperforms XB1 and is close to par with PS4 for $149)
So when you think of game development in terms of shared architectures, PC/PS4/XB1, the PC is going to be the clear-cut winner this generation, more so than ever before. No longer will devs be slowed down by seven-year console cycles based on PowerPC technology, since x86 will be universal for them. So in all honesty, if you are going the more powerful console route, why not just build a better PC for almost the same price? You'll get better graphics and easier-to-replace/upgrade parts. It will be interesting to see how the next 4-5 years go in this respect. By contrast, I see the Wii U design as a strength for Nintendo's own games, which flourish the way the console was designed to run them, just like it's always been. Many people and developers don't agree with this type of design, but if the Wii U were selling out at retailers right now, you can bet all these games would be ported over in a heartbeat. A modern GPU is still a modern GPU; there's no way around that.
(Nintendo will focus on what makes the Wii U Gamepad essential in 2014 to increase interest)
In its current state, porting to Wii U will take extra coding, time and effort, but if sales pick up for the console you can bet some of those ports will still come over to it in the future. Even if the Wii U does not get any more ports from third parties, Nintendo has already made strong ties with indie developers on the Wii U, with so many games lined up it's hard not to be excited for them. For now, though, Nintendo must keep releasing great games like Donkey Kong Country: Tropical Freeze, release the other heavy hitters and follow through on their promise of releasing software that shows the importance of the Wii U GamePad.
Strong first-party support from Sony is what got the PS3 out of the bad start it had for its first 2-3 years, and they continued supporting it right up until the PS4 release. As we are seeing, the millions of Sony fans who were won over during the PS3 era are now eagerly buying a PS4 out of loyalty and trust. Nintendo now has the same opportunity with the Wii U: keep supporting it consistently with outstanding software and rebuild its core fan base. By doing this, Nintendo will be able to make the Wii U a system we all look back on with fond memories and keep us buying their future consoles.
9 comments:
It doesn't matter anymore. No one is going to take the Wii U seriously from a graphical/tech standpoint. If you own the console LIKE I DO, just enjoy the games that are coming. They have a great lineup for 2014 so far, so just enjoy the games and stop talking tech.
Stop talking tech? I have to defend statements I previously made on my website that were taken out of context and twisted. That is the main reason for this post; that, and talking tech is an interest of mine, along with my readers'.
good post
I applaud Nintendo for aiming for efficiency. I have a Wii U and support the amazing software that it has now and has coming.
But I believe they need to think a bit harder when designing a console around 'power efficiency'. Because the Wii U isn't the fastest device around, load times are somewhat longer than they might otherwise have been, which means our Wii U systems are wasting time and causing our TVs and sound systems to expend large amounts of power while we wait at loading screens.
On the flip side, the Wii U also needs to be given more credit, because in many cases it renders separately for two screens, potentially outputting more pixels than a standard 1080p image.
Even though this is a well-explained, rational and downright believable explanation of the Wii U's technical capabilities, I still ultimately find it hard to believe that the Wii U is really any more powerful than the PS3/360. I wish devs would just take full advantage of all the different hardware.
Thank you for this well written article.
I know you can't take one well-written article as gospel, but this sure paints the Wii U in a much better light. Slightly Mad Studios seems to believe the Wii U has some juice under the hood. I truly believe it is more powerful than the 360 and PS3... those are 8- and 9-year-old tech!! Just as, if memory serves, the SNES had a slower CPU than the Genesis but made generally prettier games - GENERALLY! I believe it is again a case where, if the hardware is specifically programmed for and optimized for, excellent and downright gorgeous games can be made.
The Wii U is definitely not as powerful as the Xbox One or the PS4, but it is obviously not as far behind as the Wii was against the 360 and PS3; at the very least the Wii U should be 1/3 or 1/4 of what the PS4 is. But although it has less shader power, it also packs more memory bandwidth with its eDRAM; bandwidth may not look as important as shader power, but it all depends on what techniques you use for rendering.
For example, if you want to render a pretty good number of light sources in a game, you are better off using deferred rendering instead of traditional forward rendering. Why? Because deferred rendering uses far less shader power than forward rendering for the lights (in the worst case, forward rendering may require enough shader power for objects*lights, while deferred rendering only requires objects+lights; see the sketch after the quote below).
http://www.ogre3d.org/tikiwiki/tiki-index.php?page=Deferred+Shading
"
Deferred Shading Advantages
The main reason for using deferred shading is performance related. Classing rendering (also called forward rendering) can, in the worst case, require num_objects * num_lights batches to render a scene. Deferred shading changes that to num_objects + num_lights, which can often be a lot less.
Another reason is that some new post-processing effects are easily achievable using the G-Buffer as input. If you wanted to perform these effects without deferred shading, you would've had to render the whole scene again.
"
But there is a catch: although deferred rendering requires far less shader power, the trade-off is bandwidth. G-buffers for deferred rendering consume a pretty good amount of memory bandwidth, and even the Xbox One, in Ryse, had to sacrifice AA due to the eSRAM bandwidth consumed by the G-buffer targets.
But in the Wii U's case that's not a problem at all. Shin'en is using triple buffering for deferred rendering without problems and it only consumes about 1/3 of the eDRAM, and 720p with double buffering only consumes 7.1MB on Wii U according to Shin'en. This means that the rumor of 563.2GB/s or more of eDRAM bandwidth may be true after all.
Here are Shin'en's comments about the triple buffering for the G-buffer in FAST Racing Neo, using 3.6MB of eDRAM for each buffer:
https://twitter.com/ShinenGames/status/483639073798500353
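As a rough check of those numbers (just a sketch assuming plain 32-bit render targets, which is roughly what the 3.6MB-per-buffer figure implies at 720p; the exact values depend on the formats Shin'en actually uses):

```python
# Approximate footprint of 720p render targets at 4 bytes per pixel.
bytes_per_target = 1280 * 720 * 4              # ~3.7 MB, close to the ~3.6MB per buffer quoted
triple_buffer_mb = 3 * bytes_per_target / 1e6
double_buffer_mb = 2 * bytes_per_target / 1e6

print(f"triple-buffered 720p: {triple_buffer_mb:.1f} MB")  # ~11.1 MB, roughly 1/3 of the 32MB eDRAM
print(f"double-buffered 720p: {double_buffer_mb:.1f} MB")  # ~7.4 MB, close to the ~7.1MB figure
```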
Wii U tech is really interesting to me because it's so unknown. It makes the console intriguing, and I think a lot of people may not want what it's capable of to become known either. They just want people to pass it off as underpowered when that isn't actually known, so they are being ignorant.