While it's not much to see, the recent developer video for Bayonetta 2 ended with a one-second monster clip that was assumed to be CGI rather than real-time graphics. Bayonetta 2's director, Yusuke Hashimoto, confirmed in a recent tweet that it was in fact a debut of actual real-time graphics from the game itself.
Here is the tweet: Link
If that's a glimpse of how games built from the ground up for Wii U will look, then there is no cause for concern about how Wii U graphics will hold up against the competition. Different shades of "pretty" will probably be the norm this coming generation, and whoever has the best art style will stand out most once these modern graphical effects are put to use.
16 comments:
All the third-party games released during the Wii U's launch were ports anyway. They were ported to the system in five to six months at most, which didn't give the developers enough time to modify the graphics, much less polish the games. That's why some titles, like Batman: Arkham City - Armored Edition, barely looked better than the other versions and had framerate issues.
So what happened with the NeoGAF Chipworks photo? Did that all fizzle out because they couldn't find something to call the Wii U weak? LOL
A real-time cutscene, like Resident Evil 4 had, maybe?
Real Time is Real Time regardless.
Having looked at those Wii U GPU photo scans, I think the 140+ GB/s eDRAM bandwidth figure is correct, based on the Wii having the same 16K blocks in its eDRAM, which comes to roughly a 900-bit interface (I can't remember the exact figures).
Judging by the RAM blocks and how many there are, it's clearly 2048-bit, up from roughly 900-ish bits in the Wii, so the bandwidth is 140 GB/s plus, which I called years ago. And the Wii U's power consumption peaks at 40+ watts, which I also called...
The Wii U GPU is literally dripping in eDRAM and SRAM, there's a lot of custom logic too, and there's that dense block of other eDRAM.
140 GB/s vs the Xbox 360's 32 GB/s, LOL, and it's still being called weak. The Beyond3D thread has gone even further off the deep end than NeoGAF's; these people are cretins.
140 GB/s of bandwidth from the 32 MB of eDRAM alone.
I don't get what you're talking about, unless you mean the PS2 version of RE4.
The cutscene graphics in the GC version of RE4 were regular in-game graphics.
The PS2 version used recordings.
Damn son, PC-like. This stomps anything we have seen on consoles thus far. Can't wait till these games start pumping out; it's gonna be off the hook. P.S. RE4 dude, get some glasses, you noob.
Dave, Dave, listen up, Dave, are you there? Digital Foundry proved Nintendo put loads of very expensive eDRAM and SRAM on a GPGPU for no good reason: apparently it's there to look pretty and feeds nothingness. My god, Digital Foundry have such expert abilities, and the Wii U is just a GameCube with, wait for it, off-the-shelf PowerPC 750s from 1972. I've also heard the Wii U GPU runs on coal and is steam-powered. LOL @ Digital Foundry, NeoGAF and Beyond3D; their insanity knows no bounds.
The Wii U GPU has more SRAM than the PS3's Cell CPU, the PS3's RSX GPU, and the X360's Xenos GPU combined. The thing has a huge pool of SRAM, and the GPU core is littered with processing units with their own cache. (Not understanding it doesn't make it go away, Digital Foundry.)
The Wii U GPU has almost 4x the eDRAM of the Xbox 360 (32 MB + 4 MB vs 10 MB), and it's embedded directly into the GPGPU logic, not sitting separately on an old-fashioned bus.
The 32 MB alone is rated at at least 140 GB/s of bandwidth. I'm sure that's at 500 MHz, so at 550 MHz it's more likely around 148 GB/s. That doesn't include the 4 MB block or the SRAM, both of which will be very high bandwidth, so the SRAM and eDRAM come to about 200 GB/s of bandwidth alone, on chip and directly connected to the logic (rough math below).
Digital Foundry and Eurogamer: the people 11-year-old Sony fans get their spec info from. LOL.
Oh, and these people also stated the Wii Remote isn't for FPS games. I rest my case.
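For what it's worth, here is the back-of-envelope math behind that bandwidth claim, taking the 2048-bit eDRAM interface and the ~550 MHz GPU clock from the posts above as assumptions (neither is a confirmed spec) and one transfer per clock:
2048 bits / 8 = 256 bytes per clock
256 bytes x 550 MHz ≈ 140.8 GB/s
256 bytes x 500 MHz = 128.0 GB/s
So the often-quoted ~140 GB/s figure lines up with the ~550 MHz clock rather than 500 MHz; the extra bandwidth attributed to the 4 MB block and the SRAM is an estimate on top of that.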
The GameCube/Wii cut scenes were real time and featured the same graphics as the gameplay, so what do you mean?
Yeah, that's what I mean: the dev might just be stating it's real-time, not gameplay. It's a second or two; who cares, we shall see the real thing soon.
All I meant was: like the Resident Evil 4 GC cutscenes...
"Real time is real time regardless" - WRONG. Real-time playable vs real-time QTE vs real-time watch-only can be a huge difference.
Maybe Dave (wink) can show e6760 benchmarks (wink) to the NeoGAF community (wink) vs the HD 4850 (wink) as to how GFLOPS is a useless way of judging a GPU (wink).
If the e6760 beats an HD 4850 in benchmarks, then please explain the derp-FLOPS obsession.
Let's start Dave's World, a spec forum for the level-headed, lol...
Are a lot of you people just blind, or just delusional? What is so impressive about those graphics? Name one thing about those images that makes the graphics engine "next gen." Take a look at this highly comparable, real-time image of Lair from 2006. Seven years ago!
http://ps3media.ign.com/ps3/image/article/651/651387/lair-20050916042717444.jpg
Nobody cares about fanboy "my console is better than your console" ranting. I am convinced people who do so must be overcompensating for something.
Brian Kennedy, it doesn't look as good as the Bayonetta 2 images.
Anonymous is right. The Wii U image looks way better than the Lair image.