

3Dude

Member Since 09 Jun 2012
Offline Last Active Apr 08 2015 08:08 PM

#137662 Wii U GPGPU?

Posted by 3Dude on 24 November 2012 - 07:48 PM

I'd love to see what developers could do with the Havok physics engine by having it run on the GPU, since the engine has been demoed in the past using OpenCL.


The WEIRD thing is, Havok, specifically Dave Gargan of Havok, said they'd be running on the Wii U CPU.

"The platform has its own unique features, and has its own challenges as well. When we come across any new particular platform, we optimize specifically for some of the advantages that those platforms offer over other platforms, and Wii U has specific advantages that no other platform has, and we optimize directly for those, right down at the level of accessing the hardware.

"I think we'll see things done on the Wii U that we won't see on other platforms… I think people will be genuinely excited with the range of titles they're going to see come out."

The demonstration as presented at GDC can be watched in the following video. The demo shown used CPU-processed physics (as opposed to GPU), which, Gargan said, would be the case when the engine runs on Wii U. More Havok videos can be found on their website.

What's going on, Dave?
http://www.nintendow...interview/29477


#137518 Clearing up These Wii U CPU Articles

Posted by 3Dude on 24 November 2012 - 12:50 PM

While a little vague, it still makes me wonder how they would know that. But it does seem likely considering the weak CPU. A bit of a trade-off.


They got it from the Tech Fixation teardown.

http://www.thetechfixation.com/2012/11/wii-u-next-gen-tech-or-current-gen.html?showComment=1353724701327#c5304701270469153035

They simply misunderstood a little logical deduction, meant to be taken more logically than literally, and called it a confirmation because... they're Zelda Informer.


#137435 Developers Have It Wrong, The Wii U Is Powerful, It’s Next Generation Powerful

Posted by 3Dude on 24 November 2012 - 10:46 AM

Specs are all people care about. That's like people only caring about the ingredients in foods. Just play it or eat it, or not. I'm done with this internet nonsense. Well, back to my Wii U. I'll just play my games, which I'm not looking at specs for. This isn't the Matrix. You are not looking at coding; you're looking at the game.


You might have a point if this was the Wii U games forum...

But this is the Wii U hardware forum; the dedicated subject for this board IS the specs, not the games, so this is not the place to complain about focusing on them.


#137075 Wii U GPGPU?

Posted by 3Dude on 23 November 2012 - 06:14 PM

Do you have any different one in mind? But hey, I won't believe any fanboyism fantasies you may have, little sheep. Everybody speaks about it, and it's more likely a 5670; the die size, including eDRAM, fits.

Here is a different article about the Wii U, not a negative one like many forums (NeoGAF, IGN, GameSpot, etc.) run: http://www.zeldainfo...-generation-pow

But hey! Even that is a Zelda topic :)  ... but everybody points to the 5670, and some to the 4xxx series.  http://www.thetechfi...urrent-gen.html


Please, PLEASE don't link to Zelda Informer.


#137041 Wii U GPGPU?

Posted by 3Dude on 23 November 2012 - 05:10 PM

I guess I'll post some of Shin'en's impressions of the GPU, since that's kinda what this thread is about.



Concerning the graphical aspect, are the features supported satisfactory for you? Is the system allowing a good amount of effects not present or not really used on current-gen consoles, such as tessellation? Iwata has promoted the GPGPU side of the chip; have you taken advantage of this?


For Nano Assault Neo we already used a few tricks that are not possible on the current console cycle.
Due to the modern GPU architecture you have plenty of effects you can use to make Wii U games look better than anything you have seen on consoles before.


Neo's resolution is 720p. Why is it not 1080p? Besides, we've witnessed jaggies and a seeming lack of anti-aliasing in some other games' footage; can you reassure us on the image quality of your title? With its more up-to-date GPU and other factors such as cache amount, the Wii U should be pretty capable in this area.



We can’t detail the Wii U graphics chip, but any modern GPU supports various anti-aliasing modes with the usual Pros and Cons. Many GPUs have a certain amount of AA even for ‘free’ when rendering. Usage of these modes depends on your rendering style (like forward or deferred) and other implementation details.

Nano Assault Neo is running in 720p, yes. We had the game also running in 1080p, but the difference was not distinguishable when playing. Therefore we used 720p and put the free GPU cycles into higher resolution post-FX. This was much more visible. If we had a project with less quick motion, we would have gone 1080p instead, I guess.

It's not a problem to make beautiful 1080p games on the Wii U. As on any console or PC, such titles need ~200% more fill rate than 720p. You can use this power either for 1080p rendering or for more particles, better post-FX, better materials, etc.


All this information is very impressive! But as actors within a global industry, you may be aware of the capabilities of competing next-gen consoles as well as the technical evolutions seen in PC hardware or upcoming middleware like Unreal Engine 4. From what you've experienced on Wii U, do you think the system is future-proof? Does it have the required feature set of next-generation games (we're talking about DirectX 11/OpenGL 4 level features), and will it run titles available on its rival platforms? Some players are afraid of reliving the situation of the Wii, which couldn't handle engines and software released on Xbox 360 and PS3.


We can't be too specific on the Wii U hardware, but you can't compare an OpenGL/DirectX driver version to the actual Wii U GPU anyway. I can only assure you that the Wii U GPU feature set allows you to do many cool things that are not possible on any current console. The Wii U has enough potential for the next years to create jaw-dropping visuals. Also remember the immense improvement we saw on the PS3 and Xbox 360 over the years. I'm really excited to see what developers will show on the Wii U in the years to come.

http://www.nanoassau...20p60_8mbit.mp4



A nice HQ video of their Wii U eShop game Nano Assault Neo.
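Shin'en's fill-rate remark above can be sanity-checked with simple pixel arithmetic. A rough sketch (real fill-rate cost also depends on overdraw, AA mode, and post-FX passes, so treat this as back-of-the-envelope only):

```python
# Pixel-count comparison between 720p and 1080p render targets.
# Real fill-rate cost also depends on overdraw and post-FX,
# so this is only a rough sanity check of the quoted figure.

def pixels(width: int, height: int) -> int:
    return width * height

p720 = pixels(1280, 720)     # 921,600 pixels
p1080 = pixels(1920, 1080)   # 2,073,600 pixels

print(f"1080p pushes {p1080 / p720:.2f}x the pixels of 720p")
```

That raw pixel ratio is 2.25x, the baseline behind fill-rate comparisons like the one quoted; the extra budget can go to resolution or, as Shin'en chose, to post-FX instead.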


#136909 PotatoHog's daily lesson on the Wii U CPU/GPU situation

Posted by 3Dude on 23 November 2012 - 12:09 PM

I think lots of people on here are underestimating A10 (likely customized) graphics. For instance, I had an A8-3500M APU laptop, and although its computing power wasn't great (low i3), the graphics were excellent, able to run Civilization 5 at high settings. This is only one example, I know, but that was a mobile last-gen APU; no doubt Sony will have a newer desktop part. This, I feel, is a smart choice for Sony, probably a lot cheaper than the equivalent Nvidia GPU. It will be interesting to see where Microsoft goes as well.


No, it's not underestimating the A10. They know what it's capable of, and it's not enough for them, because they have psychotic, unmeetable expectations for these systems.


#136670 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by 3Dude on 23 November 2012 - 05:24 AM

The whole system draws 35W, and the GPU is more than 5x the size of the CPU. Therefore the CPU should draw about 5 watts. Where is it written that the Wii U has an ARM core for the OS? I don't think I read that in any release information, unless it's just rumor. A 180% increase from working software is a great challenge; the software would have to be in the alpha phase to consider such improvements.
Developers aren't usually ones to talk much about things like this; it took a long time before it was common knowledge how hard it was to program for the PS3. It's not all doom and gloom, since Nintendo will most likely deliver good first-party games, and there will probably be good 360 and PS3 ports eventually.


No, YOU didn't hear about PS3 difficulties until later on down the road. Devs were whining about the horribly inefficient CPUs in the PS3 AND 360, in-order, deeply pipelined, with crap branch prediction and a 500-cycle miss penalty, all the way up until the point the Wii was announced.

And the system draws 75 watts from the wall; depending on the efficiency rating of the PSU, we are probably looking at a max draw of 60 watts. The system draws 33 watts when it's not doing anything, and 35 watts when playing NSMBU (LOL). I'd hardly consider that putting the system under an average load. You DO know every system ever made draws more power if the game is more demanding, or do you SERIOUSLY think the Wii draws the same power playing NSMB vs. Skyward Sword, or a 360 uses the same power playing Geometry Wars vs. Halo 4? Your logic is extremely poor.
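The wall-draw arithmetic above, spelled out as a quick sketch. The 80% PSU efficiency is an assumed typical figure for small power bricks, not an official Nintendo spec:

```python
# Estimating power available to the components from rated wall draw.
# The 80% PSU efficiency is an assumption (typical for small bricks),
# not an official spec.

wall_draw_w = 75.0        # rated draw from the wall
psu_efficiency = 0.80     # assumed

board_power_w = wall_draw_w * psu_efficiency
print(f"~{board_power_w:.0f} W available past the PSU")  # ~60 W
```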

Ha ha. Are you serious? A 1.8x optimization gain is a serious challenge on a new system?

Shin'en: "For instance, with only some tiny changes we were able to optimize certain heavy load parts of the rendering pipeline to 6x of the original speed, and that was even without using any of the extra cores."

600% performance increase. On 1 core. With 3 cores on parallelizable code, 1800% performance increase.

Audiokinetic Wwise middleware, 2012 v2 (the first release with Wii U optimizations):

Wii U™
Audio on Wiimote and rumble on Wiimote and DRC is now supported. Delay, Peak Limiter and RoomVerb have been optimized by 2.3x to 4.4x on Wii U.

230% to 440% performance increase. (This is for porting games from PS360 using the Wwise middleware, so they are using the CPU and not the audio DSP.)

http://www.audiokine...news/227-2012-2
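A side note on the numbers in this post: speedup factors and percent increases are easy to conflate (strictly, a 6x speedup is a 500% increase over baseline). A quick sketch of the conversion:

```python
# Converting a speedup factor into a percent increase.
# "6x the original speed" strictly means performance went up by
# 500%, not 600% -- forum shorthand often blurs the two.

def percent_increase(speedup_factor: float) -> float:
    return (speedup_factor - 1.0) * 100.0

for factor in (1.8, 2.3, 4.4, 6.0):
    print(f"{factor}x speedup = +{percent_increase(factor):.0f}%")
```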


#136502 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by 3Dude on 22 November 2012 - 05:49 PM

The issue here is the CPU doesn't actually have very many resources. There are definitely optimization gains, but not to the point where it would suddenly make the CPU more powerful than that of the 360. Judging by the size of the chip, it's very likely that Nintendo paid as little as possible and only has a cheap but sufficient CPU for current games, as well as being fully backward compatible. It is most likely based on the Wii CPU, because the size is too small for indirect emulation.
It would cost a lot of money for Nintendo to pay for a completely new design from IBM, like Sony paid for the Cell. IBM has completely forgone hardware development for small systems such as PCs and consoles in recent years; they only focus on large, hot, powerful server chips and such. The POWER7 chips are 200-watt chips that would never work in a console and would be very hard to scale down well enough. The CPU most likely contains the same Power ISA as the Wii, but with more modern instruction decode and caching. Thus it probably isn't something completely modern.
The eDRAM is a buffer, just like the cache; cache would be faster, so it would be higher level. The eDRAM would only be there to buffer main memory, so it would be used as such.


You are confusing L1 with L2 (both are within the processor core of this design; there is no separate eDRAM die, it's all IN the processor core, that's what makes this IBM's NEW RAM technology, READ THE DOCUMENTATION). The L2 buffers main memory so the L1 doesn't get caught waiting for the slow main memory to be available. Again, it's called a memory hierarchy.

There is NOTHING hot about POWER7; it's one of the most thermal- and power-savvy designs ever made (which is a real surprise after the breakup of PowerMac), and it has fantastic performance per watt which SCALES LINEARLY. For the love of cheese, WILL YOU READ THE DAMN DOCUMENTATION. You don't even have to read past page 1; how lazy can you be?


POWER7 was designed as a high-performance, low-power, low-thermal-envelope processor. They fit 8 cores in the same power, thermal, and socket size as the dual-core (that means 2) POWER6. It's 1/4 as hot (talking per core), takes 1/4 the power, is clocked over a GHz slower, and delivers over twice the performance.

The way they accomplished this was with a new eDRAM technology, as stated by IBM in the official IBM documentation I posted, which you are refusing to read; the same RAM technology IBM confirmed when they announced they were making, quote, "an all new" CPU for the Wii U, said press release also previously linked by me. This is why IBM said it shared the same technology as Watson. The eDRAM technology is what made POWER7 possible at such a small size per processor, a low thermal envelope per processor, and a lower clock speed than POWER6.

POWER7 chips are NOT 200-watt monsters. You are talking about the POWER7 server PACKAGE, which contains 8 POWER7 processors, two MASSIVE memory controllers, 32MB of L3 cache, and a HUGE I/O strip. It's in the documentation, IF YOU WOULD READ IT. Also in the documentation are smaller POWER7 packages for custom customers, including a half-socket, 1- to 4-core reduced-pin package... as stated in the documentation. It's a very adaptable processor; IBM will get a lot of mileage out of it and its derivatives.

POWER7, or processors based on its technology, don't have to be large, hot, or even powerful. It simply gets fantastic performance per watt. If you only give it a few watts, it outperforms processors in that power range because it gets better performance per watt.

Seriously, read the documentation.


#136380 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by 3Dude on 22 November 2012 - 02:16 PM

HAHAHAHA... you wanna make the topic intense, is it? :) There are dozens of people who have been there and claimed the Wii U pad worked but the Wii U "sample" was off... and some people moved the curtains behind the console away and it was a PC emulating the Wii U! It's not me who said that; I read it on the nintendo.co.uk forums, because the fight is very intense there... (months ago)


You've never seen a devkit before, have you?

Posted Image

OH MY GLOB!!! The Xbox, PS2, GameCube and Dreamcast were all REALLY PCs!!!!!!!


#136196 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by 3Dude on 22 November 2012 - 10:00 AM

I will explain some basics. These are rough estimations.
Inside any CPU or digital device are transistors, which make all the logic happen; generally, the more transistors you can fit into a CPU, the more performance it has, provided you do it logically. Transistors at any fabrication process are spread out such that you can only fit so many in a given area; this is called the transistor density of a CPU. The Wii U CPU uses a 45nm fabrication process, very similar to the way the CPU in the 360 Slim is made. The smaller the process, the more transistors you can fit into the same area. Current Intel CPUs are made using a 22nm process. Moore's Law has shown that performance increases almost linearly with the number of transistors you can put in.
The Wii U CPU is 32.76mm^2 as measured by AnandTech; the Xbox 360 CPU is about 88mm^2, now using the same process as the Wii U, so transistor density should be about the same. The 360's CPU is 2.7 times the size of the Wii U's, meaning there are probably more than 2x the transistors inside the 360's CPU than the Wii U's. This does not mean that the Wii U's CPU is only 50% the performance of the Xbox 360's, but it gives a good indication that the performance is significantly less. This estimate is probably off by as much as 50%, but even then it would put the CPU at less powerful than the 360's. So the performance of the Wii U's CPU when doing general processing tasks, AI, physics, and memory accessing will be slower. I would guess the Wii U's CPU is only about 80% of the performance of the 360's in gaming. This makes it very hard to port a game without a lot of work; the good thing is the CPU isn't the limiting factor inside the 360 most of the time, so it won't show up too much if the developers are porting well.
The GPU inside the Wii U is about 2x as fast as the 360's. This will allow the Wii U to do games at almost 2x the graphics detail. GPGPU can't be easily added to existing game engines; it is very different from coding for just a CPU and GPU, and it will take some time to get used to. The 360 also has GPGPU functions. This will not solve all the processing limitations in the long run.
Overall the balance isn't great; Nintendo's console will probably be able to play current-gen games, but just that. With the next generation of consoles, the Wii U will be like the Wii. I wouldn't be surprised if a Wii U emulator is made for the PC by the time the NextBox and PS4 come out.
These are all my estimates, from my experience as a person with a little CPU design experience. The overall result shouldn't be too far off, but it can still have a pretty large margin of error, as Nintendo is keeping all the details secret.


THANK YOU. I actually GAVE somebody the CPU size and transistor density information yesterday, hoping they had the knowledge to make a decent argument with it, and they COMPLETELY ignored it and continued circle-jerking baseless hearsay.
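The quoted die-size estimate is simple area arithmetic; here it is spelled out as a sketch (equal transistor density is assumed because both chips are on a 45nm process, which real designs only approximate):

```python
# Die-area ratio from the figures quoted above. Equal transistor
# density is an assumption (both chips on 45nm); actual density
# varies with design, so this is a rough estimate only.

wiiu_cpu_mm2 = 32.76   # AnandTech's measurement, per the quoted post
x360_cpu_mm2 = 88.0    # approximate 45nm Xenon size, per the post

ratio = x360_cpu_mm2 / wiiu_cpu_mm2
print(f"360 CPU is ~{ratio:.1f}x the die area of the Wii U CPU")
```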

Nintendo's not keeping TOO many secrets this time. And what little they are was given away a long time ago by partners like IBM.

So, we KNOW Nintendo's CPU is transistor-starved because of its size, right? Physics dictates you can only have so many objects of the same size in a given space. So HOW are devs who are actually MAKING Wii U games getting the CPU performance out of the system that they are? Seriously, look at The Wonderful 101, that is a LOT. Nano Assault is only using one core, and has thousands of characters and objects on screen simultaneously. HOW?

Clock speed, and even core count, is no longer the main impediment to performance (thanks to Amdahl, we can't really do much more no matter HOW many cores we add); RAM latency is.
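Amdahl's law, mentioned in passing above, is worth writing down: the serial fraction of a workload caps the speedup extra cores can buy. A minimal sketch:

```python
# Amdahl's law: speedup = 1 / (serial + parallel/cores).
# The serial fraction bounds what adding cores can achieve.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even 90%-parallel code gets only 2.5x from 3 cores,
# and can never exceed 10x no matter how many cores are added.
print(amdahl_speedup(0.9, 3))
print(amdahl_speedup(0.9, 1_000_000))
```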

The Wii U CPU has embedded RAM. But not just any embedded RAM; as shown in IBM's original press release, it uses IBM's newest embedded RAM technology. NO, it's not L3, that's something else. It's the L1.

http://www-03.ibm.co...lease/34683.wss

'The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example'

Anybody can make eDRAM. Only IBM can license this stuff. It's eDRAM that's bussed six ways per cell, meaning each cell performs like six-transistor SRAM (6T SRAM; remember Cube's blazing-fast 1T-SRAM back in the day? Yeah, 6T SRAM). It's the technology that made POWER7 possible. POWER7 only has HALF the transistors of an i7, yet it doesn't just compete with the i7, it beats it. This eDRAM is the reason why. It allows the transistors to effectively perform as over 2x their number. POWER7 only has 1.2 billion transistors (only, lol), but the unique eDRAM allows it the equivalent performance of a 2.7 billion transistor chip. THAT'S how Nintendo managed such a small CPU.

Read it from the official power 7 documentation I have attached that details ibm's (then) new unique edram technology.





#136169 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by 3Dude on 22 November 2012 - 09:27 AM

Whenever I hear this, I keep asking "Why doesn't anyone step forward and explain why this is the case?" They say the Wii U CPU is underpowered, but underpowered compared to what? I'm still waiting on a technical breakdown of the Wii U with an explanation of why they think the way they do. Show us where Nintendo fell short with the CPU in relation to what's needed in developing today's console games so we can have a meaningful discussion.
Also, look at the first thing this guy says: "I don't actually know what makes it slow."
At that point, he should shut up. If HE doesn't know, then all he is doing is gossiping. Correct?


They are saying it BECAUSE the Wii U CPU IS clocked slower than the 360/PS3 CPUs. Like that even matters, unless you are trying to run PS360 code. A short-pipelined architecture can run code twice as fast as a deep pipeline at half the clock speed. And Xenon and Cell were piped DEEP, and had some of the crappiest branch prediction I've ever seen.
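The pipeline argument can be made concrete with a toy cost model. This is a sketch under stated assumptions: the flush penalty, mispredict rate, and branch frequency below are illustrative numbers, not measured figures for Xenon, Cell, or the Wii U CPU:

```python
# Toy model: average cycles per instruction lost to branch
# mispredicts. All three inputs are illustrative assumptions,
# not measured figures for any of these CPUs.

def mispredict_cpi_penalty(flush_cycles: float,
                           mispredict_rate: float,
                           branch_frequency: float) -> float:
    """Extra cycles per instruction spent refilling the pipeline."""
    return flush_cycles * mispredict_rate * branch_frequency

deep = mispredict_cpi_penalty(20, 0.05, 0.2)   # deep pipeline
short = mispredict_cpi_penalty(7, 0.05, 0.2)   # short pipeline
print(deep, short)
```

Under these assumed inputs the deep pipeline loses roughly 3x more CPI to mispredicts, which is the intuition behind "a short pipeline at a lower clock can keep up."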

Well, this guy is just speculating on hearsay. But the hearsay he is speculating on is based on early looks at v1/v2 devkits, like the Metro dev is talking about, and not the v4/v5 devkits that saw 600% increases in performance according to NAMED DEVS who actually HAVE Wii U games being made.

Wwise has said their middleware improved 400-500% with optimizations made on the v5 devkit back in September, and they even released a CPU benchmark. But why should anybody pay attention to official Wii U performance benchmarks from an official middleware dev? I mean, you guys sure as hell don't. Rumours from 8 months and 4 devkit versions ago are far more reliable than Audiokinetic's middleware performance benchmarks, so you guys don't need to see the benchmarks, do you?


#134992 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by 3Dude on 20 November 2012 - 05:15 PM

lol, this guy is a true hater, always negative comments. I don't really know why you are still around here.


Orion is not a hater, he's just highly emotional and easily manipulated.

He's gone from being super excited and WAAAAAYYYY overshooting the capabilities of the Wii U, to down in the dumps, and then back up to overestimating the system, through like, several cycles now. Crazy rumours have made it a rough ride for him.


#134940 Are people over-praising the Wii U?

Posted by 3Dude on 20 November 2012 - 03:56 PM

Let's just wait until ALL the facts come out.


All the facts will never be out.


#134577 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by 3Dude on 20 November 2012 - 10:29 AM

http://www.notenough...he-wii-u-power/

More specifically, we've heard rumors about the CPU, that it's supposedly the weakest link of the system. Word has spread that it's some sort of Broadway (Wii CPU) but in a three-core configuration and improved. Others have argued that, based on its reduced size seen in recent pictures and the overall low consumption of the unit, it is not very powerful. Have you encountered any problems during your development because of this component, or is it efficient enough?

We didn't have such problems. The CPU and GPU are a good match. As said before, today's hardware has bottlenecks with memory throughput when you don't care about your coding style and data layout. This is true for any hardware and can't be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, RAM latency and RAM size to work against these pitfalls.


#134554 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by 3Dude on 20 November 2012 - 09:47 AM

Many developers have confirmed the Wii U has a slow CPU; they've had the final development kits since E3.

http://www.slashgear...loper-21248895/
http://www.videogame...n_producer.html
http://arstechnica.c...-with-slow-cpu/
http://www.gameplane...3-and-Xbox-360/


Ah, very nice, butchered articles with cherry-picked negative clips. Allow me to insert what was ACTUALLY said.
http://www.slashgear...loper-21248895/

"One of the weaknesses of the Wii U compared to PS3 and Xbox 360 is the CPU power is a little bit less," he said. "So for games in the Warriors series, including Dynasty Warriors and Warriors Orochi, when you have a lot of enemies coming at you at once, the performance tends to be affected because of the CPU.

The CPU is clocked lower. Playing the Nintendo multiplier game with the RAM's clock speed of 800MHz, I'm going to guess it's clocked around 2.4GHz (3x multiplier). That means fewer flops, which is the only thing the PS3 and 360 can do without crapping the bed, so all their games are designed around it. The Wii U's CPU is strong in general processing tasks, like CPUs are supposed to be. That means you can't simply port 360/PS3 games to the Wii U while relying on how the Wii U handles flops and SIMD loops; you have to redesign for the system, which no port team has the resources or knowledge to do. Oh no, I'm not done, this is just the pre-strike. Now I'm going to post the quote that was left out of NINETY PERCENT OF THE REPOSTED NEWS STORIES because it didn't paint the controversial image of Wii U power the site wanted for clicks.

"With the Wii U being new hardware, we're still getting used to developing for it, so there are still a lot of things we don't know yet to bring out the most of the processing power. There's a lot that still needs to be explored in that area."

Yeah. Not in the article you posted, is it? Wonder why.
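The clock guess above is just multiplier arithmetic against the 800MHz RAM clock. A trivial sketch (the 3x multiplier is the post's assumption, not a confirmed spec):

```python
# Guessing a CPU clock from the RAM clock and an integer multiplier.
# The 3x multiplier is purely the post's guess, not a confirmed spec.

ram_clock_mhz = 800

for multiplier in (2, 3, 4):
    cpu_ghz = ram_clock_mhz * multiplier / 1000
    print(f"{multiplier}x multiplier -> {cpu_ghz:.1f} GHz")
```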

http://www.videogame...n_producer.html

I can't believe you posted this. Harada went apehorse scat* all over Twitter, stating how his words were changed from the interview to give a slant to the article he never intended. He got the original interview taken down and an apology issued. We were laughing about this for weeks.


http://arstechnica.c...-with-slow-cpu/

Are you serious? This is a repost of the first story from a different site.

http://www.gameplane...3-and-Xbox-360/

Okay, seriously, dude, you've used the SAME EXACT STORY three times. All three of your articles conveniently removed the fact that they explained their problems were because they didn't know how to use the Wii U CPU yet.

So you have a team that admits they haven't figured the CPU out yet, and another where the interviewee raged about his words being changed. Great stuff.

Someone post the Shin'en CPU interview, I'm tired of doing it.



