Wii U GPGPU?


45 replies to this topic

#21 Mike1986

Mike1986

    Goomba

  • Members
  • 8 posts

Posted 23 November 2012 - 10:45 PM

People, first of all, we are talking about the MOBILE versions of these GPUs. TDP doesn't lie; the sticker can say ATI Radeon 6870 CrossFire, but that isn't what's inside.

It is a very low-end GPU, as it draws less than 20 watts (assuming the very slow CPU, RAM and other parts take a mere 10 watts).


Remember the physics of chip engineering: there is no way to offer the full potential of even a very low-end PC's CPU/GPU in the Wii U, as all the parts have to be very slow, mobile versions...

#22 Dragon

Dragon

    Pokey

  • Members
  • 1,070 posts
  • Fandom:
    Playing my Wii U!

Posted 24 November 2012 - 06:11 AM

I guess I'll post some of Shin'en's impressions of the GPU, since that's kinda what this thread is about.



Concerning the graphical aspect, are the features supported satisfactory for you? Is the system allowing a good amount of effects not present or not really used on current-gen consoles, such as tessellation? Iwata has promoted the GPGPU side of the chip; have you taken advantage of this?


For Nano Assault Neo we already used a few tricks that are not possible on the current console cycle.
Due to the modern GPU architecture you have plenty of effects you can use to make Wii U games look better than anything you have seen on consoles before.


Neo’s resolution is 720p. Why is it not 1080p? Besides, we’ve witnessed jaggies and a seeming lack of anti-aliasing in some other games’ footage; can you reassure us about the image quality of your title? With its more up-to-date GPU and other factors such as cache amount, the Wii U should be pretty capable in this area.



We can’t detail the Wii U graphics chip, but any modern GPU supports various anti-aliasing modes with the usual Pros and Cons. Many GPUs have a certain amount of AA even for ‘free’ when rendering. Usage of these modes depends on your rendering style (like forward or deferred) and other implementation details.

Nano Assault Neo is running in 720p, yes. We also had the game running in 1080p, but the difference was not distinguishable when playing. Therefore we used 720p and put the free GPU cycles into higher-resolution post-FX. This was much more visible. If we had a project with less quick motion we would have gone 1080p instead, I guess.

It’s not a problem to make beautiful 1080p games on the Wii U. As on any console or PC such titles need ~200% more fill rate than 720p. You can use this power either for 1080p rendering or for more particles, better post-Fx, better materials, etc.
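As a sanity check on that fill-rate claim, the raw pixel counts work out to 2.25× (i.e. 125% more pixels per frame; Shin'en's "~200% more" is a loose ballpark):

```python
# Sanity check of the 720p-vs-1080p rendering cost discussed above.
def pixels(width, height):
    """Total pixels the GPU must shade per frame at this resolution."""
    return width * height

p720 = pixels(1280, 720)     # 921,600 pixels
p1080 = pixels(1920, 1080)   # 2,073,600 pixels

ratio = p1080 / p720
print(ratio)              # 2.25 -> 1080p shades 2.25x the pixels of 720p
print((ratio - 1) * 100)  # 125.0 -> i.e. 125% more pixels per frame
```

Those "spare" 1.25 frames' worth of pixels are exactly the budget Shin'en describes spending on post-FX instead.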


All this information is very impressive! But as actors within a global industry, you may be aware of the capabilities of competing next-gen consoles as well as the technical evolutions seen in PC hardware or upcoming middleware like Unreal Engine 4. From what you’ve experienced on Wii U, do you think the system is future-proof? Does it have the required feature set of next-generation games (we’re talking about DirectX 11/OpenGL 4 level features), and will it run titles available on its rival platforms? Some players are afraid of reliving a situation similar to the Wii, which couldn’t handle engines and software released on Xbox 360 and PS3.


We can’t be too specific on the Wii U hardware, but you can’t compare an OpenGL/DirectX driver version to the actual Wii U GPU anyway. I can only assure you that the Wii U GPU feature set allows many cool things that are not possible on any current console. The Wii U has enough potential for the next years to create jaw-dropping visuals. Also remember the immense improvement we saw on the PS3 and Xbox 360 over the years. I’m really excited to see what developers will show on the Wii U in the years to come.

http://www.nanoassau...20p60_8mbit.mp4



A nice HQ video of their Wii U eShop game Nano Assault Neo.


Some good news about the Wii U's power? I WONDER WHY IGN NEVER REPORTED IT?



#23 Alex Wolfers

Alex Wolfers

    Thy Fur Consumed

  • Members
  • 2,768 posts
  • NNID:AxGamer
  • Fandom:
    Furry Fandom,gaming,trolling

Posted 24 November 2012 - 07:06 AM

I hope this nails the coffin on the CPU complaining.



#24 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 24 November 2012 - 08:02 AM

Please no Ad hominoms, just state the information you have that contradicts his.

Ad hominem*

People, first of all, we are talking about the MOBILE versions of these GPUs. TDP doesn't lie; the sticker can say ATI Radeon 6870 CrossFire, but that isn't what's inside.

It is a very low-end GPU, as it draws less than 20 watts (assuming the very slow CPU, RAM and other parts take a mere 10 watts).


Remember the physics of chip engineering: there is no way to offer the full potential of even a very low-end PC's CPU/GPU in the Wii U, as all the parts have to be very slow, mobile versions...

Where are you getting this info, oh almighty insider?

Now seriously, you do know that the Wii U, just like other recent Nintendo consoles, uses dynamic power management in order to maximize performance and save energy, right?

The Wii U could even have an HD 7990 in its guts while drawing only 32 watts, but to achieve that it would only be using a small portion of its horsepower.

Since the Wii U has a 75-watt peak limit, of which around 60 watts are usable, I highly doubt it has a mobile GPU inside.
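The wall-draw versus usable-power distinction here is simple arithmetic. Assuming a hypothetical ~80% PSU conversion efficiency (an illustrative figure, not a measured Wii U spec):

```python
# Illustrative only: the 80% efficiency figure is an assumption,
# not a confirmed Wii U specification.
psu_rating_watts = 75       # rated draw from the wall socket
assumed_efficiency = 0.80   # hypothetical AC-to-DC conversion efficiency

# Power actually available to the console's components.
usable_watts = psu_rating_watts * assumed_efficiency
print(usable_watts)         # 60.0
```

That ~60 W figure is the same ballpark quoted later in the thread, and it sits well above the single-digit-watt envelope of phone/tablet-class mobile parts.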

Edited by Arkhandar, 24 November 2012 - 10:05 AM.

If you try to fail and succeed, which have you done?


#25 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 24 November 2012 - 08:20 AM

Ad hominem*
Where are you getting this info, oh almighty insider?
Now seriously, you do know that the Wii U, just like other recent Nintendo consoles, uses dynamic power management in order to maximize performance and save energy, right?
It could even have an HD 7990 inside while drawing only 32 watts, but to do that it would only use a small portion of its horsepower.
Since the Wii U has a 75-watt peak limit, I highly doubt it has a mobile GPU inside.


Ha, thanks!

Oh, one correction from me as well, although it has no impact on the solidity of your argument.

The Wii U PSU is 75 watts, but that's drawn from the wall. The system doesn't get that much; some of it is lost in conversion, so we are probably looking at around 60 watts for the system.

Although that's still FAAAAAAAR from the sub-watt to few-watt range of mobile processors.


 


#26 blu gamepad

blu gamepad

    Red Koopa Troopa

  • Members
  • 65 posts

Posted 24 November 2012 - 02:34 PM

GPGPU: Grand Prix GPU

It will run laps around other GPUs

*runs*

#27 Sobari

Sobari

    Spiked Goomba

  • Members
  • 11 posts

Posted 24 November 2012 - 07:19 PM

I'd love to see what developers could do with the Havok physics engine by having it run on the GPU, since the engine has been demoed in the past using OpenCL.
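The reason physics is such a natural GPGPU workload is that each body's update is independent of the others. A toy sketch (hypothetical code, not Havok's API) of that data-parallel shape:

```python
# Toy sketch (not Havok's actual API): particle/rigid-body integration
# is data-parallel -- every body's update is independent, which is
# exactly the workload shape GPGPU APIs like OpenCL are built for.
GRAVITY = -9.81  # m/s^2

def integrate(bodies, dt):
    """One explicit-Euler step. On a GPU, each loop iteration would be
    one OpenCL work-item running the same kernel in parallel, instead
    of a sequential Python loop."""
    for b in bodies:
        b["vy"] += GRAVITY * dt   # same instruction applied to every body
        b["x"] += b["vx"] * dt
        b["y"] += b["vy"] * dt

# 1,000 independent bodies stepped at 60 Hz.
bodies = [{"x": 0.0, "y": 10.0, "vx": 1.0, "vy": 0.0} for _ in range(1000)]
integrate(bodies, dt=1.0 / 60.0)
```

Because no body reads another body's result within a step, the GPU can run thousands of these updates simultaneously; broad-phase collision detection is harder to parallelize, which is one plausible reason an engine might keep parts of the pipeline on the CPU.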

#28 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 24 November 2012 - 07:48 PM

I'd love to see what developers could do with the Havok physics engine by having it run on the GPU, since the engine has been demoed in the past using OpenCL.


The WEIRD thing is, Havok, specifically Dave Gargan of Havok, said it would be running on the Wii U CPU.

"The platform has its own unique features, and has its own challenges as well. When we come across any new particular platform, we optimize specifically for some of the advantages that those platforms offer over other platforms, and Wii U has specific advantages that no other platform has, and we optimize directly for those, right down at the level of accessing the hardware.

"I think we'll see things done on the Wii U that we won't see on other platforms… I think people will be genuinely excited with the range of titles they're going to see come out."

The demonstration as presented at GDC can be watched in the following video. The demo shown used *CPU-processed physics* (as opposed to GPU), which, Gargan said, *would be the case when the engine runs on Wii U.* More Havok videos can be found on their website.

What's going on, Dave?
http://www.nintendow...interview/29477


 


#29 Sobari

Sobari

    Spiked Goomba

  • Members
  • 11 posts

Posted 24 November 2012 - 08:01 PM

Odd... it's possible that GPU-accelerated physics wouldn't leave enough headroom for the GPU to also handle Xbox 360/PS3-level graphics, but who knows. While the Wii U's GPU is undeniably more powerful, it's quite possible it can't absorb such a massive extra task while maintaining that lead.

#30 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 24 November 2012 - 08:49 PM

you suggested a 4770 which does not make any sense whatsoever


That was one of the original rumors.
Whovian12 -- Nintendo Network ID.

#31 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 25 November 2012 - 07:15 AM

The WEIRD thing is, Havok, specifically Dave Gargan of Havok, said it would be running on the Wii U CPU.

"The platform has its own unique features, and has its own challenges as well. When we come across any new particular platform, we optimize specifically for some of the advantages that those platforms offer over other platforms, and Wii U has specific advantages that no other platform has, and we optimize directly for those, right down at the level of accessing the hardware.

"I think we'll see things done on the Wii U that we won't see on other platforms… I think people will be genuinely excited with the range of titles they're going to see come out."

The demonstration as presented at GDC can be watched in the following video. The demo shown used *CPU-processed physics* (as opposed to GPU), which, Gargan said, *would be the case when the engine runs on Wii U.* More Havok videos can be found on their website.

What's going on, Dave?
http://www.nintendow...interview/29477

Something is seriously wrong here. First we see these amazing CPU-only tech demos, and praise for their superior quality compared with last-generation consoles, and now we're hearing that the Wii U has supposedly got a "horrible CPU". These statements simply don't match.

I'm not trying to start another major debate here, but yet again, it's starting to look like we're seeing a POWER7-based Wii U CPU situation. Everything is pointing at it.

#32 Foot

Foot

    The most badass sociopath to ever exist.

  • Members
  • 1,038 posts
  • NNID:DPapcinEVO
  • Fandom:
    Sock Wars, Shoehorn Leghorn

Posted 25 November 2012 - 07:22 AM

Well, about the GPGPU: I've personally asked AMD before via email, and they confirmed to me that it was a customized E6760.
I am the foot. I do not like you. You smother me with socks and shoes, then step on me thousands of times a day.

We foot will rebel one day.

#33 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 25 November 2012 - 09:45 AM

Well, about the GPGPU: I've personally asked AMD before via email, and they confirmed to me that it was a customized E6760.

Do you happen to know its die size, so that we can compare it with the Wii U GPU's?

#34 Sobari

Sobari

    Spiked Goomba

  • Members
  • 11 posts

Posted 25 November 2012 - 09:51 AM

Well, about the GPGPU: I've personally asked AMD before via email, and they confirmed to me that it was a customized E6760.


I think an E6760 would be way too new and expensive for Nintendo to consider using. Plus, most rumors have pointed to it being based on the R700 series, which is the Radeon HD 4000 series.

#35 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 25 November 2012 - 11:04 PM

you aren't very smart are you?


I reply again to this comment. After a small search, I found out why I typed 4770: because everybody in tech forums claims it is a 4770. The die size fits.

And it's a much better GPU than a 5670 or E6670, meaning that if it truly is a Radeon 4770 chipset inside, it may be 3-4 times better than the 360, if they optimize the engines for GPUs instead of CPUs.

BUT HEY... I believe it may be a good thing that Nintendo did that. Time will tell. If we consider that the new engines talk about TFLOPS for the GPUs, and not CPUs, then maybe the new engines need GPUs instead of CPUs.
And we may be surprised in the near future... who knows.

As for the size of the GPU, search for the AnandTech teardown.

Edited by Orion, 25 November 2012 - 11:10 PM.


#36 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 26 November 2012 - 06:29 AM

I reply again to this comment. After a small search, I found out why I typed 4770: because everybody in tech forums claims it is a 4770. The die size fits.

And it's a much better GPU than a 5670 or E6670, meaning that if it truly is a Radeon 4770 chipset inside, it may be 3-4 times better than the 360, if they optimize the engines for GPUs instead of CPUs.

BUT HEY... I believe it may be a good thing that Nintendo did that. Time will tell. If we consider that the new engines talk about TFLOPS for the GPUs, and not CPUs, then maybe the new engines need GPUs instead of CPUs. And we may be surprised in the near future... who knows.

As for the size of the GPU, search for the AnandTech teardown.

!

#37 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 26 November 2012 - 07:38 AM

Something is seriously wrong here. First we see these amazing CPU-only tech demos, and praise for their superior quality compared with last-generation consoles, and now we're hearing that the Wii U has supposedly got a "horrible CPU". These statements simply don't match.

I'm not trying to start another major debate here, but yet again, it's starting to look like we're seeing a POWER7-based Wii U CPU situation. Everything is pointing at it.


Edited by Orion, 26 November 2012 - 08:02 AM.


#38 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 26 November 2012 - 08:06 AM

Let's not forget, they didn't just slap a desktop or even mobile GPU into the Wii U and call it a day.

The Wii U uses a unified memory architecture similar to the Xbox 360's, whereas standard GPUs have dedicated RAM. It's a very different architecture, and it may actually help to get more performance out of the GPU than a comparable GPU in a PC.

Just look at the power consumption of an Intel mobile CPU with a GPU built in: it's only 35W TDP. Then consider that the Wii U CPU is unlikely to be anywhere near that powerful (it doesn't need to be), so you end up with a lot of leeway to get a good combination of hardware into the Wii U's power budget.

You might argue that the Intel GPU is poor for gaming, but that is because the primary focus is CPU performance, not GPU. It makes no sense to design a laptop CPU purely around gaming, because very few people buy a laptop with gaming as their primary purpose. The problem then becomes twofold, as there is no good reason for game developers to optimise for that architecture either: any serious gamer is using a dedicated GPU. With a games console it's the opposite, so you can get extremely good gaming performance from a unified architecture, as the Xbox 360 has shown.

As for the CPU, again, all we know is that it's clocked slower than the Xbox 360's. All that means is that it's slower than 3.2GHz, which means exactly nothing: if it's a more efficient CPU architecture it could still easily be faster in actual use, but it might need more effort from developers to use that power. THAT is why a few developers are moaning about "clocked slower". Not necessarily because the Wii U CPU is slow, but because they can't just do a shoddy port and have it perform the same as on Xbox 360.
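The clock-versus-efficiency point reduces to throughput ≈ clock × instructions-per-cycle (IPC). With made-up numbers, purely illustrative and not real Wii U or Xbox 360 figures:

```python
# Hypothetical numbers, only to illustrate why clock speed alone
# says nothing about real-world CPU performance.
def throughput(clock_ghz, ipc):
    """Rough sustained instructions-per-second estimate: clock * IPC."""
    return clock_ghz * 1e9 * ipc

# Fast clock but low IPC (typical of a long-pipeline in-order design).
chip_a = throughput(3.2, 0.5)   # 1.6e9 instructions/s

# Much slower clock but higher IPC (short-pipeline, out-of-order design).
chip_b = throughput(1.2, 2.0)   # 2.4e9 instructions/s

print(chip_b > chip_a)          # True: the "slower-clocked" chip wins here
```

The catch, as the post says, is that extracting high IPC usually requires code tuned for the architecture, which a quick port won't have.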

The same thing applied to the PS3, and they moaned about that too. The only reason they are picking on the Wii U now is that, as with the PS3, they have alternatives with an established market, so it's a gamble for them to support the Wii U when it will cost them more money to do so.

Yes, the odds are that the Wii U will be considerably less powerful than the new Xbox and PlayStation, but I still don't think it's all doom and gloom. If the architectures are more similar, the cost of porting from the new consoles may actually be lower than it is now to port from Xbox 360/PS3.

We have been through all this before. Many first-generation Xbox 360 games looked like Xbox games, so should we start panicking because first-generation Wii U games look like Xbox 360 games? I don't think so.

Edited by Alex Atkin UK, 26 November 2012 - 08:19 AM.

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#39 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 26 November 2012 - 09:45 AM

Let's not forget, they didn't just slap a desktop or even mobile GPU into the Wii U and call it a day.
The Wii U uses a unified memory architecture similar to the Xbox 360's, whereas standard GPUs have dedicated RAM. It's a very different architecture, and it may actually help to get more performance out of the GPU than a comparable GPU in a PC.
Just look at the power consumption of an Intel mobile CPU with a GPU built in: it's only 35W TDP. Then consider that the Wii U CPU is unlikely to be anywhere near that powerful (it doesn't need to be), so you end up with a lot of leeway to get a good combination of hardware into the Wii U's power budget.
You might argue that the Intel GPU is poor for gaming, but that is because the primary focus is CPU performance, not GPU. It makes no sense to design a laptop CPU purely around gaming, because very few people buy a laptop with gaming as their primary purpose. The problem then becomes twofold, as there is no good reason for game developers to optimise for that architecture either: any serious gamer is using a dedicated GPU. With a games console it's the opposite, so you can get extremely good gaming performance from a unified architecture, as the Xbox 360 has shown.
As for the CPU, again, all we know is that it's clocked slower than the Xbox 360's. All that means is that it's slower than 3.2GHz, which means exactly nothing: if it's a more efficient CPU architecture it could still easily be faster in actual use, but it might need more effort from developers to use that power. THAT is why a few developers are moaning about "clocked slower". Not necessarily because the Wii U CPU is slow, but because they can't just do a shoddy port and have it perform the same as on Xbox 360.
The same thing applied to the PS3, and they moaned about that too. The only reason they are picking on the Wii U now is that, as with the PS3, they have alternatives with an established market, so it's a gamble for them to support the Wii U when it will cost them more money to do so.
Yes, the odds are that the Wii U will be considerably less powerful than the new Xbox and PlayStation, but I still don't think it's all doom and gloom. If the architectures are more similar, the cost of porting from the new consoles may actually be lower than it is now to port from Xbox 360/PS3.
We have been through all this before. Many first-generation Xbox 360 games looked like Xbox games, so should we start panicking because first-generation Wii U games look like Xbox 360 games? I don't think so.


The Wii U does not have a unified memory architecture.

It has a good old-fashioned orthodox memory hierarchy. (UMA tries to economize on two separate pools by sharing one pool, while the newer dynamic video memory technology economizes all the way by using dynamically regulated high-bandwidth, low-latency GDDR5 main memory instead of dedicated video/CPU caches.) The CPU and GPU do not share a memory pool; that 32MB on the GPU is all GPU. The CPU has a tiny (less than 1MB) L1 with psychotic bandwidth and low latency, and another 'tiny' L2 (considerably larger, but also under 1MB) with nearly as high bandwidth and as low latency, to buffer between the CPU and the low-speed, high-capacity main RAM.
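To put that 32MB of on-die GPU memory in perspective, a rough framebuffer calculation shows why it matters: whole render targets can live on-die. (The bytes-per-pixel values below are common format assumptions, not confirmed Wii U details.)

```python
# Illustrative render-target sizing; 4 bytes/pixel color plus
# 4 bytes/pixel depth are typical formats, assumed here rather
# than confirmed for the Wii U.
def framebuffer_mb(width, height, bytes_per_pixel=8):
    """Size in MiB of one color+depth render target at this resolution."""
    return width * height * bytes_per_pixel / (1024 * 1024)

print(framebuffer_mb(1280, 720))   # 7.03125 -> 720p fits easily in 32 MB
print(framebuffer_mb(1920, 1080))  # 15.8203125 -> even 1080p fits
```

Keeping the render targets in fast embedded memory avoids burning main-RAM bandwidth on every pixel write, which is one plausible reason the main memory can afford to be "low speed, high capacity."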


 


#40 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 26 November 2012 - 01:40 PM


I'm terribly sorry. I didn't know you had such a horrible disease.



