
Shin'en Explains Wii U EDRAM Usage


118 replies to this topic

#41 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 22 July 2013 - 06:28 AM

12.8/30 or 60 would get you around the numbers you specified.

Kind of. Though it's not nearly as bad as it sounds. It's not like it only has 200 MB available per frame; that's just how much can be transferred from the 1 GB holding pool per frame. And it's more than the PS3/360 got, since their main memory bandwidth was eaten up by having to service the frame buffers and other bandwidth hogs out of that same main memory bandwidth. It's why screen tearing was so horrible on those systems.

However, the most frequently used textures will be kept in the leftover room of the eDRAM, with bandwidth in the triple digits, so even the heaviest bandwidth-hogging texture assets are removed from clogging that bus.

 

Correct me if I am wrong on this, but don't GPUs have built-in cache on every SPU?  I honestly don't know the specifics, but it looks like GPUs have texture cache memory, and from the die photos it looked as though the Wii U GPU has a lot more texture cache than the 360/PS3 GPUs did.  Basically this cuts down on texture fetches from main memory.  On a frame-to-frame basis only so many pixels change, so the cache is there for the pixels that are not changing in the next frame.  If anyone knows more about how this works, or if I am completely off base, please chime in.
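The per-frame numbers being quoted are easy to reproduce. A rough sketch of the arithmetic (assuming the commonly cited 12.8 GB/s peak figure for the Wii U's main DDR3 memory, which is an assumption here, not a confirmed spec):

```python
# Peak bytes movable from main RAM in one frame. The 12.8 GB/s figure
# is the commonly cited theoretical peak for the Wii U's DDR3; real
# sustained throughput would be lower.
BANDWIDTH_BYTES_PER_S = 12.8e9

def per_frame_mb(fps):
    """Per-frame main-memory transfer budget, in (decimal) MB."""
    return BANDWIDTH_BYTES_PER_S / fps / 1e6

for fps in (30, 60):
    print(f"{fps} fps: ~{per_frame_mb(fps):.0f} MB per frame")
# 30 fps: ~427 MB per frame
# 60 fps: ~213 MB per frame
```

Which is roughly where the "around 200 MB per frame" figure comes from at 60 fps.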



#42 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 22 July 2013 - 08:21 AM

Correct me if I am wrong on this, but don't GPUs have built-in cache on every SPU?  I honestly don't know the specifics, but it looks like GPUs have texture cache memory, and from the die photos it looked as though the Wii U GPU has a lot more texture cache than the 360/PS3 GPUs did.  Basically this cuts down on texture fetches from main memory.  On a frame-to-frame basis only so many pixels change, so the cache is there for the pixels that are not changing in the next frame.  If anyone knows more about how this works, or if I am completely off base, please chime in.


My technical knowledge on this is likely dated by several years, but I would imagine the fundamentals still apply.

Those caches exist, or should exist, primarily for accelerating texture filtering (and shaders and whatnot now). They are typically only as large as the filter kernel for the texture sampler. Those textures are in there... just in tons of tiny pieces, typically just a few texels (a texel is a texture's smallest element, a texture pixel) each. And then the memory is also needed for performing the arithmetic. So... I personally wouldn't really think those would be the place for storing texture data.

Maybe Wii U takes mega samples? That would definitely help, in a way, with a similar outcome to what you've proposed. I would imagine that extra memory would also be quite nice for sequential compute operations as well.
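As an aside, the "just a few texels" point can be illustrated with a toy bilinear filter: each filtered sample only ever touches a 2x2 block of texels, which is the kind of working set those tiny per-sampler caches are sized around. (Plain Python, purely illustrative; real hardware does this in fixed-function units.)

```python
# Toy bilinear texture sample: each filtered output needs only a 2x2
# block of texels, which is why per-sampler caches can be tiny.

def bilinear_sample(texture, u, v):
    """Sample a 2D list of floats at fractional texel coords (u, v)."""
    h, w = len(texture), len(texture[0])
    x0, y0 = int(u), int(v)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = u - x0, v - y0
    # The four texels touched by this sample -- the whole working set.
    t00, t10 = texture[y0][x0], texture[y0][x1]
    t01, t11 = texture[y1][x0], texture[y1][x1]
    top = t00 * (1 - fx) + t10 * fx
    bot = t01 * (1 - fx) + t11 * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(bilinear_sample(tex, 0.5, 0.5))  # midpoint of all four texels -> 0.5
```

Scale that 2x2 footprint up by the number of samplers and you get caches measured in kilobytes, not megabytes.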


 


#43 grahamf

grahamf

    The Happiness Fairy

  • Members
  • 2,532 posts

Posted 22 July 2013 - 09:59 AM

no because again your ignoring the facts ITS ALL ON MCM THE MAIN  RAM ISNT USED AS YOUR THINKING

 

its used a s a feeder not a MAIN RAM your rendering etc is ON CHIP as long as you have space to catch disc/drive data into thats all you need

 

devs have already stated wiiu ram is kinda unlimited in that regards YOUR STREAMING IN not main ram processing at all

 

Aside from the all caps, that is a good point.

 

The Xbone and PS4 have massive amounts of RAM because they have to load everything into memory ahead of time.

The Wii U can pull resources as needed, so they don't need to be stored in main memory all of the time.

 

This would mean minimal in-game loading times, which does usually seem to be the case (except Lego City Undercover; not sure why).



 


#44 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 22 July 2013 - 11:49 AM

Aside from the all caps, that is a good point.
 
The Xbone and PS4 have massive amounts of RAM because they have to load everything into memory ahead of time.
The Wii U can pull resources as needed, so they don't need to be stored in main memory all of the time.
 
This would mean minimal in-game loading times, which does usually seem to be the case (except Lego City Undercover; not sure why).


Xbone also has 32 MB of embedded RAM.


 


#45 grahamf

grahamf

    The Happiness Fairy

  • Members
  • 2,532 posts

Posted 22 July 2013 - 03:04 PM

Xbone also has 32 MB of embedded RAM.

Point being?



 


#46 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 22 July 2013 - 04:40 PM

no because again your ignoring the facts ITS ALL ON MCM THE MAIN  RAM ISNT USED AS YOUR THINKING

 

its used a s a feeder not a MAIN RAM your rendering etc is ON CHIP as long as you have space to catch disc/drive data into thats all you need

 

devs have already stated wiiu ram is kinda unlimited in that regards YOUR STREAMING IN not main ram processing at all



no because each game works withion its own standards any re formating of os is auto matic etc

 

so lets say nintendo says in a direct we have found 10% more processing power for devs to use and have unlocked 512mb of extrra ram its up to devs if they want to use it

 

it has no bearing on us playing those games they simply just work

 

If the Wii U only needed to stream content from the disc, then why does it have such horrible loading times for most games? It would render the main RAM pretty much useless.


Edited by Arkhandar, 22 July 2013 - 04:40 PM.

If you try to fail and succeed, which have you done?


#47 Cloud Windfoot Omega

Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 22 July 2013 - 05:05 PM

If the Wii U only needed to stream content from the disc, then why does it have such horrible loading times for most games? It would render the main RAM pretty much useless.

Because not many devs are using the system correctly.  When you take a streaming system and try to make it a loading system, and do not use the buffers, you get long loads.
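Schematically, the difference between the two approaches looks something like this (asset names and sizes are made up for illustration; this is not actual Wii U API code):

```python
# Contrast between loading everything up front and streaming on demand.
# Asset names and sizes below are invented for illustration only.

ASSETS = {"level_geometry": 300, "textures": 500, "audio": 150}  # MB

def load_everything(assets):
    """'Loading system': pull every asset into RAM before play starts."""
    return sum(assets.values())  # peak resident footprint in MB

def stream_on_demand(assets, buffer_mb=64):
    """'Streaming system': only a small staging buffer is ever resident;
    data is pulled from disc shortly before it is needed."""
    return buffer_mb  # peak resident footprint in MB

print(load_everything(ASSETS))   # 950 (MB resident at once)
print(stream_on_demand(ASSETS))  # 64  (MB resident at once)
```

A console with a modest main RAM pool but a fast on-chip staging area favors the second pattern; porting code written for the first pattern without restructuring is what produces the long load times described above.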



#48 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 22 July 2013 - 09:13 PM

Point being?


It's actually using a similar memory hierarchy setup to the Wii U, just with a lot more main RAM.

Wii U seems a little more complicated, with both eDRAM and two types of SRAM/PSRAM.

PS4 is kinda by itself on that end.


 


#49 GAMER1984

GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 22 July 2013 - 09:16 PM

It's actually using a similar memory hierarchy setup to the Wii U, just with a lot more main RAM.

Wii U seems a little more complicated, with both eDRAM and two types of SRAM/PSRAM.

PS4 is kinda by itself on that end.

I'm just so worried about the ports coming to Wii U this year. If they struggle to keep up, or don't run better than the 360 version, it will be a naysayer's wet dream. I enjoy all the things you are saying about the hardware, but if third-party developers can't or won't make the best of the advantages it has over the PS3/360, what's the point?



#50 Cloud Windfoot Omega

Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 22 July 2013 - 11:41 PM

I'm just so worried about the ports coming to Wii U this year. If they struggle to keep up, or don't run better than the 360 version, it will be a naysayer's wet dream. I enjoy all the things you are saying about the hardware, but if third-party developers can't or won't make the best of the advantages it has over the PS3/360, what's the point?

Most of those games run amazingly, seeing as they are not optimized for the system at all: most use 1 of 3 cores, don't use the DSP, and are loaded rather than streamed.

Though telling them that now is useless; they won't even try to learn.


Edited by Cloud Windfoot Omega, 22 July 2013 - 11:43 PM.


#51 GAMER1984

GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 23 July 2013 - 12:17 AM

Most of those games run amazingly, seeing as they are not optimized for the system at all: most use 1 of 3 cores, don't use the DSP, and are loaded rather than streamed.

Though telling them that now is useless; they won't even try to learn.

 

Exactly what I'm saying... I'm so nervous for these upcoming ports: Splinter Cell, AC4, COD Ghosts, Watch Dogs... Nintendo will have an even more negative cloud over its head if, after 9 months, developers still can't get games on Wii U running better than the same game on 7-year-old hardware.



#52 Cloud Windfoot Omega

Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 23 July 2013 - 01:33 AM

Exactly what I'm saying... I'm so nervous for these upcoming ports: Splinter Cell, AC4, COD Ghosts, Watch Dogs... Nintendo will have an even more negative cloud over its head if, after 9 months, developers still can't get games on Wii U running better than the same game on 7-year-old hardware.

I can see Ubi learning the hardware enough to make ports and main titles; they just do not think there is enough there to bet any exclusive game on. COD, on the other hand, lost most of their best programmers thanks to the fiasco that went on.



#53 GAMER1984

GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 23 July 2013 - 01:45 AM

I can see Ubi learning the hardware enough to make ports and main titles; they just do not think there is enough there to bet any exclusive game on. COD, on the other hand, lost most of their best programmers thanks to the fiasco that went on.

 

 

We will see, but with what I know the Wii U is capable of, and seeing that Bayonetta 2 demo from E3... it's just hard for me to go out and spend 60 dollars on a Wii U game that is barely up to 360 standards. Shin'en has gone on record multiple times now stating how to get the most out of the console, and if Ubi is having problems, they are big enough to call up Nintendo's techs and say: help us get the game running smooth on Wii U.



#54 BanjoKazooie

BanjoKazooie

    Witch Slayer

  • Members
  • 1,258 posts
  • Fandom:
    LOZ, Mario, Elder Scrolls

Posted 23 July 2013 - 05:02 AM

WIIU IS THE LOWEST LATENCY GAMING DEVICE EVER MADE even beating gamecube and wii
 
FACT AMD APU's are very high latency and ram bottle necking
 
IBM POWER MCM is very low latency and free of bottlenecks BIG HUGE PHAT DIFFERANCE IN DESIGN STANDARDS
 
wiiu has a custom from scratch video gaming game centric MCM /////ps4 and xbone have hacked APUs based on VERY SLOW NON GAMING SYSTEM ON CHIPS *fact*
 
 
look at ps4s hacked memory system its insulting
 
slowest latency ram they could have chose,,no edram or custom fast ram catches of any kind,,bottlenecked high latency bus system and a cpu bus that gpu data has to share and ether go thru cpu catch or thu a slow by-pass ITS A HACK JOB PURE AND SIMPLE
 
ITS CHEAP SONY DID IT """ON THE CHEAP"""
GFLOPS YOU SAY SONY HHHHHHHHMMMmmmmmmmm!!!!!!!!!!!!!
 
SIMPLE TRUTH ""YOU CANNOT PROCESS WHAT ISNT THERE """"
 
WIIU MCM CATCH LIKE LATENCY VS AMD SOC APU OUTRAGOUS LATENCY
 
WIIU DDR3 WITH CUSTOM PRE FETCH = LOW LATENCY VS PS4 GDDR5 WITCH IS EXTREAMLY HIGH LATENCY
 
wiiu edram and sram multipul catches and confirmed cpu access to gpu edram = CATCH LIKE LATENCY vs ps4s read and write out to main ram for EVERYTHING high latency
 
this is so simple sony are lying thru there teeth high latency apu high latency bus and high latency ram and tiny catches !!!!!!!!
 
wiiu low latency ram low latency catches low latency mcm and low latency scratch pad edram for both cpu and gpu
 
 
 
SUB 5NS LATENCY vs 200NS LATENCY =40 x faster for wiiu MCM SPEEDS
 
sub 20ns latency ddr3 with custom pre fetch wiiu vs 200ns gddr5 ps4
 
simple truth :ph34r:

Do you happen to know thehappening? Because he also made posts just like this and spelled cache "catch", just like you. He also would seemingly randomly turn on caps lock and never use punctuation. Hmmmm, more than suspicious to me.
Here is his page: http://thewiiu.com/u...1-thehappening/

I think you are thehappening!


I was once known here as KillerMario, but since I really like Banjo-Kazooie, I changed my display name to show them my respect :)


#55 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 23 July 2013 - 07:44 AM

Do you happen to know thehappening? Because he also made posts just like this and spelled cache "catch", just like you. He also would seemingly randomly turn on caps lock and never use punctuation. Hmmmm, more than suspicious to me.
Here is his page: http://thewiiu.com/u...1-thehappening/

I think you are thehappening!

Obviously. I missed the guy and his random CAPS xD




#56 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 23 July 2013 - 08:41 AM

It's actually using a similar memory hierarchy setup to the Wii U, just with a lot more main RAM.

Wii U seems a little more complicated, with both eDRAM and two types of SRAM/PSRAM.

PS4 is kinda by itself on that end.

 

X1 doesn't lean as heavily on the 32 MB of on-board SRAM though, although I do think developers would get the most bang for their buck by using it in a similar fashion to the eDRAM in the Wii U.  Do all the frame buffers in the SRAM, and now the X1 has a ton of bandwidth for reading from main memory.  As impressive as the GDDR5 bandwidth is, it's not going to outperform the X1 in real-world memory performance.  Not to mention that the latency of the DDR3 is far better than the GDDR5 in the PS4.

 

It's still not clear whether those extra banks of SRAM in the Wii U GPU are accessible to Wii U games or not.  So far the speculation has been that they are there for backwards compatibility only, and there is a high probability that this is the case, but if possible Nintendo really needs to open that up to Wii U games as well.  Every MB of on-board RAM could become extremely valuable as time goes on.
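For a rough sense of how far 32 MB of embedded memory goes for frame buffers, here is the arithmetic at a 720p target with common buffer formats (the formats are assumptions for illustration, not confirmed details of any particular game):

```python
# Approximate frame buffer footprint at 1280x720. RGBA8 color and
# 24/8 depth-stencil are typical 4-bytes-per-pixel formats, assumed
# here purely for illustration.
W, H = 1280, 720

def buffer_mb(bytes_per_pixel):
    """Size of one WxH buffer in MiB."""
    return W * H * bytes_per_pixel / 1024 ** 2

color = buffer_mb(4)        # one RGBA8 color buffer
depth = buffer_mb(4)        # 24-bit depth + 8-bit stencil
total = 2 * color + depth   # double-buffered color + one depth buffer
print(f"color: {color:.2f} MB each, total: {total:.2f} MB of 32 MB")
# color: 3.52 MB each, total: 10.55 MB of 32 MB
```

Around 10.5 MB for a double-buffered 720p setup, leaving the remainder of a 32 MB pool for intermediate render targets, which is the kind of budget being discussed here.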



#57 Cloud Windfoot Omega

Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 23 July 2013 - 10:08 AM

X1 doesn't lean as heavily on the 32 MB of on-board SRAM though, although I do think developers would get the most bang for their buck by using it in a similar fashion to the eDRAM in the Wii U.  Do all the frame buffers in the SRAM, and now the X1 has a ton of bandwidth for reading from main memory.  As impressive as the GDDR5 bandwidth is, it's not going to outperform the X1 in real-world memory performance.  Not to mention that the latency of the DDR3 is far better than the GDDR5 in the PS4.

 

It's still not clear whether those extra banks of SRAM in the Wii U GPU are accessible to Wii U games or not.  So far the speculation has been that they are there for backwards compatibility only, and there is a high probability that this is the case, but if possible Nintendo really needs to open that up to Wii U games as well.  Every MB of on-board RAM could become extremely valuable as time goes on.

Nintendo themselves have said every part of the GPU and CPU design was made to be usable in both BC and new titles.



#58 alan123

alan123

    Piranha Plant

  • Members
  • 889 posts

Posted 23 July 2013 - 01:24 PM

why don't Nintendo just come out & explain all this?



#59 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 24 July 2013 - 07:14 AM

The GPU still sucks though.




#60 Cloud Windfoot Omega

Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 24 July 2013 - 11:22 AM

The GPU still sucks though.

The GPU we don't know much about?





