
Wii U's RAM is slower than PS3/Xbox 360.


270 replies to this topic

#241 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 09 April 2013 - 09:09 AM

I don't know, and I don't know that we will ever know.  Remember back in the day when IGN would get these kinds of questions answered, even if it was from an anonymous source?  That doesn't happen anymore.  How can we have so many people working on the Wii U every day and no one willing to divulge this information?  Personally, I wish it were a consumer's right to know the specifications of a product.

Technically it is, but for integrated systems like consoles, routers/switches, firewalls, etc., the company has a right to protect its proprietary methods and business practices.  



#242 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 10 April 2013 - 12:27 PM

I thought of something about why the address bus to the RAM might be a little funky: could the CPU have dedicated lanes to the memory? 



#243 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 10 April 2013 - 12:40 PM

I thought of something about why the address bus to the RAM might be a little funky: could the CPU have dedicated lanes to the memory? 

Unlikely; the DDR3 I/O pins on the GPU would be far fewer if that were the case. If there are two memory controllers accessing the same RAM at the same time... hold up, would interleaving be able to give full bus width to both the GPU and the CPU at the same time?  I need to research this; I've never heard of a setup like that before.
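The interleaving question can be sketched as a toy model: with two-way interleaving, consecutive cache-line-sized blocks alternate between channels, so two clients touching different blocks can often be served in parallel. The block size and channel count here are illustrative assumptions, not known Wii U parameters:

```python
# Toy model of address interleaving across memory channels.
# Parameters are illustrative only, not actual Wii U values.
BLOCK_SIZE = 64      # bytes per interleave block (one cache line)
NUM_CHANNELS = 2     # two controllers sharing the same DRAM pool

def channel_for(address: int) -> int:
    """Return which channel services a given physical address."""
    return (address // BLOCK_SIZE) % NUM_CHANNELS

# Consecutive blocks alternate channels, so a CPU stream and a GPU
# stream touching different blocks can proceed concurrently.
print([channel_for(a) for a in (0, 64, 128, 192)])  # [0, 1, 0, 1]
```

Real interleaving happens inside the memory controller, of course; this only shows the address-to-channel mapping that makes concurrent access possible.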



#244 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 10 April 2013 - 01:14 PM

I assume the CPU goes through the GPU via the GP I/Os on the CPU side of the GPU to access main RAM, as I can't seem to find any manner of separate controller/bridge. Could be hidden from view, though.

Has anyone identified a dedicated northbridge/memory controller of any kind?


 


#245 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 10 April 2013 - 01:48 PM

It would rather crap on the whole point of unified memory if there were any difference between the speed at which the GPU and the CPU can access it.  The whole point is that both can use however much of the memory, in size and bandwidth, as they need. 

 

This has been one of the Xbox 360's strengths over the PS3: you get to use memory however is best for the game, be that heavy on the GPU side or heavy on the CPU side.  This is even more relevant with GPGPU, where the same activity in one game might use the CPU and in another might be offloaded to the GPU.  I believe that was also a problem with the PS3: the SPEs had very poor access to system RAM, so it was really hard to get them to actually be useful.


Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#246 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 10 April 2013 - 05:07 PM

The setup may be completely latency driven, then.  Everyone always thinks bandwidth is king, but latency affects effective bandwidth.  If you're losing lots and lots of cycles every time the CPU calls to main memory due to higher latency, then you're effectively losing performance.  Shin'en was pretty specific about Nintendo coming up with clever ways to reduce memory latency; maybe their controller setup has something to do with that.  Not everything requested is a large chunk of data; there are tons of small data requests as well, so by keeping latency to a minimum, you're essentially improving your bandwidth at the same time.  Like I said before, the Z-buffer and frame buffer bandwidth hogs aren't going to be in main memory, so if main memory really is "read only", then latency becomes a much more important aspect of system performance.  Just based on the increased RAM capacity, the Wii U should be able to use mip maps more effectively than the 360/PS3.  Those consoles were so tight on memory capacity that storing the mip maps may have been out of the question. 
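The latency point can be put in numbers with a simple model: each serialized request pays a fixed latency plus a transfer time at peak rate. The figures below (12.8 GB/s peak, 100 ns latency, the request sizes) are illustrative assumptions:

```python
# Effective bandwidth of a serialized request stream:
# each request pays a fixed latency plus transfer time at peak rate.
PEAK_BW = 12.8e9             # bytes/s, assumed DDR3 peak

def effective_bw(request_bytes: float, latency_s: float) -> float:
    transfer = request_bytes / PEAK_BW
    return request_bytes / (latency_s + transfer)

# A 64-byte request with 100 ns latency realizes a small fraction of
# peak, while a 64 KiB streaming request amortizes the same latency.
print(f"{effective_bw(64, 100e-9) / 1e9:.2f} GB/s for small requests")
print(f"{effective_bw(64 * 1024, 100e-9) / 1e9:.2f} GB/s streaming")
```

Under these assumptions the small-request stream delivers well under 1 GB/s while the streaming case approaches peak, which is exactly why shaving latency "essentially improves your bandwidth" for scattered accesses.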



#247 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 10 April 2013 - 05:54 PM

I assume the CPU goes through the GPU via the GP I/Os on the CPU side of the GPU to access main RAM, as I can't seem to find any manner of separate controller/bridge. Could be hidden from view, though.

Has anyone identified a dedicated northbridge/memory controller of any kind?

Maybe on the GP I/Os, but I don't see it.  They both have to talk to the NB as well, which is on-die.



#248 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 11 April 2013 - 07:33 AM

The very back end of the pipeline, raster operations (often called the ROP), is responsible for reading and writing depth and stencil, doing the depth and stencil comparisons, reading and writing color, and doing alpha blending and testing. As you can see, much of the ROP workload taxes the available frame-buffer bandwidth.

 

The final stage in the pipeline, ROP, interfaces directly with the frame-buffer memory and is the single largest consumer of frame-buffer bandwidth. For this reason, if bandwidth is an issue in your application, it can often be traced to the ROP.

 

http://http.develope...ugems_ch28.html

 

This is a pretty good read.  It's hard to fully wrap your head around Nintendo's memory concept until you can find out what exactly taxes the memory the most.  The more I read, the more I am convinced that even though the eDRAM is a small memory pool, it effectively eliminates the biggest memory bandwidth offenders.  This is the type of information that isn't typically published in articles.  Yes, we all know memory bandwidth is crucial and can cripple performance, but what specifically causes the majority of the load?  It could literally be a small piece of data, but if the GPU has to access it 500 times per frame, it could be a huge bandwidth hog.  That was a generic example, but you get the point. 
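For a rough sense of scale, the ROP traffic in the quoted passage can be estimated from resolution, frame rate, bytes per pixel, overdraw, and read+write accesses. The overdraw figure below is an assumption, not a measured value:

```python
# Back-of-envelope frame-buffer bandwidth for the ROP stage.
# OVERDRAW is an assumed average; the rest is a standard 720p60 target.
WIDTH, HEIGHT = 1280, 720
FPS = 60
BYTES_PER_PIXEL = 4 + 4      # 32-bit color + 32-bit depth/stencil
OVERDRAW = 3                 # assumed shaded samples per final pixel
ACCESSES = 2                 # one read + one write per sample

bw = WIDTH * HEIGHT * FPS * BYTES_PER_PIXEL * OVERDRAW * ACCESSES
print(f"ROP traffic: {bw / 1e9:.1f} GB/s")  # ~2.7 GB/s at these settings
```

Even this conservative estimate is a meaningful slice of a 12.8 GB/s main-memory budget, and it grows quickly with blending and MSAA, which is why absorbing it into eDRAM matters.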



#249 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 11 April 2013 - 09:48 AM


Goodtwin, on 11 Apr 2013 - 01:47, said:

http://http.develope...ugems_ch28.html

This is a pretty good read.  It's hard to fully wrap your head around Nintendo's memory concept until you can find out what exactly taxes the memory the most.  The more I read, the more I am convinced that even though the eDRAM is a small memory pool, it effectively eliminates the biggest memory bandwidth offenders.  This is the type of information that isn't typically published in articles.  Yes, we all know memory bandwidth is crucial and can cripple performance, but what specifically causes the majority of the load?  It could literally be a small piece of data, but if the GPU has to access it 500 times per frame, it could be a huge bandwidth hog.  That was a generic example, but you get the point. 




I'm afraid I can't quite identify how many ROPs are in the GPU yet, so I don't have an idea exactly how much the process to final rasterization costs in overhead.

I'm mostly concerned with asset streaming, and whether or not the only limiting factor on the Wii U being able to have bigger worlds, more assets, larger textures, more layers, etc. would be the bandwidth from main RAM. That's definitely indirectly affected by the things you've brought up, as per your points.

Our apparent goal would be positive identification of the larger eDRAM capacity and bandwidth freeing up another 12 or more GB/s by removing overhead from main memory bandwidth... and then finding out whether or not the Wii U could do a worthy amount more if it just had more bandwidth.



Goal correction: the overhead freed up would have to enable asset streaming capabilities greater than the bandwidth the 360 had for streaming game assets.


 


#250 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 11 April 2013 - 10:14 AM

That's a good question, and it's hard to pinpoint what percentage of bandwidth was used for assets from main memory versus everything else.  A capacity increase even without more bandwidth isn't to be dismissed, though.  For example, with more capacity you can store mip map textures for just about everything.  This would allow the use of higher-res up-close textures, with smaller and smaller textures used as objects get farther away.  This lets the gamer experience very high detail up close while drastically reducing texture size farther away.  So extra RAM is definitely good even if it doesn't bring more bandwidth to the equation.  I still think the bandwidth for assets is good.  Remember when Ubisoft said they forgot to compress the textures and everything still ran fine?  If bandwidth to the RAM were marginal, that should have put performance in the cellar. 
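The capacity cost of those mip maps is small and easy to verify: each level is a quarter the size of the previous one, so a full chain adds only about a third on top of the base texture (a geometric series summing to 4/3). A quick check:

```python
# Total bytes for a square RGBA texture with a full mip chain.
def mip_chain_bytes(size: int, bytes_per_texel: int = 4) -> int:
    total = 0
    while size >= 1:                       # levels down to 1x1
        total += size * size * bytes_per_texel
        size //= 2
    return total

base = 1024 * 1024 * 4                     # level 0 alone: 4 MiB
full = mip_chain_bytes(1024)               # every level included
print(f"chain/base ratio: {full / base:.4f}")  # ~1.3333
```

So the barrier on the 360/PS3 was total texture budget, not the mip chains themselves; with more RAM, the extra third per texture stops being a hard trade-off.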



#251 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 11 April 2013 - 10:30 AM

The capacity increase only helps with LOD if we have the bandwidth to send the assets through.

X shows that this is not an issue though.


Very good point with Rayman, though. Very good. So what I want to find is the 'effective' bandwidth the Wii U has over the 360. It will be a while.

Also, I've sifted through Hynix's wiring diagram for the RAM.

There are 96 pins per module that need to be hooked up. 16 are I/O pins; these would run from the chip to the MCM pin to pin. They must be insulated under different layers of the substrate. These 16 pins transfer 2 bits per clock.

So now, why are there so many extra pins on the DDR3 interface?
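Those pin figures lead straight to the commonly cited 12.8 GB/s number: four modules with 16 data pins each form a 64-bit bus, transferring twice per I/O clock. The DDR3-1600 clock below is an assumption consistent with that figure; and on a standard DDR3 package, most of the remaining pins per module are address, command, control, clock, and power/ground rather than data:

```python
# Peak DDR3 bandwidth from the pin counts discussed above.
MODULES = 4                  # four DRAM chips on the board
DQ_PINS_PER_MODULE = 16      # 16 data pins each -> 64-bit bus total
TRANSFERS_PER_CLOCK = 2      # double data rate
IO_CLOCK_HZ = 800e6          # assumed DDR3-1600 I/O clock

bus_bytes = MODULES * DQ_PINS_PER_MODULE // 8
bw = bus_bytes * TRANSFERS_PER_CLOCK * IO_CLOCK_HZ
print(f"peak: {bw / 1e9:.1f} GB/s")  # 12.8 GB/s
```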


 


#252 NintendoReport

NintendoReport

    NintendoChitChat

  • Moderators
  • 5,907 posts
  • NNID:eddyray
  • Fandom:
    Nintendo Directs and Video Presentations

Posted 11 April 2013 - 10:42 AM

The capacity increase only helps with LOD if we have the bandwidth to send the assets through.

X shows that this is not an issue though.

Very good point with Rayman, though. Very good. So what I want to find is the 'effective' bandwidth the Wii U has over the 360. It will be a while.

Also, I've sifted through Hynix's wiring diagram for the RAM.

There are 96 pins per module that need to be hooked up. 16 are I/O pins; these would run from the chip to the MCM pin to pin. They must be insulated under different layers of the substrate. These 16 pins transfer 2 bits per clock.

So now, why are there so many extra pins on the DDR3 interface?

 

Are we back to the same question from several pages ago?  I am enjoying it, though, and appreciate the comments. 

 



Keep Smiling, It Makes People Wonder What You Are Up To!
PA Magician | Busiest PA Magician | Magician Reviewed | Certified Magic Professionals


#253 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 11 April 2013 - 10:45 AM

That's a good question, and it's hard to pinpoint what percentage of bandwidth was used for assets from main memory versus everything else.  A capacity increase even without more bandwidth isn't to be dismissed, though.  For example, with more capacity you can store mip map textures for just about everything.  This would allow the use of higher-res up-close textures, with smaller and smaller textures used as objects get farther away.  This lets the gamer experience very high detail up close while drastically reducing texture size farther away.  So extra RAM is definitely good even if it doesn't bring more bandwidth to the equation.  I still think the bandwidth for assets is good.  Remember when Ubisoft said they forgot to compress the textures and everything still ran fine?  If bandwidth to the RAM were marginal, that should have put performance in the cellar. 

You still have to rely on main RAM bandwidth to stream those textures, especially if you are swapping them out constantly for LOD. There has to be something, even if it isn't increasing actual bandwidth per pin, that increases the effective bandwidth from the software perspective.

 

You're right: with only 12.8GB/s, running uncompressed textures would have absolutely murdered performance.  Everything points to a higher bandwidth, or at least a higher effective bandwidth, than what we know about the chips themselves would suggest.


Edited by routerbad, 11 April 2013 - 10:48 AM.


#254 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 11 April 2013 - 12:05 PM

The 360/PS3 have 22GB/s of bandwidth, but if 70% of that is used for things that aren't textures and assets, then all of a sudden we are back in the ballpark.  Let's be realistic, though: the Wii U isn't miles above the 360 and PS3, so expect the effective bandwidth to be more or less in line with the rest of the components.  That's why I wanted to find a graphics card that was limited to 12.8GB/s of bandwidth and see how it performed, and the AMD HD5450 meets the criteria.  Keep in mind that card obviously has zero eDRAM, so it's 100% reliant on the 12.8GB/s of memory bandwidth for both reads and writes.  You will find that you can turn settings up pretty high with this card as long as you sacrifice resolution.  However, even with the other settings turned down, if you bump up the resolution the card's performance starts to plummet.  Basically, even with only 12.8GB/s the card can run modern games on high settings as long as they sacrifice resolution.  Obviously this isn't all dependent on just the memory, but if high-quality textures and assets required more than 12.8GB/s of bandwidth, it would show up even at 600p.
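That resolution sensitivity falls out of pixel counts alone, since frame-buffer traffic scales roughly linearly with pixels rendered. Here 600p is taken as roughly 1066x600, an assumed 16:9 target:

```python
# Relative frame-buffer load at different render resolutions
# (pixel counts only; traffic scales roughly linearly with pixels).
resolutions = {"1080p": (1920, 1080), "720p": (1280, 720), "600p": (1066, 600)}
base = 1920 * 1080
ratios = {name: w * h / base for name, (w, h) in resolutions.items()}
for name, r in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {r:.2f}x the pixels of 1080p")
```

Dropping from 1080p to 600p cuts per-frame pixel traffic by roughly two thirds under this model, which is consistent with a 12.8GB/s card coping at low resolutions but collapsing at high ones.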



#255 GAMER1984

GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 11 April 2013 - 12:21 PM

Question for you guys regarding the RAM: why in the hell didn't Nintendo give us more? There is a rumor on GAF that the PS4 is only gonna take up 1GB for the OS, so that's SEVEN GB for games. Who the hell did Nintendo consult about what developers want next gen? Like, how expensive is RAM compared to the GPU and CPU? If I'm not mistaken, it's the most expensive part, right?

Edited by GAMER1984, 11 April 2013 - 12:22 PM.


#256 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 11 April 2013 - 12:45 PM


GAMER1984, on 11 Apr 2013 - 06:35, said:

Question for you guys regarding the RAM: why in the hell didn't Nintendo give us more? There is a rumor on GAF that the PS4 is only gonna take up 1GB for the OS, so that's SEVEN GB for games. Who the hell did Nintendo consult about what developers want next gen? Like, how expensive is RAM compared to the GPU and CPU? If I'm not mistaken, it's the most expensive part, right?


GAF doesn't even know what an OS actually is.

The actual operating system will be small enough to fit in CPU cache.

What GAF is talking about is the graphical user interface and the system's running processes.

And no, that doesn't mean games will get 7GB all to themselves. The PS4 will have a lot more functionality than the PS3 (and certainly the Wii U), in the form of applications and features that will require a piece of that RAM pie.

How much is left over depends on how feature-rich Sony is planning on making the PS4. No matter how much they take, it will have a lot more than the Wii U, and a lot more than devs will be able to use for a considerable amount of time; ruling out, of course, the convenience of not giving a crap about memory management.



 


#257 Goodtwin

Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 11 April 2013 - 12:52 PM

Cost, and cost only.  Keep in mind that Nintendo created a system that has a $100+ controller, so the actual console itself is a $200 piece of hardware. 



#258 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 11 April 2013 - 01:06 PM


GAMER1984, on 11 Apr 2013 - 06:35, said:

Question for you guys regarding the RAM: why in the hell didn't Nintendo give us more? There is a rumor on GAF that the PS4 is only gonna take up 1GB for the OS, so that's SEVEN GB for games. Who the hell did Nintendo consult about what developers want next gen? Like, how expensive is RAM compared to the GPU and CPU? If I'm not mistaken, it's the most expensive part, right?


GAF doesn't even know what an OS actually is.

The actual operating system will be small enough to fit in CPU cache.

What GAF is talking about is the graphical user interface and the system's running processes.

And no, that doesn't mean games will get 7GB all to themselves. The PS4 will have a lot more functionality than the PS3 (and certainly the Wii U), in the form of applications and features that will require a piece of that RAM pie.

How much is left over depends on how feature-rich Sony is planning on making the PS4. No matter how much they take, it will have a lot more than the Wii U, and a lot more than devs will be able to use for a considerable amount of time; ruling out, of course, the convenience of not giving a crap about memory management.

I seriously doubt developers will actually have 7GB of usable RAM to play with.  I'll stretch to 4GB for software, but when you have the possible situation of running the OS (GUI, I/O calls) + applications (social media applications running, media services running) + live streaming 1080p (or even 720p) + encoding and storing (on RAM scratch media) A/V and constantly updating that information + PSN applications + downloading updates or full games + playing a game + suspending a full game for instant recovery + suspending ANOTHER game (assuming they'll use the hard drive to store the first suspended game in this situation; slower recovery, still functional), it is going to be using a lot of RAM.  Media encoding in particular, even without using RAM as a scratch disk, is very memory intensive.  They mentioned it would be in the same resolution that the game is in and at the same framerate.

 

Encoding A/V will eat up as much RAM as you feed the process.  7GB would do nothing but allow developers to be a little lazier about memory utilization.  They won't fill it with game code for several years, especially as the AAA game model is starting to collapse in on itself with the costs of development skyrocketing.



Cost, and cost only.  Keep in mind that Nintendo created a system that has a $100+ controller, so the actual console itself is a $200 piece of hardware. 

And the guy that made the die shot said the GPU was probably somewhere in the $100 range to manufacture.



#259 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 11 April 2013 - 02:26 PM

The video encoding is unlikely to use much RAM because it's meant to be done in dedicated hardware, but more importantly it has to be real-time, like the Wii U GamePad.

 

Also, we do not know how much of the background processing will run on the dedicated ARM core, and THAT I would expect to have its own RAM as well.  So a lot of background processes that ordinarily might take up game resources are being offloaded onto dedicated hardware in the PS4.

 

However, I do agree with one thing: the extra RAM will mostly be used to avoid having to optimise games' memory usage.  But if that means we end up with games NOT plagued with issues (e.g. Skyrim on PS3) due to struggling to manage the RAM properly, and it gets games out quicker, then I'm all for it.

 

At the end of the day, I think Sony realised they messed up with the PS3, and with the PS4 they are throwing everything but the kitchen sink in there to avoid the same problems again.  The console is going to be EXTREMELY scalable. 

 

That said, I think Nintendo have to some extent done the same with the Wii U by keeping 1GB of RAM for the OS/system.  It seems highly unlikely it's using much of that today, but as they were rushing to get the console released on time, they held back a lot of RAM so they could expand the functionality over time without running into another Sony problem: not enough RAM reserved for the OS to add all the functionality end users wanted.

 

I do not think having 7GB of RAM for game use is a stretch at all; there are already games on PC that run into the 4GB 32-bit addressing problem.  As all games on PC transition to 64-bit, it's not at all unrealistic to see them eating 4GB of system RAM and 2GB of GPU RAM.  The idea is that they will be able to port these over to the PS4 relatively easily, so allowing developers to use 6-7GB of RAM makes a lot of sense.


Edited by Alex Atkin UK, 11 April 2013 - 02:31 PM.

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#260 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 11 April 2013 - 02:29 PM

The video encoding is unlikely to use much RAM because it's meant to be done in dedicated hardware, but more importantly it has to be real-time, like the Wii U GamePad.

Also, we do not know how much of the background processing will run on the dedicated ARM core, and THAT I would expect to have its own RAM as well.  So a lot of background processes that ordinarily might take up game resources are being offloaded onto dedicated hardware in the PS4.

However, I do agree with one thing: the extra RAM will mostly be used to avoid having to optimise games' memory usage.  But if that means we end up with games NOT plagued with issues (e.g. Skyrim on PS3) due to struggling to manage the RAM properly, and it gets games out quicker, then I'm all for it.

At the end of the day, I think Sony realised they messed up with the PS3, and with the PS4 they are throwing everything but the kitchen sink in there to avoid the same problems again.  The console is going to be EXTREMELY scalable. 

That said, I think Nintendo have to some extent done the same with the Wii U by keeping 1GB of RAM for the OS/system.  It seems highly unlikely it's using much of that today, but as they were rushing to get the console released on time, they held back a lot of RAM so they could expand the functionality over time without running into another Sony problem: not enough RAM reserved for the OS to add all the functionality end users wanted.

The amount of on-die cache on the ARM core won't cover what's needed.  Even for livestreaming, RAM plays an important role; for encoding, it is even more important.  Keep in mind there are two separate functions: livestreaming, and caching the previous ~15 minutes of gameplay.  It will have to use RAM as a scratch disk, because an HDD would be too slow for that operation at that speed, and it has to be able to be recalled at any point.





