
More Wii U hardware talk: the breakdown (article)



#1 Thugbrown


    Spear Guy

  • Members
  • 98 posts

Posted 03 December 2012 - 07:41 PM

I know we're all probably tired of the Wii U power debates, but I just wanted to share this article if you don't mind taking the time to read it. It's a little long but very informative.

http://www.digitally...-bit-means.html

#2 GAMER1984


    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 03 December 2012 - 09:49 PM

LIKED the article

#3 Insanity_Boi


    Bob-omb

  • Members
  • 251 posts
  • Fandom:
    Zelda, Mario, Metroid, Mario Kart

Posted 03 December 2012 - 10:09 PM

Article was very informative. I learned a lot of stuff I never knew before. Thanks.

#4 Bunkei


    Red Koopa Troopa

  • Members
  • 55 posts

Posted 03 December 2012 - 10:13 PM

LIKED the article


Same. He also brings up an interesting point:

"In our recent article, Is the Wii U Powerful enough?, we took a deep look into the rumours surround the Wii U’s supposedly “slow” processor and found out that the idea of the Wii U being a true “next generation” console from a power standpoint was loosely based on it having a tri-core IBM POWER7-based architecture, which it didn’t ship out with...."

That would explain why IBM had to issue a statement correcting a previous tweet which stated that the Wii U's CPU is a custom POWER7 chip. The engineers weren't necessarily wrong, but they most likely didn't know that the final hardware specs had been changed (to the 750 chipset) since they'd last seen it. It looks like at one point it was going to have a quad-core POWER7 chip, but Nintendo opted against it. However, it also seems likely that the GPU was buffed to compensate.

Some would say the obvious reason was cost, but what about backwards compatibility? Perhaps Nintendo placed a high priority on making a seamless transition from Wii to Wii U. The Wii could play GameCube discs and had GC ports, but it's not really a fair comparison, since the Wii had a lot more peripherals than the GameCube.

Edited by Bunkei, 03 December 2012 - 10:16 PM.


#5 Socalmuscle


    Hammer Bro.

  • Members
  • 1,677 posts

Posted 03 December 2012 - 10:19 PM

Article kinda sucks.

Takes forever explaining the abbreviations CPU, GPU, MCM, etc., and offers zero new info.

The author also doesn't understand the benefits of the Wii U CPU architecture and its relationship to clock speed. Makes the erroneous assumption that clock speed = performance.

Also makes the mistake of saying the graphics have to be downscaled in order for the GPGPU to perform CPU assists.

Also makes the mistake of thinking GTA would tax the system.

The article simply repeats some leaked-specs info and Wikipedia info and then does what anyone else does: forms an opinion.

I'll save you the read.

Short summary:

"wii u has a CPU, gpu, and ram. It has specs. But it's not next gen. Because I don't know how to interpret anything regardless of not knowing the specs."

Don't waste your time.

#6 Thugbrown


    Spear Guy

  • Members
  • 98 posts

Posted 04 December 2012 - 11:32 AM

This is just another case of an article saying something somebody doesn't want to hear, so they immediately write it off as erroneous, when all the author is doing is breaking down how the system operates differently and debunking the low clock speed rumor. Nowhere in the article did the author say the Wii U is not next gen; it's different, just as the Cell processor was different, and when it's utilized to its full potential we are going to get some groundbreaking content. Just because he said or suggested GTA 5 could run smoother if scaled down to 720p on the Wii U, don't get mad; I'm sure the PS360 will have some issues trying to run GTA 5 at 720p or even 1080p. The Wii U is more than capable. The author writes:

"Wii U is expertly designed to make a somewhat unimpressive CPU (individual cores) run at impressive speeds. Utilizing tri-core architecture greatly increases the overall clock speed of the CPU already, but placing the GPGPU and its “substantial” amount of eDRAM (which can both boost the clock speeds of the CPU) on a multi-chip module increases their combined efficiency even greater. This also means that the Wii U’s architecture holds numerous ways to maximise its performance when put into the right developers’ hands. It’s likely we’ll see jaw dropping HD titles that run in native 1080p, taking full advantage of the GPU (and it’s eDRAM), but offers a gameplay experience that’s less taxing on the CPU – think thatgamecompany’s Journey on Sony’s PS3."

Thanks to the ones that did read it and liked it. To the others: please just read the article, don't skim it, because you might miss the whole point.

Edited by Thugbrown, 04 December 2012 - 11:36 AM.


#7 hubang


    Red Koopa Troopa

  • Members
  • 68 posts

Posted 04 December 2012 - 11:48 AM

Thanks for the link! That was a really informative article and the comments below it were thought-provoking as well.

I'm glad it touched on the huge heatsink and potential MCM heat issues. I've noticed my Wii U seems cooler after a few hours of running than my PS3 (which is a second-gen replacement for my first-gen one that got the YLOD).
Nintendo Network ID: hubang

#8 SoldMyWiiUAndLeftTheForums


    Pokémon Trainer

  • Members
  • 4,168 posts

Posted 04 December 2012 - 12:03 PM

That was an awesome read, thanks OP.

#9 3Dude


    Whomp

  • Section Mods
  • 5,482 posts

Posted 04 December 2012 - 12:18 PM

This is just another case of an article saying something somebody doesn't want to hear, so they immediately write it off as erroneous, when all the author is doing is breaking down how the system operates differently and debunking the low clock speed rumor. Nowhere in the article did the author say the Wii U is not next gen; it's different, just as the Cell processor was different, and when it's utilized to its full potential we are going to get some groundbreaking content. Just because he said or suggested GTA 5 could run smoother if scaled down to 720p on the Wii U, don't get mad; I'm sure the PS360 will have some issues trying to run GTA 5 at 720p or even 1080p. The Wii U is more than capable. The author writes:
"Wii U is expertly designed to make a somewhat unimpressive CPU (individual cores) run at impressive speeds. Utilizing tri-core architecture greatly increases the overall clock speed of the CPU already, but placing the GPGPU and its “substantial” amount of eDRAM (which can both boost the clock speeds of the CPU) on a multi-chip module increases their combined efficiency even greater. This also means that the Wii U’s architecture holds numerous ways to maximise its performance when put into the right developers’ hands. It’s likely we’ll see jaw dropping HD titles that run in native 1080p, taking full advantage of the GPU (and it’s eDRAM), but offers a gameplay experience that’s less taxing on the CPU – think thatgamecompany’s Journey on Sony’s PS3."
Thanks to the ones that did read it and liked it. To the others: please just read the article, don't skim it, because you might miss the whole point.


Ooh, ugh, is this from the article?
I read the first couple of bolded statements and it's all so wrong it's cringeworthy.

Simultaneous multiprocessing does not raise clock speeds... eDRAM does not boost clock speeds... This is... all nonsense.

I'm sorry dude, this guy is clueless.

Edited by 3Dude, 04 December 2012 - 12:20 PM.



#10 Thugbrown


    Spear Guy

  • Members
  • 98 posts

Posted 04 December 2012 - 12:27 PM

^ ^
I'm not even going to argue with you 'cause that's all you do. Thanks for the read though. ;)

Edited by Thugbrown, 04 December 2012 - 12:28 PM.


#11 Scumbag


    Pokey

  • Members
  • 1,177 posts

Posted 04 December 2012 - 01:26 PM

Ooh, ugh, is this from the article?
I read the first couple of bolded statements and it's all so wrong it's cringeworthy.

Simultaneous multiprocessing does not raise clock speeds... eDRAM does not boost clock speeds... This is... all nonsense.

I'm sorry dude, this guy is clueless.


GET THE HELL OUT. Everything in that article was correct, it was trying to explain the Wii U's strengths, new technology and tricks to dumb ignorant clowns such as yourself. The fact you still don't understand makes me wonder if some people are naturally dumb....

Can your parents not afford a Wii U this Xmas? Go play your underpowered Xbox 360 with the rest of the children.

Edited by Forza Juventus, 04 December 2012 - 01:27 PM.


#12 MorbidGod


    Hammer Bro.

  • Members
  • 1,717 posts

Posted 04 December 2012 - 01:58 PM

GET THE HELL OUT. Everything in that article was correct, it was trying to explain the Wii U's strengths, new technology and tricks to dumb ignorant clowns such as yourself. The fact you still don't understand makes me wonder if some people are naturally dumb....

Can your parents not afford a Wii U this Xmas? Go play your underpowered Xbox 360 with the rest of the children.


He usually is pro-Wii U...
Whovian12 -- Nintendo Network ID.

#13 The Lonely Koopa


    Chain Chomp

  • Members
  • 617 posts
  • Fandom:
    Reggie's Body, Nintendo being doomed

Posted 04 December 2012 - 02:09 PM

GET THE HELL OUT. Everything in that article was correct, it was trying to explain the Wii U's strengths, new technology and tricks to dumb ignorant clowns such as yourself. The fact you still don't understand makes me wonder if some people are naturally dumb....

Can your parents not afford a Wii U this Xmas? Go play your underpowered Xbox 360 with the rest of the children.

Um, instead of insulting 3Dude, try to prove him wrong, since you seem to have knowledge on this matter.

#14 MorbidGod


    Hammer Bro.

  • Members
  • 1,717 posts

Posted 04 December 2012 - 02:12 PM

Ooh, ugh, is this from the article?
I read the first couple of bolded statements and it's all so wrong it's cringeworthy.

Simultaneous multiprocessing does not raise clock speeds... eDRAM does not boost clock speeds... This is... all nonsense.

I'm sorry dude, this guy is clueless.


To clarify, having multiple cores does not increase clock speed, but it does mean the system can do things faster by multitasking, being able to do more than a faster-clocked single-core CPU. Having more eDRAM doesn't make the clock speed faster, but it does mean the CPU can work faster. And, if I am not mistaken, it also allows the CPU to process more info quicker. That part I could be wrong about, though.
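
A back-of-envelope sketch of that eDRAM point (a toy model in Python; the latency numbers are made up purely for illustration, not actual Wii U figures):

    # Average memory access time with fast on-package eDRAM in front of
    # slower external DRAM. All latencies are invented for illustration.
    EDRAM_NS = 10.0   # assumed on-package eDRAM access time
    DRAM_NS = 60.0    # assumed external DRAM access time

    def avg_access_ns(edram_hit_rate: float) -> float:
        """Average access time when a fraction of requests hit eDRAM."""
        return edram_hit_rate * EDRAM_NS + (1.0 - edram_hit_rate) * DRAM_NS

    for rate in (0.0, 0.5, 0.9):
        print(f"eDRAM hit rate {rate:.0%}: {avg_access_ns(rate):.0f} ns average")

The fewer trips the CPU makes out to main memory, the less time it spends stalled, even though its clock never changes.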

Anyway, you are right, but that isn't actually what the author meant.
Whovian12 -- Nintendo Network ID.

#15 3Dude


    Whomp

  • Section Mods
  • 5,482 posts

Posted 04 December 2012 - 02:36 PM

To clarify, having multiple cores does not increase clock speed, but it does mean the system can do things faster by multitasking, being able to do more than a faster-clocked single-core CPU. Having more eDRAM doesn't make the clock speed faster, but it does mean the CPU can work faster. And, if I am not mistaken, it also allows the CPU to process more info quicker. That part I could be wrong about, though.

Anyway, you are right, but that isn't actually what the author meant.


Yeah, the things this guy tries talking about increase instructions per clock, not clocks per second.
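
Rough rule of thumb: instructions per second = clock rate × IPC (instructions per clock). A toy Python comparison with invented numbers, just to show why a lower-clocked, wider design can keep pace with a higher-clocked, narrower one:

    # Toy throughput model: instructions/second = clock (Hz) * IPC.
    # Both configurations are invented, not real console specs.
    def instr_per_sec(clock_hz: float, ipc: float) -> float:
        return clock_hz * ipc

    low_clock_wide = instr_per_sec(1.2e9, 3.0)     # lower clock, higher IPC
    high_clock_narrow = instr_per_sec(3.2e9, 1.0)  # higher clock, lower IPC

    print(f"low clock, high IPC:  {low_clock_wide:.2e} instructions/s")
    print(f"high clock, low IPC:  {high_clock_narrow:.2e} instructions/s")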

Edited by 3Dude, 04 December 2012 - 02:37 PM.



#16 Socalmuscle


    Hammer Bro.

  • Members
  • 1,677 posts

Posted 04 December 2012 - 02:36 PM

Um, instead of insulting 3Dude, try to prove him wrong, since you seem to have knowledge on this matter.


That's the problem.

They aren't going to win an argument with him, because he is right. He is also that rare individual who knows his stuff, backs it up, and changes his tune if and when the facts are different. The board is better off with him as a part of it.

And the "article" (blog?) is so off base and remedial, it seems like a child wrote it.

It is basically a conglomeration of some facts and a lot of opinion (and weird theory).

Not to be taken with any seriousness at all.

Sorry to break it to everyone who likes it, but liking something doesn't make it true.


Edited by Socalmuscle, 04 December 2012 - 02:38 PM.


#17 Alex Atkin UK


    Boo

  • Members
  • 528 posts

Posted 04 December 2012 - 02:52 PM

Clock speed is exactly that: how quickly instructions can be passed through the CPU/GPU. Adding RAM, more cores, etc., doesn't alter the clock speed at all.

What you CAN do, however, is increase how much actually gets achieved in a single tick of that clock.

It's basically the same principle as having two people working together to get more work done. You won't get the job done twice as fast, as some delays are caused by coordination between them, or they might both want to use the same tool at the same time, so one will have to wait for the other to finish. But it DOES get the job done quicker.
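
That two-workers intuition is essentially Amdahl's law. A minimal Python sketch, assuming (purely for illustration) that 20% of the job is serial coordination that can't be shared:

    # Amdahl's law: speedup from n workers when a fraction of the job is
    # inherently serial (coordination, waiting on a shared tool, etc.).
    def speedup(n_workers: int, serial_fraction: float) -> float:
        parallel = 1.0 - serial_fraction
        return 1.0 / (serial_fraction + parallel / n_workers)

    for n in (1, 2, 3):
        print(f"{n} worker(s): {speedup(n, serial_fraction=0.2):.2f}x faster")

With a 20% serial share, two workers give about 1.67x rather than 2x, which is exactly the coordination delay described above.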

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#18 Medu


    Green Koopa Troopa

  • Members
  • 47 posts

Posted 04 December 2012 - 03:14 PM

Nothing interesting in there, and lots of wrong info. The CPU and GPU are not on the same die, he is confusing SRAM with DRAM, and he keeps stating that random things increase the clock speed of the CPU when what they really do is make the CPU more efficient or take some of the burden away (the GPGPU).
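
What GPGPU offload actually buys can be sketched with a toy frame-budget model in Python (all numbers invented): moving work off the CPU helps whenever the CPU is the bottleneck, without touching any clock speed.

    # Toy frame-time model: CPU and GPU run in parallel each frame, and
    # the frame is gated by whichever finishes last. Numbers are invented.
    def frame_ms(cpu_work: float, gpu_work: float,
                 cpu_rate: float, gpu_rate: float) -> float:
        return max(cpu_work / cpu_rate, gpu_work / gpu_rate)

    # Shift 30 units of (say) physics work from the CPU onto the GPGPU.
    before = frame_ms(cpu_work=100.0, gpu_work=60.0, cpu_rate=5.0, gpu_rate=12.0)
    after = frame_ms(cpu_work=70.0, gpu_work=90.0, cpu_rate=5.0, gpu_rate=12.0)
    print(f"before offload: {before:.1f} ms/frame, after: {after:.1f} ms/frame")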

#19 hubang


    Red Koopa Troopa

  • Members
  • 68 posts

Posted 04 December 2012 - 03:18 PM

To clarify, having multiple chore does not increase turkey speed, but does mean the system can do things faster by using multitasking and bring able to do mute than a faster clocked one core CPU. Having more edram doesn't make the clock speeds faster, but does mean the cpu can work faster. And, if i am not mistaken, it also allows the cpu to process more info quicker. That part, i could be wrong though.

Anyway, you are right, but that isn't actually what the author was meaning.


Sorry but I had to! :P
Nintendo Network ID: hubang

#20 Socalmuscle


    Hammer Bro.

  • Members
  • 1,677 posts

Posted 04 December 2012 - 04:06 PM

Clock speed is exactly that: how quickly instructions can be passed through the CPU/GPU. Adding RAM, more cores, etc., doesn't alter the clock speed at all.

What you CAN do, however, is increase how much actually gets achieved in a single tick of that clock.

It's basically the same principle as having two people working together to get more work done. You won't get the job done twice as fast, as some delays are caused by coordination between them, or they might both want to use the same tool at the same time, so one will have to wait for the other to finish. But it DOES get the job done quicker.


...

Edited by Socalmuscle, 04 December 2012 - 04:11 PM.




