

Medu

Member Since 04 Dec 2011
Offline Last Active Dec 17 2012 03:48 PM

Posts I've Made

In Topic: "[The] Wii U Has A Horrible, Slow CPU" Comment Was Based On "Very...

17 December 2012 - 03:51 PM

I believe one of the dev kits did feature a POWER7 CPU, but for some reason Nintendo opted for the upgraded Broadway architecture. This would explain why IBM had to issue a "correction": the engineers who posted the tweet probably weren't aware that the final hardware specs had changed, especially in regards to the CPU.

Don't blame 3Dude for that, as many of us believed the same based on the information given.

Also, I wonder if perhaps backwards compatibility played a part... Could the Wii U POWER7 be fully compatible with Wii games and peripherals? That's for hardware experts to answer...


A company cannot change the CPU with a few months' notice; it would require a complete redesign of the mainboard, the dev tools, etc.

In Topic: "[The] Wii U Has A Horrible, Slow CPU" Comment Was Based On "Very...

17 December 2012 - 11:32 AM

Do NOT respond to me again, and do NOT post a link to something you claim I don't understand when you have no clue what you are linking to. You have posted NOTHING that doesn't confirm what I have already said.

Fetching takes an instruction out of the instruction cache (which is massive in the Wii U CPU) and puts it in a queue to be executed; execute EXECUTES THE INSTRUCTION, and as long as the instruction remains in the instruction queue it can be used over and over again with ridiculous speed.

Retire REMOVES an instruction from the queue that is no longer needed, to make room for something else.
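The fetch/execute/retire flow described in the two posts above can be sketched as a toy in-order model. The widths below are illustrative assumptions for the sketch, not the actual Broadway/750 microarchitecture (the real dispatch and completion widths are in IBM's documentation):

```python
from collections import deque

# Toy model of the fetch -> queue -> execute/retire flow described above.
# Widths are illustrative only, not verified Broadway figures.
FETCH_WIDTH = 4   # instructions pulled from the i-cache per cycle
RETIRE_WIDTH = 2  # instructions completed and removed per cycle

def run_pipeline(program):
    """Return the number of cycles this simple in-order core needs."""
    queue = deque()   # instruction queue
    fetched = 0
    retired = 0
    cycles = 0
    while retired < len(program):
        cycles += 1
        # Fetch: pull up to FETCH_WIDTH instructions from the i-cache.
        for _ in range(FETCH_WIDTH):
            if fetched < len(program):
                queue.append(program[fetched])
                fetched += 1
        # Execute + retire: complete up to RETIRE_WIDTH per cycle.
        for _ in range(RETIRE_WIDTH):
            if queue:
                queue.popleft()
                retired += 1
    return cycles

# 8 single-cycle instructions retiring at 2 per clock -> 4 cycles.
print(run_pipeline(list(range(8))))
```

This also shows why a wide fetch does not by itself raise throughput: the queue fills up and the retire width becomes the bottleneck.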

Ivy Bridge doesn't get six instructions per clock. Ivy Bridge, having 14-stage pipes, is piped deeper than the 750's 4 stages, so it can be clocked higher.

This is also the reason why you will NEVER see a 2 GHz 750 until the process size reaches something like 18 nm.

There is no architecture in existence that can match the 750's IPC at its power draw. The problem with the 750 is that it can't be clocked higher.

In case you STILL haven't figured it out, there is a correlation between instruction pipeline stages, IPC, and clock frequency.
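That correlation can be illustrated with a back-of-the-envelope model: a fixed pool of logic delay split across more stages gives a shorter critical path per stage, and therefore a higher attainable clock. Every delay figure below is made up for the sketch; these are not Broadway or Ivy Bridge numbers:

```python
# Illustrative assumption: a fixed amount of combinational logic delay
# per instruction, split across N pipeline stages, plus a fixed latch
# overhead per stage. Figures invented for the sketch.
TOTAL_LOGIC_NS = 4.0   # total logic delay through the whole pipe (ns)
LATCH_NS = 0.05        # per-stage latch/flop overhead (ns)

def max_clock_ghz(stages):
    """Clock limit set by the slowest stage (1/ns -> GHz)."""
    stage_delay = TOTAL_LOGIC_NS / stages + LATCH_NS
    return 1.0 / stage_delay

for stages in (4, 14):
    print(stages, round(max_clock_ghz(stages), 2))
```

With these toy numbers the 4-stage pipe tops out just under 1 GHz while the 14-stage pipe clocks roughly three times higher, which is the qualitative trade-off the post is pointing at.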


Odd how you have changed your tune. A few months ago there was no question in your mind that the Wii U had a POWER7 CPU and there was no way it would have a 3-core Broadway-based CPU; what has changed? (It's all there in the history.) I will no longer argue technical points, but younger members on the board should know to take everything you say with an unhealthy amount of salt.

In Topic: "[The] Wii U Has A Horrible, Slow CPU" Comment Was Based On "Very...

16 December 2012 - 10:34 AM

heh heh... ha... ha ha HA HA HA HA HA!!!!!

Broadway retires 2 instructions a clock, fetches 4 from the instruction cache, and executes six. Oh, and most instructions are finished in ONE clock.

You are right, the Wii U may not do 18 IPC. That's what an un-upgraded Broadway would do as a tri-core. If anything, Espresso would be capable of more.

Why don't you follow your own advice, AND RETIRE YOUR MADE-UP FICTION, and replace it with the REAL NUMBERS from the official Broadway documentation I've actually ATTACHED to this board numerous times.


So what does the OFFICIAL documentation say about instructions executed?

No way six?

What have we learned?


You are being confused by terminology. Here is a decent blog post that is worth a read: http://perilsofparal...rscalar-vs.html

Broadway retires 2 instructions a clock.


Which means it can finish 2 instructions per clock, which is what I was saying.

and executes six.


I cannot be sure what IBM means by this, but it's probably just the stages in the pipeline. Not even an Ivy Bridge core can complete 6 instructions per clock.

Oh, and most instructions are finished in ONE clock.


Again, this will be the instructions at the pipeline level. There is nothing special about that for a RISC chip. A CISC chip might take more than one clock to execute a single pipeline stage.

At a more basic level, all you have to do is take a look at the CPU die size and the system's power usage, and you will know it's not a very powerful CPU. It's a very cost- and power-efficient CPU, but not all that powerful. The PS360 CPUs weren't very efficient, but they were massive brute-force monsters.

When released, the Xbox 360 had a CPU of 176 mm² and a GPU of 182 mm² (without eDRAM). The Wii U has a 32 mm² CPU and a 150 mm² GPU (although that probably includes a lot of eDRAM). This shows that Nintendo have put a lot more resources into the GPU than the CPU, which is fairly standard in this day and age, but most games will be designed for CPUs with more power than the Wii U's, mainly because the PS360 generation had more powerful CPUs than they really should have had (most games run better on the Xbox due to its better GPU; the PS3 was way too CPU-focused).
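Taking the post's own area figures at face value (and noting the chips are built on different process nodes, so raw areas are only roughly comparable), the silicon-budget ratios work out as:

```python
# Die areas as quoted in the post above (mm^2); the post's figures,
# not independently verified measurements.
xbox360_cpu, xbox360_gpu = 176, 182   # launch Xbox 360; GPU w/o eDRAM
wiiu_cpu, wiiu_gpu = 32, 150          # Wii U; GPU figure incl. eDRAM

print(round(xbox360_cpu / wiiu_cpu, 1))     # CPU area ratio -> 5.5
print(round(wiiu_gpu / wiiu_cpu, 1))        # Wii U GPU:CPU  -> 4.7
print(round(xbox360_gpu / xbox360_cpu, 2))  # 360 GPU:CPU    -> 1.03
```

The launch 360 split its silicon roughly evenly between CPU and GPU, while the Wii U spends nearly five times as much area on the GPU as the CPU, which is the GPU-heavy design point the post describes.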

In Topic: "[The] Wii U Has A Horrible, Slow CPU" Comment Was Based On "Very...

13 December 2012 - 02:38 PM

The Wii U CPU utterly demolishes Xenon in every single category except one.

Xenon was a horrible horrible cpu.

One Wii U core can execute as many instructions in a cycle as the ENTIRE 360 CPU.

The 360 can execute 1 instruction per thread; it has 2 threads per core and 3 cores. That's six instructions a clock.

The Wii U gets six instructions a core (including 2 integer) and has three cores, for 18 instructions a clock.
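The peak-issue arithmetic claimed above, taken at face value (these are the post's theoretical per-clock maxima, not verified or sustained figures), is:

```python
# Per-clock issue maxima exactly as stated in the post above;
# illustrative arithmetic only, not validated against hardware docs.
xenon_per_thread, xenon_threads, xenon_cores = 1, 2, 3
espresso_per_core, espresso_cores = 6, 3

print(xenon_per_thread * xenon_threads * xenon_cores)  # -> 6
print(espresso_per_core * espresso_cores)              # -> 18
```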

That's not even getting into how short the Wii U's instruction pipelines are and how stupidly long Xenon's are, or how awesome the Wii U's branch prediction is and how crap Xenon's is: how every single missed branch on Xenon is a 500-cycle penalty, while the Wii U has a zero-penalty first miss and a six-cycle penalty afterwards.
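Taking the penalty figures quoted above at face value (they come from the post, not from IBM or Microsoft documentation), the average cost of a branch is just the miss rate times the miss penalty:

```python
# Expected extra cycles per branch from mispredictions, using the
# penalty figures quoted in the post (taken at face value).
def expected_penalty(miss_rate, miss_penalty_cycles):
    """Average misprediction cost per branch, in cycles."""
    return miss_rate * miss_penalty_cycles

# Example with an assumed 5% miss rate (illustrative number):
print(round(expected_penalty(0.05, 500), 2))  # deep pipe  -> 25.0
print(round(expected_penalty(0.05, 6), 2))    # short pipe -> 0.3
```

Even if the 500-cycle figure is exaggerated, the model shows why branchy game code punishes a long pipeline far more than a short one.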


Can we stop with the fiction? The Wii U does not do 18 IPC; it does a bit more than 2. Xenon does about 2, making it in theory about twice as fast as the Wii U. However, the Wii U CPU will get much closer to its maximum potential due to being OoOE (a common figure is about 1/3 faster), making the Wii U CPU about 2/3 as fast as the Xbox CPU.
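The arithmetic in this reply can be checked directly, using the post's own figures ("about twice as fast" at peak, "about 1/3 faster" in practice from out-of-order execution):

```python
# Figures from the reply itself, not independent measurements.
peak_ratio = 1 / 2   # Xenon "in theory about twice as fast" at peak
ooo_bonus = 4 / 3    # OoOE gets "about 1/3 faster" toward that peak

effective = peak_ratio * ooo_bonus
print(round(effective, 2))  # -> 0.67, i.e. about 2/3 of Xenon
```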

In Topic: More Wii U hardware talk the breakdown (article)

04 December 2012 - 03:14 PM

Nothing interesting in there, and lots of wrong info. The CPU and GPU are not on the same die, he is confusing SRAM with DRAM, and he keeps stating that random things increase the clock speed of the CPU when what they really do is make the CPU more efficient, or take some of the burden away from it (GPGPU).
