"[The] Wii U Has A Horrible, Slow CPU" Comment Was Based On "Very Early Kits"

Comment CPU Dev Early Horrible Kit Slow THQ

31 replies to this topic

#21 Socalmuscle

Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 16 December 2012 - 10:08 PM

Come on guys, we have heard these rumours too many times from too many people; there has to be truth in them. Whether it's a little more powerful than the PS3/360 or just on par, I think it is a little more powerful, and since it's new tech they might be able to optimize it a bit more. But if it's only on par we're in trouble, because everything has already been squeezed out of the current gen, which means what you're seeing now is basically what you're going to get.


Actually, the Wii U CPU is easily more powerful than the 360/PS3 at what it is designed to do. The developers who want to simply plop old code onto the Wii U are going to hate it; it isn't a clock-speed-dependent architecture. They COULD get a far superior engine going if they dedicated the resources to the Wii U, but that defeats the purpose of porting for easy money.

Games built from the ground up for the Wii U will be next gen, visually. The GPU inside the Wii U eats the PS3 and 360 GPUs, poops them out, and makes them eat that poop. It is quite literally no contest there.

The Wii U needs games to be coded for... the Wii U. Not Xbox code. Not PS3 code.

It is happening, and we will see the fruit of these labors next year, probably third quarter.

The Wii U hasn't begun to show its potential. 2D games, Pikmin, and old-gen ports aren't going to cut it.

Enough repeating this "on par" business; it is a total fallacy. Even what has been publicly revealed about the hardware (official and unofficial) is enough to know that the next gen is already here, even if the games that do it justice aren't ready yet.

#22 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 16 December 2012 - 11:01 PM

You are being confused by terminology. Here is a decent blog post that is worth a read: http://perilsofparal...rscalar-vs.html
Which means it can finish 2 instructions per clock, which is what I was saying.
I cannot be sure what IBM means by this, but it's probably just the stages in the pipeline. Not even an Ivy Bridge core can complete 6 instructions per clock.
Again, this will be the instructions at the pipeline level. There is nothing special about that for a RISC chip. A CISC chip might take more than 1 clock to execute a single pipeline stage.
At a more basic level, all you have to do is look at the CPU die size and the system's power usage, and you will know it's not a very powerful CPU. It's a very cost- and power-efficient CPU, but not all that powerful. The PS3/360 CPUs weren't very efficient, but they were massive brute-force monsters. When released, the Xbox 360 had a CPU of 176mm² and a GPU of 182mm² (without eDRAM). The Wii U has a 32mm² CPU and a 150mm² GPU (although that probably includes a lot of eDRAM). This shows that Nintendo has put a lot more resources into the GPU than the CPU, which is fairly standard in this day and age, but most games will be designed for CPUs with more power than the Wii U's, mainly because the PS3/360 generation had more powerful CPUs than they really should have had (most games run better on the Xbox due to its better GPU; the PS3 was way too CPU-focused).
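For what it's worth, the die-area comparison in the quote above can be sanity-checked with quick arithmetic. The figures below are the ones quoted in the post, not independently verified:

```python
# GPU-to-CPU die-area ratios, using the figures quoted in the post above.
xbox360_cpu_mm2 = 176.0
xbox360_gpu_mm2 = 182.0   # quoted as excluding eDRAM
wiiu_cpu_mm2 = 32.0
wiiu_gpu_mm2 = 150.0      # quoted as probably including eDRAM

xbox360_ratio = xbox360_gpu_mm2 / xbox360_cpu_mm2
wiiu_ratio = wiiu_gpu_mm2 / wiiu_cpu_mm2

print(f"Xbox 360 GPU/CPU area ratio: {xbox360_ratio:.2f}")  # ~1.03
print(f"Wii U    GPU/CPU area ratio: {wiiu_ratio:.2f}")     # ~4.69
```

The roughly 4.7:1 silicon split is the basis for the "GPU-centric" characterization that recurs through the rest of the thread.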


Do NOT respond to me again, and do NOT post a link to something you claim I don't understand when you have no clue what you are linking to. You have posted NOTHING that doesn't back up what I have already said.

Fetch takes an instruction out of the instruction cache (which is massive in the Wii U CPU) and puts it in a queue to be executed. Execute EXECUTES THE INSTRUCTION; as long as the instruction remains in the instruction queue, it can be reused over and over with ridiculous speed.

Retire REMOVES an instruction from the queue once it is no longer needed, to make room for something else.

Ivy Bridge doesn't get six instructions per clock because, with its 14-stage pipes, it is piped much deeper than the 750's 4 stages, so it can be clocked high.

This is also the reason why you will NEVER see a 2GHz 750 until the process size reaches something like 18nm.

There is no architecture in existence that can match the 750's IPC at its power draw. The problem with the 750 is that it can't be clocked higher.

In case you STILL haven't figured it out, there is a correlation between instruction pipeline stages, IPC, and clock frequency.
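The correlation between pipeline depth and clock frequency that this post leans on can be sketched with a standard first-order timing model: the total logic delay is divided across the pipeline stages, every stage also pays a fixed latch overhead, and the cycle time is set by the slowest stage. The delay numbers below are purely illustrative, not measurements of the 750 or Ivy Bridge:

```python
def max_clock_ghz(total_logic_delay_ns, stages, latch_overhead_ns=0.05):
    """First-order pipeline model: cycle time = per-stage logic + latch overhead.
    Delays in nanoseconds, so 1/cycle_time gives GHz."""
    stage_delay_ns = total_logic_delay_ns / stages + latch_overhead_ns
    return 1.0 / stage_delay_ns

# Same total logic, split across a shallow vs a deep pipeline (made-up numbers):
shallow = max_clock_ghz(total_logic_delay_ns=2.0, stages=4)    # ~1.82 GHz ceiling
deep = max_clock_ghz(total_logic_delay_ns=2.0, stages=14)      # ~5.19 GHz ceiling
print(f"4-stage pipe:  {shallow:.2f} GHz max")
print(f"14-stage pipe: {deep:.2f} GHz max")
```

Deeper pipelines buy a higher clock ceiling but pay a larger penalty on every branch mispredict, which is the trade-off between clock speed and per-clock work that the two posters are arguing about.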


 


#23 PedanticGamer

PedanticGamer

    Bullet Bill

  • Members
  • 370 posts

Posted 17 December 2012 - 12:19 AM

I disagree with you that the Xbox runs better than the PS3 because it has a better GPU... All the ports from the 360 to the PC, ALL of them, are CPU-limited! And it's a pain in the a#$#... I've never seen a game ported from the 360 that puts any real work on the GPU. That's why we had so many problems with the PC versions! Our SLI and CrossFire setups were pretty much pointless.

Even Assassin's Creed 3, the latest one, is CPU-limited! And that's because the Xbox runs games on 1 core with 2 threads... Whether you have a quad-core or a dual-core CPU in your PC, the CPU usage is TOTALLY the same! Skyrim took about 6-8 months until Bethesda fixed it for the PC. It was CPU-limited, and you could see a GTX 580 at 10 fps in some areas! lol

I wonder... some developers mock the Wii U because it is GPU-centric. What will happen when the new consoles come out (PS4 and 720)? They will have to re-program their engines.


I personally have never had Skyrim go below 60fps since release, with everything maxed out.
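The CPU-limited vs GPU-limited argument above comes down to which side of the frame the engine spends longer on: the frame can only finish as fast as the slower of the two. A minimal sketch with hypothetical frame timings (not measurements from any real game):

```python
def bottleneck(cpu_frame_ms, gpu_frame_ms):
    """Classify which unit limits frame rate: the frame completes
    only when the slower of the CPU and GPU work is done."""
    frame_ms = max(cpu_frame_ms, gpu_frame_ms)
    fps = 1000.0 / frame_ms
    side = "CPU-bound" if cpu_frame_ms > gpu_frame_ms else "GPU-bound"
    return side, round(fps, 1)

# A CPU-limited port: swapping in a faster GPU changes nothing
# until the CPU-side frame time drops.
print(bottleneck(cpu_frame_ms=33.0, gpu_frame_ms=10.0))  # ('CPU-bound', 30.3)
print(bottleneck(cpu_frame_ms=33.0, gpu_frame_ms=5.0))   # ('CPU-bound', 30.3)
```

This is why the poster's SLI/CrossFire setups looked "pointless" on 360-era ports: adding GPU throughput to a CPU-bound frame leaves the frame rate unchanged.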

#24 GAMER1984

GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 17 December 2012 - 04:35 AM

I know this is a thread about the CPU, but I have to say this: at this point, why would anyone speculate that the GPU in the Wii U is anything other than a modified E6760? Given what Nintendo would be looking to accomplish with the Wii U (a modern-architecture, DX11-equivalent GPU, modern shader support, efficient and low-wattage), NO other GPU fits the bill. Pair that with the supposed email confirmations from AMD (believe them or not), and I don't see another GPU that fits the PUZZLE as well.

Edited by GAMER1984, 17 December 2012 - 04:36 AM.


#25 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 17 December 2012 - 05:11 AM

I personally have never had Skyrim go below 60fps since release, with everything maxed out.


I wrote about specific areas (cells in the world), not the entire game. And it is CPU-limited: if you edit the *.ini files, you can see that by default it is set up for 2 CPU cores. If you want your GPU to do some extra work, you must address it inside the *.ini file. I love to edit and mod my games...

But now Skyrim is smooth, so there is no need.

Edited by Orion, 17 December 2012 - 05:15 AM.


#26 Medu

Medu

    Green Koopa Troopa

  • Members
  • 47 posts

Posted 17 December 2012 - 11:32 AM

Do NOT respond to me again, and do NOT post a link to something you claim I don't understand when you have no clue what you are linking to. You have posted NOTHING that doesn't back up what I have already said.

Fetch takes an instruction out of the instruction cache (which is massive in the Wii U CPU) and puts it in a queue to be executed. Execute EXECUTES THE INSTRUCTION; as long as the instruction remains in the instruction queue, it can be reused over and over with ridiculous speed.

Retire REMOVES an instruction from the queue once it is no longer needed, to make room for something else.

Ivy Bridge doesn't get six instructions per clock because, with its 14-stage pipes, it is piped much deeper than the 750's 4 stages, so it can be clocked high.

This is also the reason why you will NEVER see a 2GHz 750 until the process size reaches something like 18nm.

There is no architecture in existence that can match the 750's IPC at its power draw. The problem with the 750 is that it can't be clocked higher.

In case you STILL haven't figured it out, there is a correlation between instruction pipeline stages, IPC, and clock frequency.


Odd how you have changed your tune. A few months ago there was no question in your mind that the Wii U had a Power7 CPU and no way it would have a 3-core Broadway-based CPU; what has changed? (It's all there in the history.) I will no longer argue technical points, but younger members on the board should know to take everything you say with an unhealthy amount of salt.

#27 Desert Punk

Desert Punk

    Chain Chomp

  • Members
  • 656 posts

Posted 17 December 2012 - 01:05 PM

Odd how you have changed your tune. A few months ago there was no question in your mind that the Wii U had a Power7 CPU and no way it would have a 3-core Broadway-based CPU; what has changed? (It's all there in the history.) I will no longer argue technical points, but younger members on the board should know to take everything you say with an unhealthy amount of salt.


You are wasting your time responding to 3Dude. I've had similar debates, but I think the reality is that this forum is mainly read by real Nintendo fanboys, which isn't a bad thing, but when it comes to the specification, realism isn't desired by many here. We already have a wealth of evidence about the Wii U's specification, based on how the multiformat games run and on what has been released by developers or hackers. Admittedly we don't have full details of the GPU, but hopefully that information will come. Dealing with 3Dude is like visiting an asylum and trying to convince an inmate who believes he is Napoleon that he is not. It is completely pointless and serves no purpose.

#28 Bunkei

Bunkei

    Red Koopa Troopa

  • Members
  • 55 posts

Posted 17 December 2012 - 02:58 PM

Odd how you have changed your tune. A few months ago there was no question in your mind that the Wii U had a Power7 CPU and no way it would have a 3-core Broadway-based CPU; what has changed? (It's all there in the history.) I will no longer argue technical points, but younger members on the board should know to take everything you say with an unhealthy amount of salt.


I believe one of the dev kits did feature a Power7 CPU, but for some reason Nintendo opted for the upgraded Broadway architecture. This would explain why IBM had to issue a "correction": the engineers who posted the tweet probably weren't aware that the final hardware specs had changed, especially in regard to the CPU.

Don't blame 3Dude for that, as many of us believed the same based on the information given.

Also, I wonder if perhaps backwards compatibility played a part... Could a Wii U with a Power7 be fully compatible with Wii games and peripherals? That's for the hardware experts to answer.

Edited by Bunkei, 17 December 2012 - 02:59 PM.


#29 Medu

Medu

    Green Koopa Troopa

  • Members
  • 47 posts

Posted 17 December 2012 - 03:51 PM

I believe one of the dev kits did feature a Power7 CPU, but for some reason Nintendo opted for the upgraded Broadway architecture. This would explain why IBM had to issue a "correction": the engineers who posted the tweet probably weren't aware that the final hardware specs had changed, especially in regard to the CPU.

Don't blame 3Dude for that, as many of us believed the same based on the information given.

Also, I wonder if perhaps backwards compatibility played a part... Could a Wii U with a Power7 be fully compatible with Wii games and peripherals? That's for the hardware experts to answer.


A company cannot change the CPU with a few months' notice; it would require a complete redesign of the mainboard, the dev tools, etc.

#30 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 17 December 2012 - 05:20 PM

That would depend entirely on the pin compatibility of the CPUs: were they similar enough? Also, as the CPU and GPU are effectively on their own mini daughterboard, it may only have required a redesign of that board rather than the whole mainboard.

Also, at the time, was the dev kit based purely on Wii U hardware, or was it an adapted PC with Wii U expansion cards for testing purposes? There are many things we do not know that could indeed allow switching CPU architecture much more easily than you describe.

It doesn't hurt to keep an open mind.

Edited by Alex Atkin UK, 17 December 2012 - 05:30 PM.

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 



#31 GAMER1984

GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 17 December 2012 - 07:02 PM

Well, it's funny that we are talking about dev kits now. I remember almost every early rumour stating a quad-core Power-architecture CPU clocked around 3GHz. Also, all the early rumours suggested the console was 50% more powerful than the PS3/360. Can someone break down what that means? Because Iwata also said that when you look at games like Assassin's Creed 3 running on the Wii U, it is only using about half of the power; you are seeing half of what it can do. Can we look into these quotes? What do "half" and "50% more powerful" really mean?
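The two quotes in this post can be reconciled with simple arithmetic, under the loose assumption that "power" means a single throughput number (the figures are rumour, not spec):

```python
ps360_power = 1.0                   # baseline, arbitrary units
wiiu_power = ps360_power * 1.5      # "50% more powerful" rumour
port_usage = 0.5 * wiiu_power       # Iwata: "only using about half of the power"

# Effective power of an early port relative to a fully-used PS3/360:
print(port_usage / ps360_power)     # 0.75
```

On that reading, a port exercising only half of the Wii U would land at 75% of a fully-used PS3/360, which would make early ports look worse than the older consoles even on hardware that is on paper more capable.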

Edited by GAMER1984, 17 December 2012 - 07:04 PM.


#32 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 19 December 2012 - 02:08 PM

Odd how you have changed your tune. A few months ago there was no question in your mind that the Wii U had a Power7 CPU and no way it would have a 3-core Broadway-based CPU; what has changed? (It's all there in the history.) I will no longer argue technical points, but younger members on the board should know to take everything you say with an unhealthy amount of salt.


What changed is that we opened it up and found a binary-compatible 750.

4 fetches, 6 executions, 2 retires, a 4-stage pipeline (+3 for SIMD), zero-penalty branch-miss protection, paired-single SIMD.

It's a 750; no way around it. Nothing else comes close. The question is WHICH 750. IBM publicly terminated the series in 2006 because of its inability to get the chip multicore and/or clocked higher. Pressure was high from Apple for a successor to the excellent G3 (the PPC 750's commercial computer CPU, as in the PowerMac G3). So IBM abandoned the family at the last minute for a deeply piped architecture that could reach clock speeds up to 1GHz and beyond and support SMP, to meet Apple's demands.

That chip was garbage: it had horrible yields and lousy general-purpose performance, with only fairly strong floating point saving it. It was a public disaster for Apple.

IBM would later release an improved G3 CPU (the 750), which, as a single core at 750MHz, outperformed dual-core G4s in excess of 1GHz. Apple was not happy.

IBM's last chance to keep Apple was in 2004/5. Still unable to get SMP support or higher clock speeds out of the 750 family for the G5, IBM publicly announced the family was a dead end and shelved it. They went with a die shrink and general fixer-upper of the G4, clocked up to 3GHz. It suffered all the same problems as the G4. Apple ended its contract with IBM over the G5 in 2006, in favour of a contract with Intel, resulting in the Intel Core line.

Xenon and Cell are both modified G5s.

Anyone who says Broadway ANYTHING without bringing these facts up is simply throwing crap against the wall.

Sometimes crap sticks.

The problem with guessing a 750 from the facts we had is that it is doing two impossible things for the 750 family. It has SMP support, which no 750 ever had; its inability to support multicore was one of the reasons the series was terminated in 2006 (another reason it "couldn't" have been a 750: the series was publicly terminated in 2006 and never made on a process smaller than 90nm). The other reason is that the family couldn't be clocked higher than 1GHz, at a time when 3-4GHz was what was demanded. The Wii U CPU is clocked at 1.25GHz, higher than any 750 in existence... and the reason why is that it's fabbed on a 45nm process, which is "impossible" because the family was publicly terminated before it ever got below 90nm.

The fact that IBM was still doing something with the last good architecture it had publicly terminated wasn't known until we found out it was still making specialty 750s for select clients, when one such client announced they needed to take one to Mars. Now we know there are years of undocumented entries in the 750 line.
