Wii U could more than double power consumption?
#1
Posted 28 December 2012 - 09:19 AM
Nintendo says the power consumption of the Wii U console alone is 75W. I read on Eurogamer, and in the hacker's teardown, that the Wii U draws a 33W maximum when playing games. That is very impressive considering the 360 and PS3 were over 200W at launch, and even now, after numerous design revisions, the 360/PS3 use more than double 33W.
So if the Wii U is using 33W on games at the moment, could Nintendo be de-clocking the Wii U on purpose, and then possibly unleash the Wii U's real processing power and fast disc read speeds for later games? That would explain the 75W....
- Stephen, Dragon, SoldMyWiiUAndLeftTheForums and 2 others like this
#2
Posted 28 December 2012 - 09:27 AM
#3
Posted 28 December 2012 - 09:57 AM
#5
Posted 28 December 2012 - 10:44 AM
#7
Posted 28 December 2012 - 11:46 AM
In the 6xxx series we see GPUs at 75 watts doing 800 GFLOPS, with 640 SPUs or something similar (in a PC). So if the Wii U GPU is 6xxx series, that's a good thing compared with an RV7xx series that consumes lots of power. And it's more likely a 6xxx series, because an RV7xx chipset at 30-40 watts would be weak.
A question of mine: is the 33W or 75W figure at idle, or at peak? If it's 75W at peak, then that's not a good thing... lol... If it's at idle, forget what I said above.
after edit:
I read at NeoGAF that a modern modified chip can give 700 GFLOPS at 25 watts, so there is no problem if Nintendo achieved something similar. Even 450-500 GFLOPS is really good, more like 3x the current consoles.
Edited by Orion, 28 December 2012 - 12:31 PM.
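The performance-per-watt comparison in the post above can be put into a quick sketch. The GFLOPS and wattage figures below are the forum estimates quoted in this thread, not confirmed hardware specs:

```python
# Rough GFLOPS-per-watt comparison using the figures quoted in this thread.
# These are forum estimates, not confirmed hardware specs.
parts = {
    "Radeon 6xxx-class GPU (PC)": (800, 75),       # (GFLOPS, watts)
    "modern modified chip (NeoGAF figure)": (700, 25),
}

efficiency = {name: gflops / watts for name, (gflops, watts) in parts.items()}
for name, gpw in efficiency.items():
    print(f"{name}: {gpw:.1f} GFLOPS per watt")
```

On these numbers the "modern modified chip" would be nearly three times as efficient per watt as the desktop part, which is the crux of the argument that a low wattage doesn't automatically mean low performance.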
#8
Posted 28 December 2012 - 12:15 PM
#9
Posted 28 December 2012 - 12:45 PM
As has been said elsewhere, the Wii U is much more efficient with its processors than the other consoles. I'm waiting to see its true potential.
#10
Posted 28 December 2012 - 12:50 PM
- Mitch likes this
#11
Posted 28 December 2012 - 12:51 PM
Not to mention the Wii U is usually cold after playing for a while, also meaning it's not likely being pushed to the max. If the underclocked feature isn't overrideable in any way, then it will very likely hold the console back.
#12
Posted 28 December 2012 - 01:56 PM
Well, I for one appreciate the lower wattage. The PS3 eats power and makes a hell of a racket with it, only surpassed by the Xbox, which is almost unbearable. The Wii U is nice and quiet, which I like.
As has been said elsewhere, the Wii U is much more efficient with its processors than the other consoles. I'm waiting to see its true potential.
A lot of people say that; have you all got the older models or something? Because my slim 360 (you know, the shiny black one) is whisper quiet when I'm playing it! My PC, on the other hand, has 3 x 120mm case fans plus the CPU fan, graphics card fans and PSU fan, which when they all get going are almost distractingly noisy; not hugely loud, but very whirry (yeah, that's a word!).
#13
Posted 28 December 2012 - 02:53 PM
Nintendo said they intentionally underclocked their CPU. Made me wonder if there is an override code or similar to get more power out of its CPU.
Not to mention the Wii U is usually cold after playing for a while, also meaning it's not likely being pushed to the max. If the underclocked feature isn't overrideable in any way, then it will very likely hold the console back.
Nintendo did not say that anywhere, ever. Espresso is the highest-clocked 750 ever released, at 1.25 GHz.
With a 4-stage instruction pipeline, it CAN'T be clocked much higher.
The PSU is rated at 75 watts. That's what's drawn from the wall; not all of that can be used, as some of it is lost as thermal waste.
So we are looking at, at most, 60 watts, which goes to the wifi (both internet and GamePad), the disc drive, all the USB slots (although it appears they are unpowered, so better have a separate power source for your hard drive, or hope a Y-cable is enough), flash memory, lights, RAM, the audio DSP, the ARM co-/security processor... and the CPU and GPU.
That being said, all of that does indeed appear to only take 33 watts. The most likely explanation, of course, is that the system simply isn't under load.
That doesn't mean the CPU can be clocked higher; it means it can do more instructions per cycle than it's being given.
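The budget above can be put into a back-of-the-envelope sketch. The 80% conversion efficiency is an assumption (the post itself only says "some of it is lost as thermal waste"); the 33W figure is the Eurogamer/teardown in-game measurement quoted earlier in the thread:

```python
# Back-of-the-envelope Wii U power budget, using the figures from this thread.
# The 80% conversion efficiency is an assumption, not a measured value.
psu_rated_watts = 75                           # label on the PSU
efficiency = 0.80                              # assumed conversion efficiency
usable_watts = psu_rated_watts * efficiency    # ~60 W available to the board

measured_game_draw = 33                        # Eurogamer / teardown in-game figure
headroom = usable_watts - measured_game_draw

print(f"usable: {usable_watts:.0f} W, measured: {measured_game_draw} W, "
      f"headroom: {headroom:.0f} W")
```

On these assumptions roughly half the usable budget is untouched during games, which is what the "system simply isn't under load" reading is based on.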
- Kokirii likes this
#14
Posted 28 December 2012 - 03:14 PM
What is even more curious, though, is that the AC rating on the UK Wii U PSU is around 207W, which seems way too high for a 15V 75W PSU.
I wonder if it's labelled incorrectly, as looking on Google I see universal PSUs rated as 100-240V 1.5A, which makes more sense, drawing around 90W at the mains at 100V.
Back to CPU speed, though: nobody seems to have considered exactly WHERE the hacker got the numbers from. Does he know for sure those are the highest clock rates, or the clock rate at a specific point in the OS, in a specific game, etc.? There is nothing to suggest the Wii U cannot alter its clock rate according to what software you are running, so the fact a hacker managed to read the clock speed during an unknown event still tells us nothing about the maximum clock rate the hardware is designed for.
Bottom line though, as you point out, the Wii U clearly is not being used efficiently at this point. If it's only using around 1/3 of its maximum wattage, it doesn't matter whether that's due to underclocking or just inefficient programming. Either way it implies the console is capable of much more than we have seen so far. The question is whether that unused power is in the CPU, GPU, or a little of both.
Edited by Alex Atkin UK, 28 December 2012 - 04:04 PM.
Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK
#15
Posted 28 December 2012 - 03:22 PM
- Kiki Neko-Chan likes this
#16
Posted 28 December 2012 - 03:54 PM
Just because it's there shouldn't make it mandatory. Not all PS3 games use the motion controls; not all PS Vita games use the rear touch pad. Heck, they might as well complain that they couldn't find a use for the camera so didn't bother porting; it's just as illogical.
Of course, the reality is that these are excuses. If they saw a big market for their games on Wii U, they would port them. The problem is that there is little motivation to release on Wii U, as it's a tiny market compared to PS3/Xbox. Nintendo is just not seen as a big player in mass-market gaming anymore, so it's going to be dragging along for some time. Although once the next-gen PSN/Xbox comes along, the Wii U will suddenly be the current-gen console with the bigger install base out of the bunch, so it will be interesting to see what happens.
It's always a difficult time, though, when new consoles come out and many developers are clinging to the previous generation until sales start to drop. I can imagine the other consoles having exactly the same problem as the Wii U right now; it's going to take something big to convince most people to jump onto a new console. Because as long as people continue buying current-gen, there is little motivation to spend the money porting to newer hardware.
What with developers going bankrupt left and right, it's easy to understand the reluctance to gamble on smaller markets.
Edited by Alex Atkin UK, 28 December 2012 - 04:09 PM.
Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK
#17
Posted 28 December 2012 - 04:02 PM
Not true; the Wii U PSU is rated as 15V 75W OUTPUT, so the conversion loss after that point could be very low. I would assume Nintendo chose 15V DC for a reason.
What is even more curious is that the AC rating on the UK Wii U PSU is around 207W, which seems way too high for a 15V 75W PSU.
I wonder if it's labelled incorrectly, as looking on Google I see universal PSUs rated as 100-240V 1.5A, which makes more sense, drawing around 90W at the mains at 100V.
Back to CPU speed, though: nobody seems to have considered exactly WHERE the hacker got the numbers from. Does he know for sure those are the highest clock rates, or the clock rate at a specific point in the OS, in a specific game, etc.? There is nothing to suggest the Wii U cannot alter its clock rate according to what software you are running, so the fact a hacker managed to read the clock speed during an unknown event still tells us nothing about the maximum clock rate the hardware is designed for.
No, it does not work like that. Yes, it says 15 volts and five amps; yes, that technically comes out to 75 watts output; but NO, that doesn't all make it to the system.
I was being VERY generous with conversion loss, as efficiency averages around 75%, which would give 56 watts.
I am getting really, REALLY tired of all this BS directed at marcan. Where was your mother-doging at Team Twiizers when they hacked the Wii? Where was your mother-doging about the no-leak, no-disclosure policy when they gave you the Homebrew Channel? The media player?
Where was your calling hector and/or Team Twiizers incompetent moron hackers who don't know what they are doing when they gave Wii users unbrickable Wiis?
You weren't mother-doging then when code was withheld.
Only now that it's convenient for you guys and your fantasies, now Twiizers are a bunch of clowns who don't know what they are doing.
This crap is REALLY starting to piss me off.
#18
Posted 28 December 2012 - 04:09 PM
The PSU is rated at 75 watts. That's what's drawn from the wall; not all of that can be used, as some of it is lost as thermal waste.
So we are looking at, at most, 60 watts, which goes to the wifi (both internet and GamePad), the disc drive, all the USB slots (although it appears they are unpowered, so better have a separate power source for your hard drive, or hope a Y-cable is enough), flash memory, lights, RAM, the audio DSP, the ARM co-/security processor... and the CPU and GPU.
No, it's not like that. If Nintendo says the Wii U can go up to 75W with all USBs in use, and if the Wii U PSU is, for example, 80% efficient, then the wall draw is 75 watts plus the conversion loss, plus some extra watts of headroom for safety. So with my 80% efficiency example that's about 93 watts, plus 20-30 watts for safety: overall a ~120 watt input draw. It always works like that.
For example, my new PSU for my PC is a 90%-efficient gold-rated 850 watts, so it's 850 plus 10% (85 watts) plus 80 watts certified for safety, about 1015 watts overall. But I can only use 850 watts, because that's what I wanted to buy: an 850W PSU. You see, you can't sell somebody an 850 watt PSU where, after efficiency losses, the usable power might be half its raw power; the guy who needs the 850 watts would find it useless. You could easily sue the company for false advertisement, hehehe. I hope you understand.
Edited by Orion, 28 December 2012 - 04:21 PM.
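The underlying point of the post above (a PSU's wattage rating is its DC output, and the wall draw is higher by the conversion loss) can be written out as a small sketch. The 80% and 90% efficiency values are the examples used in the post, not measured figures:

```python
# Input (wall) draw implied by a given DC output and conversion efficiency.
# The efficiency values are the examples used in the post above.
def wall_draw(output_watts, efficiency):
    """Watts drawn from the mains to deliver output_watts of DC."""
    return output_watts / efficiency

print(wall_draw(75, 0.80))   # Wii U example: 75 W out -> 93.75 W from the wall
print(wall_draw(850, 0.90))  # 850 W gold-rated PC PSU -> ~944 W from the wall
```

Note that the exact form is output divided by efficiency, not output plus the efficiency percentage, though for these values the two come out close (93.75 W vs the post's "about 93 watts").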
#19
Posted 28 December 2012 - 04:18 PM
No, it does not work like that. Yes, it says 15 volts and five amps; yes, that technically comes out to 75 watts output; but NO, that doesn't all make it to the system.
I was being VERY generous with conversion loss, as efficiency averages around 75%, which would give 56 watts.
I am getting really, REALLY tired of all this BS directed at marcan. Where was your mother-doging at Team Twiizers when they hacked the Wii? Where was your mother-doging about the no-leak, no-disclosure policy when they gave you the Homebrew Channel? The media player?
Where was your calling hector and/or Team Twiizers incompetent moron hackers who don't know what they are doing when they gave Wii users unbrickable Wiis?
You weren't mother-doging then when code was withheld.
Only now that it's convenient for you guys and your fantasies, now Twiizers are a bunch of clowns who don't know what they are doing.
This crap is REALLY starting to piss me off.
Except most of the conversion loss is in the AC-to-DC stage; the 75W output is AFTER that, and in fact I pointed out that for 75W output it's not unusual to be drawing 90W input.
So, thinking purely about the DC-to-DC stage, if you look at the specs for the PicoPSU at around the same power capabilities, it's 86% efficient at 1A and 96% efficient at 5A.
Taking that as a guideline, it would mean at full load the Wii U could actually use 72W, although it's unlikely Nintendo bundled a PSU that was ever expected to be maxed out, so it's probably less than that. But even if we assume we end up with only 65W, that still implies the Wii U is not being maxed out, which you even said yourself is likely.
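Using the PicoPSU-style efficiency figures above as a stand-in for the Wii U's (unknown) on-board regulators, the usable-power arithmetic works out as:

```python
# Usable power after the DC-DC stage, using the PicoPSU-style efficiency
# figures quoted above as a stand-in for the Wii U's (unknown) regulators.
dc_input_watts = 75  # rated output of the external 15 V brick

usable = {
    "light load (1 A)": dc_input_watts * 0.86,
    "full load (5 A)": dc_input_watts * 0.96,
}
for load, watts in usable.items():
    print(f"{load}: {watts:.1f} W usable")
```

This is where the 72W full-load figure in the post comes from; the point being debated is how much of that the console actually receives in practice.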
As for the clock speed argument, you seem to be glossing over the fact that even hackers are human; they make mistakes.
What about how Wii homebrew stopped working with newer Wiimotes, because whoever reverse-engineered how to talk to the Wiimotes got it wrong? Yes, it worked, but it wasn't the way Nintendo was talking to them, so when Nintendo changed the later Wiimotes, it broke compatibility with the old method, as it was never intended to work like that.
They are human beings, reverse-engineering hardware/software which is designed to obfuscate what is going on. They could easily be reading the clock speed from the wrong place. Heck, even on PC, when developing open-source drivers, developers often mess things up because of having to reverse-engineer something, and it turns out they went about it the wrong way. I can't even get the on-board voltage sensors to display right on my motherboard. I even have a motherboard where I had to disable power management because the OFFICIAL drivers for the Intel Ethernet chipset don't work properly. Intel once released a CPU with a major bug that caused things to crash; human beings make mistakes.
So no, I am not BSing him at all; I highly respect the work of the hackers, who allowed me to have my games installed on the HDD for Xbox and Wii. However, taking any leak obtained purely via hacking/reverse engineering as outright fact is just plain silly. You never know what they might discover tomorrow that completely contradicts what they thought they knew yesterday. Not least the fact that if Nintendo DID underclock in the firmware, would a hacker even be able to find out the native clock speed?
Edited by Alex Atkin UK, 28 December 2012 - 04:37 PM.
- Arkhandar likes this
Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK
#20
Posted 28 December 2012 - 04:30 PM
No, it's not like that. If Nintendo says the Wii U can go up to 75W with all USBs in use, and if the Wii U PSU is, for example, 80% efficient, then the wall draw is 75 watts plus the conversion loss, plus some extra watts of headroom for safety. So with my 80% efficiency example that's about 93 watts, plus 20-30 watts for safety: overall a ~120 watt input draw. It always works like that.
For example, my new PSU for my PC is a 90%-efficient gold-rated 850 watts, so it's 850 plus 10% (85 watts) plus 80 watts certified for safety, about 1015 watts overall.
No, all of you, back to school.
1. I was EXTREMELY generous; efficiency drops like a rock at low power loads, and extremely cheap mass-produced PSUs don't get 96% efficiency.
2. Maximum safe output ≠ what the system receives.
3. Products typically weigh in using about 60% of the max output of the PSU. No device has EVER used the max certified output, or even come close.
4. I was already EXTREMELY generous; reel in the fantasies.
5. 1.25 GHz is an EXTREMELY high clock speed for an architecture with a 4-stage instruction pipeline. If you want to continue this clock speed nonsense, FIND a 4-stage architecture clocked higher. Since it's such a simple matter, surely you must be able to find lots of 2 and 3 GHz processors with 4-stage instruction pipelines.
Edited by 3Dude, 28 December 2012 - 04:35 PM.
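Point 5 above rests on the textbook pipelining model: cycle time is the combinational logic delay divided across the stages, plus a fixed per-stage latch overhead, so fewer stages means a longer cycle and a lower clock ceiling. The nanosecond figures below are invented purely for illustration, not Espresso's real timing:

```python
# Toy model of why pipeline depth bounds clock speed (textbook pipelining):
#   cycle_time = total_logic_delay / stages + per-stage latch overhead
# The nanosecond figures are invented for illustration only.
def max_clock_ghz(total_logic_delay_ns, stages, latch_overhead_ns=0.05):
    cycle_ns = total_logic_delay_ns / stages + latch_overhead_ns
    return 1.0 / cycle_ns  # 1 / ns = GHz

shallow = max_clock_ghz(total_logic_delay_ns=3.0, stages=4)
deep = max_clock_ghz(total_logic_delay_ns=3.0, stages=20)

print(f"4-stage: {shallow:.2f} GHz, 20-stage: {deep:.2f} GHz")
```

With these made-up numbers the 4-stage design tops out far below the 20-stage one, which is the shape of the argument: deep pipelines (like the 360/PS3-era CPUs) buy clock speed, while a short 4-stage pipeline spends that headroom on latency per instruction instead.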