Eye_Of-Core's Content

There have been 38 items by Eye_Of-Core (Search limited from 25-September 23)



#249359 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 02 October 2013 - 12:24 PM in Wii U Hardware

http://www.neogaf.co...postcount=10883 - open the link and read the comment... So I mentioned that the September/October update could bump clocks, and it seems I may have been right after all. Can anyone test the temperature and power consumption?

 

The Wii U's normal power consumption is 30 watts, not 40 watts; the NeoGAF user had not calibrated his watt meter. Thanks to 3Dude for the info...

 

So please, if you can measure the power consumption properly, do it... Also, is the fan louder, and is the Wii U box hotter or noticeably warmer?




#249282 Wii U hardware; bits and pieces connected together...

Posted by Eye_Of-Core on 02 October 2013 - 05:52 AM in Wii U Hardware

*facepalm*

 

I understand your opinions, though I wanted a clean thread rather than updating this one, which was derailed. NVM.




#249063 Wii U hardware; bits and pieces connected together...

Posted by Eye_Of-Core on 01 October 2013 - 08:01 AM in Wii U Hardware

No, you spammed the same topic with practically the exact same information while the old topic was on the same page, like a silly pony who can't read the TOS.

 

I abandoned this thread; it was inactive as hell...

 

Also, how is it practically the exact same information? It includes information from this thread, but it also has a lot more information and discoveries, so claiming it is practically the same is invalid. Was there a calculation of the GPU's power consumption? Was there a comparison of the Xbox 360's Xenon and the Wii U's Espresso CPU? Was there information on the bandwidth of the DDR3 RAM and eDRAM in the previous thread I made? Was there a comparison of latency between the GDDR3 in the Xbox 360 and PlayStation 3 and the DDR3 in the Wii U? A pipeline, cache, and execution-unit comparison between the Xbox 360 and PlayStation 3? No? Then how is it practically the same information, when that much detail was missing from my previous thread? Was there anything, even a tiny bit of information, about these things? No? Then you now know that you failed... Learn to see the difference. Read, read for once...




#249028 Why I want to develop for Wii U hardware

Posted by Eye_Of-Core on 01 October 2013 - 06:32 AM in Wii U Hardware

If you want help with storyline, ideas and such... I can help, even with music... I know a couple of people who are non-mainstream legends in my country; my dad is one of them. Want punk, metal, reggae, country or something else... I could help.




#249019 Wii U hardware; bits and pieces connected together...

Posted by Eye_Of-Core on 01 October 2013 - 06:23 AM in Wii U Hardware

The max consumption of the console is not 40 watts. Those tests done by 'gamz girnolistz' were done with low-grade, Radio Shack, crap-brand watt testers, which weren't even calibrated.

Recent tests were conducted with professional products, properly calibrated, and show the system does indeed fluctuate when playing games, and that the 'gamz girnolistz' tests were off at times by as much as 10 watts.

Wait a sec, is this a new thread? What happened to the old one? Still on the front page... There, merged.

 

I abandoned the previous thread because I wanted a new thread free from derailing; now you have merged the two threads and screwed everything up. People can't find the new thread anymore. You've screwed a lot of people, good job. You should have asked first and acted after... jumping the gun like a fool.




#248989 Wii U hardware; bits and pieces connected together...

Posted by Eye_Of-Core on 01 October 2013 - 05:00 AM in Wii U Hardware

Next gen is just a term for a new generation of consoles. Of course the Wii U is next gen.

 

It's not something hardware related imo.

 

Of course I know that, and please do not derail the thread. This is about the hardware and technological aspects: some people say the Wii U is not "next gen" because of its hardware, and this thread shows that it is next generation, both in the sense that it is the successor to the Wii and in how its hardware compares to the previous/7th-generation consoles. There are still people out there saying otherwise, and the least I can do is spread the knowledge...




#248986 Wii U hardware; bits and pieces connected together...

Posted by Eye_Of-Core on 01 October 2013 - 04:47 AM in Wii U Hardware

Wii U CPU:
- Tri-core IBM 45nm PowerPC 750CL/G3/Power7 hybrid
- L2 cache: Core 0: 512KB, Core 1: 2MB, Core 2: 512KB
- Clocked at 1.24GHz
- 4-stage pipeline; not a Xenon/Cell-style CPU-GPU hybrid
- Produced at IBM's advanced CMOS fabrication facility
- eDRAM used as L2 cache embedded in the CPU (Power7 memory implementation)

*The Wii CPU core was 20% slower than an Xbox 360 core; since the Wii U CPU is modified/enhanced and clocked 65-70 percent higher, two Wii U cores should be on par with or exceed all three Xbox 360 cores, and if all three cores are used, the Wii U CPU is 50+ percent stronger/faster than the Xbox 360 processor and faster than the PlayStation 3 processor. http://gbatemp.net/t...-7#post-4365165

 

*X360 Xenon: 1879.630 DMIPS*3 = 5638.90 DMIPS @ 3.2 GHz vs Wii U Espresso: 2877.32 DMIPS*3 = 8631.94 DMIPS @ 1.24 GHz
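
For anyone who wants to verify that arithmetic, here is the calculation behind those figures; the per-core DMIPS ratings are the values quoted above, not official benchmarks:

# Sanity check of the DMIPS figures quoted above. The per-core ratings are
# this thread's values, not vendor benchmarks; totals match up to rounding.
cpus = {
    "Xenon (X360)":     {"dmips_per_core": 1879.63, "cores": 3, "mhz": 3200},
    "Espresso (Wii U)": {"dmips_per_core": 2877.32, "cores": 3, "mhz": 1240},
}
for name, c in cpus.items():
    total = c["dmips_per_core"] * c["cores"]
    per_mhz = c["dmips_per_core"] / c["mhz"]
    print(f"{name}: {total:.2f} DMIPS total, {per_mhz:.2f} DMIPS/MHz per core")
# Xenon: ~5638.89 DMIPS (0.59 DMIPS/MHz); Espresso: ~8631.96 DMIPS (2.32 DMIPS/MHz)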

 

*X360 Xenon 32-40 stage pipeline vs Xbox One/PlayStation 4 Jaguar 16 stage pipeline vs Wii U Espresso 4 stage pipeline

 

*X360 Xenon in-order 3 Execution Units vs Xbox One/PlayStation 4 Jaguar out-of-order ? execution units vs Wii U Espresso out-of-order 15 execution units

 

*X360 Xenon 1MB shared L2 Cache vs Xbox One/PlayStation 4 4MB shared L2 cache vs Wii U Espresso 2MB/512KB/512KB L2 cache

 

The Wii U CPU is next generation compared to the Xbox 360, and people who say otherwise should just shut up, be ashamed, have a seat, accept the facts and deal with it. Here is the proof: http://systemwars.com/forums/index.php/topic/112794-no-freaking-way-40-stage-pipeline/

 

*Since the Xbox 360 has a 32-to-40-stage pipeline and the PlayStation 3 does as well, both suffer severe penalties in usable performance, while the Wii U with its 4-stage pipeline is far more efficient by comparison, so more of its raw performance can actually be used; imagine how bad the Xbox 360/PlayStation 3 look next to the Wii U.

 

 

*The Wii U CPU, codenamed Espresso, could actually be three to four times faster than the Xbox 360's Xenon, since Xenon has less cache, far fewer execution units, and a pipeline that is 8 to 10 times longer, which is awful for a lot of tasks; it is also in-order, versus Espresso which is out-of-order and thus better with unpredictable code, AI, AI pathfinding and so on...

 

- Dual-core ARM Cortex 8 for background OS tasks, clocked at 1GHz, with 64KB of L1 cache per core and 1MB of SRAM as L2 cache; an evolution of the "Starlet" chip
- Single ARM9 "Starlet" core for backward compatibility, rumored to have higher clocks

Wii U Memory:
- DDR3: 2GB total, 1GB for the OS, 1GB for games
- eDRAM: 32MB VRAM + 4MB GamePad + 3MB CPU L2 cache
- Clocked at 550MHz
- eDRAM acts as a "unified pool of memory" for the CPU and GPU, practically eliminating latency between them

 

*The Wii U's DDR3 RAM bandwidth has a theoretical maximum of 51.2GB/s, since it has four 512MB chips and not one large 2GB chip, so anyone thinking its maximum bandwidth is a mere 12.8GB/s is tech illiterate. The Xbox 360 had a theoretical maximum of 22.8GB/s, though a bottleneck in its poor FSB turns that down to a mere 10GB/s.
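
As a sanity check, the standard DRAM bandwidth formula is transfer rate times bus width; here it is applied to DDR3-1600, which is what the 12.8GB/s figure implies. Whether the four chips behave as one 64-bit bus or as four independent channels is exactly the point being argued, so both totals are shown:

# Standard DRAM bandwidth: transfer rate (MT/s) * bus width (bytes/transfer).
# DDR3-1600 is assumed because that is what the 12.8 GB/s figure implies.
# Which total applies depends on how the four 512MB chips are wired,
# which is exactly what is being argued here.
mt_per_s = 1600e6        # DDR3-1600 effective transfer rate
bus_bytes = 8            # one 64-bit channel = 8 bytes per transfer

one_channel = mt_per_s * bus_bytes / 1e9     # 12.8 GB/s
four_channels = 4 * one_channel              # 51.2 GB/s, the reading above
print(f"one 64-bit channel:  {one_channel:.1f} GB/s")
print(f"four such channels: {four_channels:.1f} GB/s")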

 

*The Xbox 360 has GDDR3 RAM that is bottlenecked by Xenon's FSB, so it cannot saturate the theoretical maximum of 22.8GB/s, since the FSB can only handle 10GB/s; and the latency of GDDR3 is atrocious compared to the DDR3 RAM in the Wii U. Latency is very important for the CPU, since the lower the latency, the faster the transfers between the CPU/GPU and RAM. The PlayStation 4 will have similar issues to the Xbox 360 when it comes to latency, since GDDR5 is the successor to GDDR4, which is the successor to GDDR3, and all of them have higher latency than DDR3.

 

Wii U GPU:
- VLIW5/VLIW4 Radeon HD 5000/6000 series, 40nm
- DirectX 11/OpenGL 4.3/OpenCL 1.2/Shader Model 5.0
- Supports GPGPU compute/offload
- Customized, using Nintendo's custom API codenamed GX2 (GX1 was the GameCube's)
- Clocked at 550MHz
- Produced on TSMC's advanced 40nm CMOS process
- Uses 36MB of eDRAM as VRAM
- 4MB of eDRAM is allocated to the streaming feed for the GamePad
- The GPU is customized; judging by the modifications to the GPUs in their previous consoles, we can presume Nintendo won't waste a single mm^2 on unneeded features and has tailored it to their own needs.

*Eyefinity has been present on AMD cards since the Radeon HD 5000 series, so it is at least a Radeon HD 5000 w/ TeraScale 2.

*If it were a Radeon HD 4000 series part with 320 SPUs, it would be the 55nm Radeon HD 4670; but since the Wii U uses Eyefinity and its GPU is 40nm, the Radeon HD 4000 theory is invalid.

*The Radeon HD 6000 was released in Q3 2010; the final Wii U silicon was finished in Q2 2012, and the Wii U was released in Q4 2012. Looking at the gap between E3 2011 and the final Q2 2012 silicon, and given that the Radeon HD 6000 evolved from the Radeon HD 5000, which evolved from the Radeon HD 4000, I presume switching to a newer but similar GPU and architecture was not a problem; all of these GPUs were produced at TSMC's fabs.

*The Wii U at E3 2011 and its development kits had a Radeon HD 4850; it is rumored that a newer Wii U development kit replaced the 4850 with a modified/customized/cut-down Radeon HD 6850.

*The Radeon HD 6850 delivers 1.5 TFLOPS at a clock of 775MHz, with 1GB of GDDR5 VRAM and 130GB/s of bandwidth.

*The GPU in the Wii U is clocked exactly 30% lower, at 550MHz, and if it has 1/3 of the SPUs, it has 0.352 TFLOPS. It has 36MB of eDRAM with 70-130/275/550GB/s of bandwidth. 2-player co-op, as in Black Ops 2's Zombie mode for example, uses Eyefinity(?) to stream two different in-game images/views.
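
That 0.352 TFLOPS figure follows from the usual peak-throughput formula, SPUs x 2 FLOPs per clock (multiply-add) x clock; note that the 320-SPU count is this thread's assumption, not a confirmed spec:

# Peak shader throughput: SPUs * 2 FLOPs/clock (multiply-add) * clock.
# The 320-SPU count is an assumption from this thread, not a confirmed spec.
def gflops(spus: int, clock_mhz: float) -> float:
    return spus * 2 * clock_mhz / 1000.0

print(gflops(960, 775))  # Radeon HD 6850 reference: 1488 GFLOPS (~1.5 TFLOPS)
print(gflops(320, 550))  # assumed 1/3-SPU Wii U config: 352 GFLOPS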

 

*Since the eDRAM in the Wii U's GPU, codenamed Latte, is embedded, its theoretical maximum bandwidth is 275GB/s or even 550GB/s; http://www.ign.com/b...#post-481621933

 

*90nm 16MB eDRAM can do 130GB/s of bandwidth, so the 45nm 32MB eDRAM in the Wii U should do the same. Plus, since the CPU's cache and the GPU both use eDRAM, latency is much, much lower and AI can be offloaded to the GPU; when embedded into the GPU, it should do 275/550GB/s.

Wii U notes:
- Can't use DirectX 11 because of the OS; only OpenGL 4.3/Nintendo's GX2 API, which is ahead of DX11
- Easier to program for; no multiple bottlenecks causing issues as on the Xbox 360 and PlayStation 3
- Efficient hardware with no bandwidth bottlenecks
- Launch games and ports use 2 cores; only a couple of ports/games use the 3rd core to a decent degree
- Wii U's CPU has much higher operations per cycle/MHz than the Xbox 360's/PlayStation 3's; it is unknown if it is faster overall
- Wii's CPU core was slower
- A minor Power7 memory-architecture implementation allows shaving 10 or more percent off texture size without loss of quality (special compression or something)
- Wii U power consumption at full system load is 40 watts (the highest load possible in its current state)
- Wii U's PSU/power brick is rated at 75 watts with 90% efficiency, so it can deliver 67.5 watts
- Wii U's flash storage, RAM, USB ports, motherboard, fan, and small secondary chips consume around 5 to 10 watts in total when fully stressed
- Wii U's SoC (CPU and GPU) has an estimated maximum power consumption of 30 to 35 watts
- Supports 4K displays, native 4K via HDMI 1.4b (possible 2D 4K games?)

*It may in fact have the most efficient performance per watt in the world among 45/40nm chips/SoCs.

 

*The Wii U's power brick/PSU is rated at 75 watts with an efficiency of 90%, so it can handle at most 68 watts without serious degradation; since the Wii U consumes 40 watts, there are 28 watts available, though I would only dare bump power consumption to 60 watts, i.e. 20 additional watts, if I could increase performance.

 

*Since the maximum power consumption of the Wii U is 40 watts and by my calculation the GPU consumes roughly a mere 10 watts, the Wii U's CPU could consume 15 to 20 watts and the rest of the system around 10 to 15 watts, depending on how much the CPU consumes, since that is the unknown factor. I am not counting any possible customizations of the Wii U's GPU; these are all rough estimates and we don't know the whole picture. The Wii U is a beast considering the process node it was built on, and probably the most efficient machine on that node in the world.
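
Tallying up the budget sketched in the last two paragraphs (every per-component figure here is a forum estimate, not a measured spec):

# Back-of-the-envelope power budget from the estimates above.
# All inputs are forum estimates, not measured specs.
psu_rated_w = 75
psu_efficiency = 0.90
usable_w = psu_rated_w * psu_efficiency    # 67.5 W deliverable
peak_draw_w = 40                           # claimed full-load consumption
headroom_w = usable_w - peak_draw_w        # ~27.5 W spare

gpu_w = 10                                 # rough GPU estimate from below
rest_w = (10, 15)                          # storage, RAM, USB, fan, etc.
cpu_w = (peak_draw_w - gpu_w - rest_w[1],  # 15 W if the rest takes 15 W
         peak_draw_w - gpu_w - rest_w[0])  # 20 W if the rest takes 10 W
print(usable_w, headroom_w, cpu_w)         # 67.5 27.5 (15, 20)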

In case you are wondering why some games run worse on the Wii U than on the 7th-generation consoles, I have a simple explanation: the Wii U hardware is noticeably different from the Xbox 360's or PlayStation 3's, because their processors are not pure CPUs; they handle GPU-related tasks well, whereas the Wii U's chip is primarily a CPU. Another reason is that developers don't spend the resources and time to port games properly to the Wii U, so the ports lack proper optimization and adaptation of their engines to the Wii U's hardware. Even so, although some ports perform worse than on the 7th-generation consoles, in most cases they run at a higher resolution and/or at native 720p/HD.

Most if not all 3rd-party launch games were made on older Wii U development kits that had 20 to 40% lower clocks than the final development kit Nintendo released near launch, so developers did not have much time to adapt their games to the final devkit, and the games ran poorly. Games like Darksiders 2, Warriors Orochi 3 Hyper, and Batman: Arkham City used only one or two cores, while the third was barely used, if at all, to aid the CPU side of the game's performance. Since most ports come from the Xbox 360 and/or PlayStation 3 versions, there are bound to be incompatibilities: the Xbox 360 and PlayStation 3 processors handle both CPU and GPU tasks, and the GPUs, RAM/memory, latency, and other factors differ from the Wii U, so optimization and adaptation are needed, though cheap ports, as always, tend to be a train wreck. Don't you agree?

 

The Xbox 360 and PlayStation 3 will be supported for the next three years, and this is in a way a negative thing, since it can really hold back the performance of games on the Wii U if those games are mostly ports from the Xbox 360 and PlayStation 3: the architectures of those two consoles are vastly different from the Wii U's. We may only see the Wii U shine after three years, when the Xbox 360 and PlayStation 3 stop being supported; until then those two consoles will hold back development and the time spent on the Wii U.

The Wii U may not be the "leap" that the Xbox One or PlayStation 4 are, but it is a leap over the Xbox 360 and PlayStation 3 even on a rough view, and when taking into account all the implementations, features, and "tricks", the gap is even bigger. The Wii U has more embedded RAM than the Xbox One, which has 32MB of eSRAM, while the Wii U has 36MB of superior eDRAM for the GPU; the eDRAM also blows away the GDDR5 in the PlayStation 4 in terms of bandwidth, if I am correct? 130/275/550GB/s on the Wii U versus 80GB/s on the PlayStation 4?

We need to take into consideration that the Wii U's CPU, Espresso, has a certain implementation from the Power7 architecture that allows the use of eDRAM; we know the CPU in the Wii U has a total of 3MB of eDRAM as L2 cache, and it could also use the main eDRAM pool as an L3 cache and maintain a connection with the GPU, creating HSA/hUMA-like capabilities and greatly reducing the latency of CPU-GPU communication and data transfer.

The Wii U's GPU in the very first alpha development kit was a Radeon HD 4000 series part, a Radeon HD 4850, which was 55nm and not the 40nm RV740. By the time of the Wii U's first unveiling, the Radeon HD 6000 had been on the scene for nearly a year (now almost three years), so Nintendo could easily have switched to the Radeon HD 6000 series, since it is basically an evolution of the Radeon HD 5000, which is a refinement of the Radeon HD 4000 series. This is further supported by the Wii U's use of Eyefinity features for the GamePad, as demonstrated by the Unity demo on the Wii U and by Call of Duty: Black Ops 2 in co-op zombie mode; the Wii U can also stream up to two images to GamePads, though that hasn't been used yet and maybe will be in the near future.

People also seem to forget about the power consumption of dedicated GPUs versus ones embedded into the motherboard, which naturally consume less; plus, the Wii U's GPU uses eDRAM, which consumes 1 to 2 watts at most, compared to GDDR5, which consumes around 9 or more watts per 2GB. The GPU in the Wii U is embedded, so it does not use PCI-E, an additional PCB, or extra chips, and it has the eDRAM embedded in it, so the eDRAM's power consumption could be reduced to practically zero.

Let's take, for example, the Wii U's GPU die size and the Radeon HD 6970's die size, and assume the Wii U GPU is a VLIW4-based Radeon HD 6000 series GPU and not VLIW5. The Radeon HD 6970 consumes 250 watts maximum, has a die size of 389mm^2 and 2GB of GDDR5, and is clocked at 880MHz. Cut it down to 320 SPUs, which would use 80mm^2, and consumption is lowered to roughly 83 watts; remove the 2GB of GDDR5 and it goes down to 70-73 watts. Now lower the clock from 880MHz to 550MHz, roughly 35% lower; if a 25% clock reduction already cuts power consumption in half, then at 35% the GPU's consumption goes down from 70-73 watts to roughly 14-15 watts without even being embedded, and we could easily shave off a couple more watts, most likely getting down to 10 watts.
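
A crude way to sanity-check that clock-scaling step is the classic CMOS dynamic-power model, P ~ f*V^2, which becomes roughly P ~ f^3 if supply voltage scales down with frequency; under that assumption (mine, not from the sources above) it lands in the same ballpark:

# Classic CMOS dynamic power: P ~ f * V^2, roughly P ~ f^3 if voltage is
# scaled down with frequency. A rough check of the clock-scaling step above;
# the 71.5 W starting point is the midpoint of the 70-73 W estimate.
p_base_w = 71.5
f_scale = 550 / 880                 # ~0.625, i.e. the ~35% clock reduction
p_scaled_w = p_base_w * f_scale**3
print(f"{p_scaled_w:.1f} W")        # ~17.5 W, near the 14-15 W claimed above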

 

We cannot really compare the Wii U's GPU to any off-the-shelf dedicated GPU, since some features found in regular dedicated GPUs are not needed when embedded; with some minor modifications I can see the Wii U having 384 SPUs, which at 550MHz would be 420 GFLOPS, rather than the 320 SPUs it would have as a standard dedicated GPU of that die size. If Nintendo did drastic modifications, they could even reach 500 SPUs and get very close to 600 GFLOPS. One of the homebrew hackers counted 600 shaders, so I wonder if Nintendo is using a technology that AMD has never used, which came from ATI (who also never used it), called "high density", which is going to be used in AMD's Steamroller cores. From the information AMD has released, "high density" can increase the density of a chip by 30%, reduce its size by 30%, and reduce power consumption.

 

*I won't link most of this information, though I can assure you I did a lot of research and digging on the internet, collecting bits and pieces and then putting them together into one picture, until I had that feeling called "a-ha!" or "EUREKA!"




#248363 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 27 September 2013 - 03:47 PM in Wii U Hardware

I am sorry that you continue to fail...




#248344 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 27 September 2013 - 01:44 PM in Wii U Hardware

I'm sorry, I call that fanboy BS. I've owned a Wii U since day one and I can confirm that that's just wishful thinking.

 

I am sorry that I can't stop you from going full r*****. If that were wishful thinking, there would not be droves of people on forums claiming that after the OS update their games had no stuttering and/or frame drops, i.e. more consistent frame rates. You most likely did not notice; you probably own only a few games, or never owned the couple of games that had frame rate issues which were resolved after the OS update.

 

Game performance can be affected by the OS; don't be in denial, you tech illiterates.




#248312 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 27 September 2013 - 10:03 AM in Wii U Hardware

Do you honestly believe that Nintendo would rather pay for an extra gig of RAM than a better flash storage unit? It's just an utter lack of optimization.
Iwata himself admitted in the Wii U Iwata Asks that once they had developed all these Wii U services, they had a really bad time sticking everything together into the OS. That not only translated into a horrible UI, but also resulted in wasted hardware resources, which is probably the case with the internal memory speeds and the 1GB of RAM allocated to the OS.

Source?

 

Why do you need a source? If you own a Wii U, you should have experienced it hands-on... Many people have noted that games load faster and run better.




#248256 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 27 September 2013 - 03:06 AM in Wii U Hardware

The OS is most likely the problem. Do you seriously think the Wii U's OS is not the problem? When it was updated, games were less laggy, had fewer frame drops, and ran at a more stable frame rate. Look at Windows versus Linux-based OSes: the first is bloatware and the second is more streamlined and less of a resource hog. We also need to take into account that this is really the first full OS Nintendo has worked on, their first home console that is multi-core, and their first platform with three cores, compared to the 3DS, which has two main cores, and the DS, which has a main core and a weaker secondary core.

 

If the problem is the RAM allocated to the OS, why don't they allocate it all and then, when a game is running, cut it down to, let's say, 512MB of RAM usage...




#248155 Lack of demand is keeping Soul Calibur 2 HD from seeing a Wii U/PS Vita release.

Posted by Eye_Of-Core on 26 September 2013 - 10:18 AM in General Gaming

Wasn't the GameCube version the best-selling one? Namco Bandai are a bunch of baka, acting like gaijin.




#248154 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 26 September 2013 - 10:12 AM in Wii U Hardware

It seems the Wii U's OS is not perfected yet; games downloaded digitally suffer from frame drops, and I can only point to the OS, so the September/October update could be a major OS patch. I was searching the internet for more information on the Wii U's GPU and I found this:

 

http://www.ign.com/b...775697/page-199

 

So a couple of guys on the IGN forums did some research, had some findings, and speculated... The guys at Beyond3D who say the Wii U's GPU is 176 GFLOPS are way off; the guys at NeoGAF had a rough estimate, based on the number of SPUs, of 352 GFLOPS; and the guys on the IGN forum reckoned that if some things are cut and it is based around the Radeon HD 6000 series, it could have 500 to 600 GFLOPS, i.e. 2x the Xbox 360's or 3x the PlayStation 3's GPU performance.
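
For reference, here is what each of those estimates implies for SPU count at the known 550MHz clock, using the same SPUs x 2 x clock formula as in my hardware thread; the back-calculation is mine:

# Implied SPU counts behind each GFLOPS estimate at the known 550 MHz clock,
# from peak = SPUs * 2 FLOPs/clock * clock. The mapping is my back-calculation.
for est_gflops in (176, 352, 500, 600):
    spus = est_gflops * 1000 / (2 * 550)
    print(f"{est_gflops} GFLOPS -> {spus:.0f} SPUs")
# 176 -> 160, 352 -> 320, 500 -> ~455, 600 -> ~545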




#248120 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 26 September 2013 - 05:33 AM in Wii U Hardware

How can I be in denial? Nintendo themselves stated that the September/October update is a "speed/performance" update, and they can further optimize the Wii U's OS. And how can you act like a hypocrite and assume I don't know that game performance in the end is up to the game developers? Do you think I am tech illiterate, or is your ego so big that you are blind?

 

Why would it not be possible to bump the clocks? Nintendo can, and it is not limited by the PSU/power brick: if the console could only draw 45 watts, it would ship with a 45-watt PSU, not a 75-watt one. The guys at NeoGAF measured the Wii U's power consumption with tools worth over $2,000 and it showed the Wii U consumes 40 watts; Eurogamer measured with a much cheaper and less reliable tool and got 33 watts... It is 40 watts, and they have enough room to bump the clocks by 10 to 20 percent on both the CPU and GPU. Please don't bring up PSU/power brick degradation, because that has been debunked. As long as the Wii U's maximum power consumption stays around 67.5 watts over a theoretical 10-year life span, the PSU situation will be just fine.




#247996 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 25 September 2013 - 02:18 PM in Wii U Hardware

It seems you two are in denial; are you pretendos?

 

The DS and 3DS had locked hardware features, and later, when they were unlocked, games and the OS performed better. The Wii U consumes 40 watts at maximum while the PSU is 75 watts, so they have headroom to increase clocks, especially the GPU's. We know the Wii U's OS was kind of wonky and is okay now, so the Wii U could have been rushed, and if that is the case, then maybe some features were not usable and stable because of the OS.




#247901 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 25 September 2013 - 04:21 AM in Wii U Hardware

The two most realistic possibilities are...

 

1. They bump the clocks of CPU and GPU

 

2. They increase the read speed of the drive, if possible




#247667 Shin’en: It’s not the hardware’s fault if devs can’t create good looking games

Posted by Eye_Of-Core on 23 September 2013 - 01:53 PM in Wii U Hardware

Need for Speed: Most Wanted for Wii U actually made proper use of some of the Wii U's capabilities: it had PC-like textures in places, though only on one or two sides of the buildings, and it had better, more refined lighting.




#247664 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 23 September 2013 - 12:29 PM in Wii U Hardware

So now Nintendo can unlock something that doesn't even exist? Priceless.

 

That is just a theory. There could be more SPUs; we don't have any die shot of the Radeon HD 6000 series, and we don't know how many transistors it has, how much it has been customized, or what the real difference is between the GPU in the Wii U and an off-the-shelf GPU. Nintendo may implement/unlock an option to reduce the OS's usage from 1GB to 512MB if necessary, so developers could use 1.5GB, plus they can bump the clocks higher for the CPU and/or GPU.

 

I know that the Wii U's CPU has Power7 memory architecture, though I don't know if a feature/"shortcut" between the CPU and GPU is available for 3rd parties to use. The Wii U's CPU uses eDRAM for its L2 caches and the Wii U's GPU uses eDRAM as VRAM, so using the same memory type means the Wii U has a unified pool of memory, or a hUMA/HSA-like architecture, and that would allow offloading AI pathfinding, level rendering, and other things to the GPU, reducing the load on the CPU, which has been doing tasks that work better on a GPU.

 

Also, Nintendo could have asked AMD for one piece of technology that AMD has but never really tried to use in a commercial product. I forgot its name, but it allows a much higher density of transistors and is planned to be used by AMD in the near future, so AMD may have taken the chance with Nintendo to try this relatively unproven technology/process on the Wii U, i.e. the Wii U is a lab rat for it.

 

http://media.bestofm...U-350382-13.png

 

AMD calls it "high density"; it was created by ATI, and AMD bought ATI in 2006. ATI, now AMD, designed the GPUs for the GameCube, Wii, and Xbox 360, and now the Wii U, PlayStation 4, and Xbox One.




#247638 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 23 September 2013 - 08:58 AM in Wii U Hardware

[The ratio (around 60% of the PSU's rated load, as I recall) makes sense if Nintendo were taking into account PSU aging, which seems likely, as Nintendo are known for making very reliable hardware that just keeps on going for years and years.]

 

http://www.tomshardw...ing-calculators

 

We can ignore that; Nintendo could have rated it at 75 watts with PSU aging in mind, so the true figure could be higher. Aging is maybe 5% over its entire lifetime.

 

[When I compared the Wii U's power consumption against the maximum the PSU is rated to output, the ratio was around the same as the Wii's power consumption against its PSU's rated maximum. That suggests to me that the Wii U is already drawing close to its design power.]

 

I must disagree.

 

[So yes the load may increase once developers really start tapping into all the hardware, but I don't think a firmware update is going to magically unlock more SPUs.  For one thing, the scan of the chip has been extensively examined and 320 SPUs was decided to be the most likely MAXIMUM based on its structure.  If it had double or triple that number, I'm pretty sure the experts would have seen that.]

 

Did you forget that the GPU inside the Wii U is heavily customized? I know it is VLIW4-based, plus there are no VLIW4 GPUs of any kind that use eDRAM or have exactly 320 SPUs. Nintendo is known for doing heavy customizations to its GPUs, and they will not waste a single mm^2 of silicon, so they will remove obviously unneeded features. Also, I looked at the die shot and I can't find what kind of electron microscope it was taken with; can we even see a single transistor? If it was taken on a microscope that can resolve below 40nm, we would be able to see individual transistors and other features.

 

Also, it is 40nm and made on a more mature and refined process than the original VLIW4 Radeon HD 6000 series; there was a rumor it is based around the HD E6850, and the very first alpha Wii U devkit had a Radeon HD 4870.

 

The Wii U's GPU is 146mm^2; remove the main eDRAM and the small eDRAM block and we have 100mm^2 for the GPU logic. The Radeon HD 6850 is 250mm^2 with 960 SPUs, so just 320 SPUs would be 83mm^2; but we know it is customized and not an off-the-shelf GPU, so I am sure there are more SPUs, and we know the GPU is produced on TSMC's mature 40nm Advanced CMOS process, so performance is higher and power consumption lower.
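
The area arithmetic here is plain proportional scaling; it ignores fixed-function blocks that don't shrink with SPU count, so treat it as the rough upper bound it is:

# Proportional die-area scaling, as used above. This ignores fixed-function
# blocks that do not shrink with SPU count, so it is only a rough bound.
hd6850_area_mm2 = 250.0
hd6850_spus = 960
wiiu_logic_mm2 = 100.0   # 146 mm^2 die minus the eDRAM blocks (estimate)

mm2_for_320 = hd6850_area_mm2 * 320 / hd6850_spus            # ~83 mm^2
spus_in_budget = hd6850_spus * wiiu_logic_mm2 / hd6850_area_mm2
print(f"320 SPUs need ~{mm2_for_320:.0f} mm^2; "
      f"~{spus_in_budget:.0f} SPUs would fit in {wiiu_logic_mm2:.0f} mm^2")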

 

Gah, I won't argue with you.




#247631 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 23 September 2013 - 07:52 AM in Wii U Hardware

So you're saying the October update unlocks some unused part of the Wii U that suddenly triples the flops and makes it more able to compete with the Xbone?

Surely not; why would Nintendo let everyone think the Wii U was a low-power console that's 'just' powerful enough for nearly a year of its life?

 

I don't know; Nintendo has a tendency to lock some things and then unlock them later, as on the DS and 3DS.




#247626 Wii U Summer/Fall Update

Posted by Eye_Of-Core on 23 September 2013 - 07:27 AM in Wii U Hardware

I did some research, posted it on this forum, and will most likely make a new thread with updated information.

 

Evidence suggests that Nintendo is going to allow/unlock shaders on the Wii U that were detectable in Wii mode via hacks. That is, currently 3rd-party developers can access 320 SPUs with 352 GFLOPS of performance, while the September/October update should give full access to 960 SPUs and 1056 GFLOPS, or 1.05 TFLOPS, of performance, and maybe push the Wii U's power consumption higher, e.g. from 40 watts to 70 watts; if the 40-watt figure was measured with those shaders inactive, then we can expect higher power consumption.

 

About the Wii U's storage: it is flash-based, and twice to two-and-a-half times faster than USB 2.0 or the HDDs inside the Xbox 360 and PlayStation 3.

 

At least with the Wii U you can play your games the good old-school way: put a disc in and play, without installs. :)




#247153 Wii U hardware; bits and pieces connected together...

Posted by Eye_Of-Core on 20 September 2013 - 08:42 AM in Wii U Hardware

Shocking.

 

Yea... I know! Duh. *sarcasm* lol




#247137 Lets analyze and compare Bayonetta 2' gameplay

Posted by Eye_Of-Core on 20 September 2013 - 06:53 AM in Wii U Games and Software

That's actually a good one. It is Platinum's latest PS360 game, and of a similar genre.

I might start looking into it when I get home.

 

That would be great, thanks in advance.




#247135 Lets analyze and compare Bayonetta 2' gameplay

Posted by Eye_Of-Core on 20 September 2013 - 06:41 AM in Wii U Games and Software

Can you do a comparison between MGR:R and Bayonetta 2? Please...




#247123 Wii U hardware; bits and pieces connected together...

Posted by Eye_Of-Core on 20 September 2013 - 05:34 AM in Wii U Hardware

There is a thread on this forum comparing Bayonetta to Bayonetta 2, started by 3Dude, and it is very well done; a good read.

http://thewiiu.com/t...tta-2-gameplay/

 

Thanks for the link :)

 

The Wii U trashes the 7th generation; it's not even fair... We seriously need to move on.




