

esrever's Content

There have been 20 items by esrever (Search limited from 27-September 23)



#139817 PotatoHog's daily lesson on the Wii U CPU/GPU situation

Posted by esrever on 28 November 2012 - 02:53 PM in Wii U Hardware

Is it just me, or does it really sound like some random guy went to the POWER7 Wikipedia page, took the maximum amount of eDRAM he could find, and posted it on NeoGAF? (Please remember that the 32MB eDRAM rumor appeared alongside the POWER7 CPU bonanza.)

The quote was for the other guy, who didn't know that when IBM uses eDRAM in a CPU, it serves as an L3 cache. It has nothing to do with the amount in the Wii U. The CPU's eDRAM is probably less than 4 MB.

It's not calculations, it's speculation... If it's slower than the PS3 and 360, then it's lower than 30 GFLOPS, and I am being very generous... New articles keep popping up saying that the Wii U CPU runs at 1.2 GHz... ha, that estimate is even worse than mine.

Look at this article, very interesting: http://www.eurogamer...-wii-u-face-off

You can even see the FPS meter compared across the three versions.

The 1.2 GHz rumor doesn't make any sense. The face-off covers unoptimized code with a lot of other limitations in place; you can't just speculate about the FLOPS you can get out of it.
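For what it's worth, here is how a peak-FLOPS figure is actually derived. Every parameter below is a hypothetical placeholder, since none of the Wii U's were public at the time; a minimal Python sketch:

# Theoretical peak = cores x clock x FLOPs issued per cycle per core.
# All numbers here are made-up placeholders, not Wii U specs.
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

print(peak_gflops(3, 1.2, 2))   # 7.2 GFLOPS: 3 cores, the rumored 1.2 GHz, 2 FLOPs/cycle
print(peak_gflops(3, 1.2, 4))   # 14.4 GFLOPS if each core had a paired-single SIMD unit

Until the clock speed and the FPU width are known, the spread between plausible answers is already 2x.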



#139812 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by esrever on 28 November 2012 - 02:45 PM in Wii U Hardware

No, YOU didn't hear about the PS3's difficulties until later on down the road. Devs were whining about the horribly inefficient CPUs in the PS3 AND the 360 (in-order, deeply pipelined, crap branch prediction, with a 500-cycle miss penalty) all the way up until the point the Wii was announced.

And the system draws 75 watts from the wall; depending on the efficiency rating of the PSU, we are probably looking at a max draw of 60 watts. The system draws 33 watts when it's not doing anything, and 35 watts when playing NSMBU (LOL). I'd hardly consider that putting the system under an average load. You DO know every system ever made draws more power when the game is more demanding, or do you SERIOUSLY think the Wii draws the same power playing NSMB vs Skyward Sword, or that a 360 uses the same power playing Geometry Wars vs Halo 4? Your logic is extremely poor.

Ha ha. Are you serious? A 1.8x optimization gain is a serious challenge on a new system?

Shin'en: 'For instance, with only some tiny changes we were able to optimize certain heavy load parts of the rendering pipeline to 6x of the original speed, and that was even without using any of the extra cores.'

A 6x speedup, on one core. With 3 cores on parallelizable code, that would be 18x.

Audiokinetic Wwise middleware, 2012 v2 (the first optimizations made for Wii U support):

Wii U™: 'Audio on Wiimote and rumble on Wiimote and DRC is now supported. Delay, Peak Limiter and RoomVerb have been optimized by 2.3x to 4.4x on Wii U.'

A 2.3x to 4.4x performance increase. (This is for porting games from the PS3/360 using the Wwise middleware, so they are using the CPU and not the audio DSP.)

http://www.audiokine...news/227-2012-2

Those optimizations were for broken code. In-order pipelines were commonplace for anyone coding for embedded systems, consoles, or mobiles; the complaints were mostly about having to go through the PPU to get anything working on an SPU. You seem to be throwing around things you don't understand. That is all I'm going to say.
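There is also a basic reason the "6x on one part means 6x overall" math doesn't work: Amdahl's law. A rough sketch, with made-up fractions, of what speeding up one "heavy load part" of a rendering pipeline does to the whole frame:

# Amdahl's law: overall speedup = 1 / ((1 - p) + p / s), where p is the
# fraction of total time sped up and s is the local speedup.
# Both inputs below are illustrative placeholders.
def overall_speedup(p, s):
    return 1.0 / ((1.0 - p) + p / s)

print(overall_speedup(0.30, 6.0))   # ~1.33x overall if the optimized part was 30% of frame time
print(overall_speedup(0.90, 6.0))   # ~4.0x overall even if it was 90% of frame time

A 6x gain in one section is real, but it never multiplies out to a 6x (let alone 18x) program.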



#139808 Wii U GPGPU?

Posted by esrever on 28 November 2012 - 02:40 PM in Wii U Hardware

The 360 CPU wasn't bad. It was designed for decent floating-point performance, which is what you needed for gaming. The shared cache and such was where the biggest problem lay, but it was still the most efficient console CPU of this generation. The CPU in the 360 is about as powerful as a modern Atom for general-purpose code, but that is still pretty good for what it needed to do, which was heavy floating point. Memory optimizations in software are what is used to bypass the inefficiencies.
The GPU/CPU memory access worked well enough with that implementation, since the GPU is really the only one accessing memory heavily in that situation.

The Wii U's CPU is much worse. From what I can gather, the ancient memory architecture along with very little fast cache makes it akin to a Pentium 3. It does have eDRAM to buffer main memory, but that is slower than the cache and can only be used as a high-level buffer; it doesn't make up for all the cache misses. Prefetching is also slowed down by the 12.8 GB/s main memory.

The eDRAM in the GPU would be used for frame buffers, which allows for AA. It can do higher resolutions and higher AA levels than the 360. For GPGPU, it is very unlikely that the GPU eDRAM is used, because the data lives in system RAM, and accessing that would still have to go through the CPU.



#136998 PotatoHog's daily lesson on the Wii U CPU/GPU situation

Posted by esrever on 23 November 2012 - 03:13 PM in Wii U Hardware

This guy.

Sony drones are expecting some master-race system with games running at 4K resolution. There will not be a jump like with previous generations.

Sony themselves said that the PS4 aims for 1080p at 60 fps for all games.



#136912 Wii U GPGPU?

Posted by esrever on 23 November 2012 - 12:24 PM in Wii U Hardware

Judging by the size and the quote about DX10.1, the GPU is most likely a custom 4670 die with eDRAM.
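If that rumor is right, back-of-the-envelope shader throughput follows from the desktop 4670's published specs (320 stream processors, each doing a 2-FLOP multiply-add per cycle at 750 MHz); the lower console clock below is purely my guess:

# Peak shader GFLOPS = stream processors x clock x 2 (multiply-add).
def shader_gflops(stream_processors, clock_mhz, flops_per_sp=2):
    return stream_processors * (clock_mhz / 1000.0) * flops_per_sp

print(shader_gflops(320, 750))   # 480 GFLOPS: desktop Radeon HD 4670
print(shader_gflops(320, 550))   # 352 GFLOPS: hypothetical lower console clock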



#136910 PotatoHog's daily lesson on the Wii U CPU/GPU situation

Posted by esrever on 23 November 2012 - 12:22 PM in Wii U Hardware

The A10 is about as fast as the Wii U's GPU, assuming they re-engineered the memory interface, which they would have to do for shared memory on a console.



#136613 If the Wii U gamepad was its own system, would you still buy it?

Posted by esrever on 23 November 2012 - 12:30 AM in Wii U Hardware

No. It would literally be a Vita with Nintendo stamped on it.



#136608 THQ clarifies 4A's comment on CPU

Posted by esrever on 23 November 2012 - 12:15 AM in Wii U Hardware

You could pretty much deduce the same thing from yesterday's comment; this is just that with PR fluff added.



#136606 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by esrever on 23 November 2012 - 12:13 AM in Wii U Hardware

So after watching Shokio's latest video concerning the CPU dilemma, he did raise a very good point about the Wii U's GPGPU.

Here is the Wikipedia definition:
"General-purpose computing on graphics processing units (GPGPU, GPGP or less often GP²U) is the utilization of a graphics processing unit (GPU), which typically handles computation only for computer graphics, to perform computation in applications traditionally handled by the central processing unit (CPU)."

It seems as though the Wii U's GPGPU can compensate by taking on tasks a CPU would normally handle. And from what we know, the Wii U's GPU has been praised... I think; correct me on that one.

Yes, the GPU is the strongest part of the system. GPGPU will allow physics and effects to be calculated on the GPU. Things like pathfinding and AI will still need to be on the CPU. It's definitely something that can be implemented to ease the load on the CPU.
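The reason physics fits GPGPU is that it applies the same arithmetic independently to thousands of elements. A rough sketch of that shape of work, using NumPy as a stand-in for a GPU kernel (the Wii U's actual compute API isn't public, so this only illustrates the idea):

import numpy as np

# One gravity rule applied to every particle at once: data-parallel,
# no per-element branching, which is exactly what a GPU is good at.
n = 100_000
pos = np.zeros((n, 3), dtype=np.float32)
vel = np.random.randn(n, 3).astype(np.float32)
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)

def step(pos, vel, dt=1.0 / 60.0):
    vel += gravity * dt
    pos += vel * dt
    return pos, vel

pos, vel = step(pos, vel)

AI and pathfinding, by contrast, branch differently per agent, which is why they stay on the CPU.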



#136600 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by esrever on 23 November 2012 - 12:10 AM in Wii U Hardware

You are confusing L1 with L2 (both are within the processor core in this design; there is no separate eDRAM die, it's all IN the processor core, that's what makes this IBM's NEW RAM technology, READ THE DOCUMENTATION). The L2 buffers main memory so the L1 doesn't get caught waiting for slow main memory to become available. Again, it's called a memory hierarchy.

There is nothing hot about POWER7; it's one of the most thermally and power-savvy designs ever made (which is a real surprise after the break-up of the Power Mac line). It has fantastic performance per watt which SCALES LINEARLY. For the love of cheese, WILL YOU READ THE DAMN DOCUMENTATION? You don't even have to read past page 1; how lazy can you be?


POWER7 was designed as a high-performance, low-power, low-thermal-envelope processor. They fit 8 cores into the same power, thermal, and socket budget as the dual-core (that means 2) POWER6. It's 1/4 as hot (per core), takes 1/4 the power, is clocked over a GHz slower, and delivers over twice the performance.

The way they accomplished this was with a new eDRAM technology, as stated by IBM in the official documentation I posted, which you are refusing to read; the same RAM technology IBM confirmed when they announced they were making, quote, 'an all-new' CPU for the Wii U, in the press release also previously linked by me. This is why it was said to share the same technology as IBM's Watson. The eDRAM technology is what made POWER7 possible at such a small size per processor, with a low thermal envelope per processor, at a lower clock speed than POWER6.

POWER7 chips are NOT 200-watt monsters. You are talking about the POWER7 server PACKAGE, which contains 8 POWER7 processors, two MASSIVE memory controllers, 32 MB of L3 cache, and a HUGE I/O strip. It's in the documentation, IF YOU WOULD READ IT. Also in the documentation are smaller POWER7 packages for custom customers, including a half-socket 1-core or up-to-4-core reduced-pin package, as stated in the documentation. It's a very adaptable processor; IBM will get a lot of mileage out of it and its derivatives.

POWER7, or processors based on its technology, don't have to be large, hot, or even powerful. It simply gets fantastic performance per watt. If you only give it a few watts, it outperforms processors in that power range, because it gets better performance per watt.

Seriously, read the documentation.

IBM denied the Watson comment.

http://en.wikipedia.org/wiki/POWER7
  • 45 nm SOI process, 567 mm²
  • 1.2 billion transistors
  • 3.0 – 4.25 GHz clock speed
  • max 4 chips per quad-chip module
  • 4, 6 or 8 cores per chip
  • 4 SMT threads per core (available in AIX 6.1 TL05, released April 2010, and above)
  • 12 execution units per core:
    • 2 fixed-point units
    • 2 load/store units
    • 4 double-precision floating-point units
    • 1 vector unit supporting VSX
    • 1 decimal floating-point unit
    • 1 branch unit
    • 1 condition register unit
  • 32+32 kB L1 instruction and data cache (per core)
  • 256 kB L2 cache (per core)
  • 4 MB L3 cache per core, up to a maximum of 32 MB. The cache is implemented in eDRAM, which does not require as many transistors per cell as standard SRAM, allowing a larger cache in the same area.

Any reason why the POWER7 chip is 17x the size of the Wii U's CPU?

Please understand that in the POWER7, the eDRAM is an L3 which buffers main memory, and the L2 buffers the L3. Given the size of the Wii U's CPU, there would be very little L2, if any at all. Please do some basic research.

The documentation you linked didn't really say anything.

Thanks for the explanation (again). Now, even though I probably won't understand what's being said, is there a link to the documentation you mention that could serve as a resource?

There's more misinformation being spread about the Wii U's hardware than not. It's great to have someone who understands why Nintendo went the unconventional route with the Wii U and the potential advantages.

For now, all we know is that the chip is about 1/3 the size of the 360's CPU on the same manufacturing process. No other documentation has been released; everything else is either rumor or marketing hype as far as I'm concerned. "The same processor technology found in Watson" could mean just about anything; it could mean that both use electricity, for all the good it does. There is definitely some eDRAM (not sure how much), which is just a type of memory very close to the CPU core, usually used to buffer main memory below the CPU cache, or in special fixed-logic functions (very unlikely for a CPU). eDRAM allows for more cache space, and in large quantities it is denser than SRAM and can even be faster. This makes for a larger and faster L3 in designs such as the POWER7.

The main advantages of the Wii U over the current consoles are the GPU and the large memory bank. The GPU is expected to have about 2x the shader performance of the 360's, as well as more modern shaders. There is most likely 32 MB of eDRAM on the GPU, compared to the 10 MB in the 360. This is just rumor, but it's very likely, as it allows for 4 frames of 1080p rendering or 8 frames of 720p; that kind of buffer allows fast reads, writes, and touch-ups, though it won't be able to store textures. It makes basic anti-aliasing much less demanding. The 1 GB of usable system memory will allow decent textures and shorter loading times.
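The frame-buffer arithmetic behind that rumor checks out, assuming a plain 32-bit (4 bytes per pixel) color buffer and no AA; a quick sketch:

# One frame = width x height x bytes per pixel.
def frame_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

print(frame_mib(1920, 1080))      # ~7.9 MiB, so 4 frames is ~31.6 MiB
print(frame_mib(1280, 720))       # ~3.5 MiB, so 8 frames is ~28.1 MiB
print(frame_mib(1280, 720) * 4)   # ~14.1 MiB for one 4x-sampled 720p buffer

That last line is why the eDRAM budget, more than shader power, tends to decide how much AA a console can afford.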



#136596 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by esrever on 22 November 2012 - 11:41 PM in Wii U Hardware

I guess you didn't know? The Wii U CPU doesn't run the OS and apps like the browser; its ARM coprocessor does. You are comparing an ARM to an ARM, not to a Power derivative.

But seriously, dude, even at 24nm a sub-watt processor for a battery-powered device is NOT going to outmuscle a wall-powered device drawing 15 watts unless there is a huge gap in time. That's just reality, man.

Like the dedicated sound chip, Nintendo's system design removes all unnecessary loads from the CPU.

Although, 180% is nothing. Audiokinetic achieved performance increases of up to 4.4x from optimizations to their Wwise 2012 v2 middleware.

The whole system draws 35 W, and the GPU is more than 5x the size of the CPU; therefore the CPU should draw about 5 watts. Where is it written that the Wii U has an ARM core for the OS? I don't think I read that in any released information, unless it's just rumor. A 180% increase over working software is a great challenge; the software would have to be in alpha to see such improvements.
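To show where a "CPU draws about 5 W" figure comes from, here is the division spelled out; the PSU efficiency, the non-chip overhead, and the area ratio are all assumptions on my part, not measurements:

# Split the measured wall draw into a rough per-chip power budget.
wall_watts = 35.0
psu_efficiency = 0.80                   # assumed
dc_watts = wall_watts * psu_efficiency  # ~28 W actually reaching the board

other_watts = 10.0                      # assumed: RAM, disc drive, WiFi, gamepad radio
chip_watts = dc_watts - other_watts     # ~18 W left for CPU + GPU
gpu_to_cpu_area = 5.0                   # "GPU is more than 5x the size of the CPU"

cpu_watts = chip_watts / (1 + gpu_to_cpu_area)
print(round(cpu_watts, 1))              # ~3 W, the same ballpark as the 5 W guess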

I haven't been following, but I can probably assume correctly from the last post that this thread has degraded into a bitter war over the small things.

To put it in the most basic way possible, the most BASIC way: there are developers out there who are finding the CPU underwhelming. It may or may not be the case that this will affect certain game production for the Wii U; the worst-case scenario would be particular games not reaching the Wii U because of it. That's not to say the Wii U is a flop, just that different development methods need to be adopted to make things work. Remember that not every developer is complaining about this, nor are they predicting the Wii U's doom because of it alone.

Developers aren't usually ones to talk much about things like this; it took a long time before it was common knowledge how hard the PS3 was to program for. It's not all doom and gloom, since Nintendo will most likely deliver good first-party games, and there will probably be good 360 and PS3 ports eventually.



#136593 PotatoHog's daily lesson on the Wii U CPU/GPU situation

Posted by esrever on 22 November 2012 - 11:34 PM in Wii U Hardware

Interesting food for thought. I'd like to revisit the topic of ports when the reverse happens. Let's see what happens when a game is made from the ground up on the Wii U and then ported over to the PS3/X360. If the Wii U is no more powerful than the X360/PS3, then the differences should be pretty insignificant.

The differences would probably be in resolution and textures, but generally the gameplay would be the same and the differences wouldn't be extreme. It would be like comparing an early PS3 and 360 port. If the CPU is up to the task, the GPU load can easily be lowered to compensate.



#136413 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by esrever on 22 November 2012 - 03:08 PM in Wii U Hardware

Given how the PS3 performed, I can't blame Nintendo. You've made a lot of assumptions to this point, and now you're saying that since the Wii U is backward compatible with Wii games, the CPU must be based on the same architecture. However, according to IBM, the POWER processors include the PowerPC instruction set, which would explain how the Wii U is able to run Wii games.

In fact, IBM has said many times that the Wii U's CPU is Power-based, not PowerPC.

Power-based includes PowerPC, as well as all the other Power derivatives. We know it's not POWER7-based, since IBM denied that. I have made a lot of assumptions, but given the situation they're more likely than not. There is really no reason for IBM to develop a custom POWER CPU here, as the extra bits of that ISA aren't useful: PowerPC contains all the instructions you'd need in a console, while POWER mostly adds memory management and security features.



#136407 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by esrever on 22 November 2012 - 03:00 PM in Wii U Hardware

Ha ha, oh wow. A sub-watt ARMv7 processor derivative.

Some of you guys actually believe this, don't you?

Wow.

A 32nm chip vs a 45nm chip, with a modern and efficient ISA and caching. The chip itself is simply much more modern.

This is the only possible benchmark so far. Software can increase efficiency, but the Wii U would need a 180% increase in performance to match the iPhone 5.



#136404 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by esrever on 22 November 2012 - 02:52 PM in Wii U Hardware

The part I highlighted is of interest because isn't that the same issue facing the Wii U?

Lastly, it's true that both chipsets are made by IBM, but separated by at least 7 years. It's hard to believe that the Wii U's CPU designed by IBM is outperformed by a 7-year old CPU from the same company. However, it definitely makes sense that software optimized for an older architecture doesn't run very well on a newer architecture that is completely different in design and philosophy. So obviously, software itself can make a huge difference in how the hardware performs as much as the hardware itself (as you mentioned).

The issue here is that the CPU doesn't actually have many resources. There are definitely optimization gains to be had, but not to the point where the CPU would suddenly become more powerful than the 360's. Judging by the size of the chip, it's very likely that Nintendo paid as little as possible for a cheap but sufficient CPU for current games while keeping full backward compatibility. It is most likely based on the Wii CPU, because the die is too small for indirect emulation.

It would cost Nintendo a lot of money to pay for a completely new design from IBM, like Sony paid for the Cell. IBM has completely forgone hardware development for small systems such as PCs and consoles in recent years; they focus on large, hot, powerful server chips. The POWER7 chips are 200-watt parts that would never work in a console and would be very hard to scale down well. The CPU most likely implements the same Power ISA as the Wii's, but with more modern instruction decode and caching, so it probably isn't something completely modern.


It doesn't make up for the slow main RAM; it's called a memory hierarchy, look it up. Having faster main RAM would be absolutely pointless and a waste of money, because no RAM in EXISTENCE can hope to match the bandwidth and low latency of eDRAM performing as 6T-SRAM. What the eDRAM DOESN'T have is capacity, which is where the main RAM comes in. It's a BUCKET, not performance RAM.

And yes, that is EXACTLY what it does for performance. Or did you miss the OFFICIAL IBM DOCUMENTATION attached to my post that says it WORD FOR WORD?

The eDRAM is a buffer, just like the cache; the cache is faster, so it sits at a higher level. The eDRAM is only there to buffer main memory, and that is how it would be used.
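What "buffering main memory" buys you can be put in numbers with the standard average-memory-access-time formula; all the latencies and miss rates below are illustrative assumptions, not Wii U specs:

# AMAT = hit time + miss rate x time at the next level down.
def amat(hit_ns, miss_rate, next_level_ns):
    return hit_ns + miss_rate * next_level_ns

main_mem_ns = 100.0                            # assumed main memory latency
edram_level = amat(20.0, 0.30, main_mem_ns)    # eDRAM in front of main memory
with_edram = amat(2.0, 0.10, edram_level)      # SRAM cache in front of the eDRAM
without_edram = amat(2.0, 0.10, main_mem_ns)   # cache straight to main memory

print(with_edram)     # 7.0 ns average as seen by the core
print(without_edram)  # 12.0 ns without the eDRAM layer

The eDRAM never needs to be as fast as the cache; it only needs to be much faster than main memory to pull the average down.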



#136392 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by esrever on 22 November 2012 - 02:35 PM in Wii U Hardware

Agreed. Surely they must have tested the machine in different configurations while developing it? Or possibly looked at what Microsoft and Sony had done with the 360 and PS3? That said, Nintendo must know how to make consoles, they've been doing it long enough; they must know that one weak component can cause all sorts of woe for a system. But I suppose until we know the actual specs of the CPU, we don't know quite how weak it is or isn't?

Nintendo wanted backward compatibility and a cheap CPU. The CPU inside an iPhone 5 is almost definitely faster than the one inside the Wii U. Nintendo went with the console that cost the least to make while still doing 720p HD gaming.



#136294 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by esrever on 22 November 2012 - 12:14 PM in Wii U Hardware

It's not underpowered... AMD makes graphics cards that can boost PC performance in terms of raw power; yes, a GPU that doubles as a CPU. I'm sure that technology is inside the Wii U. Take a look at the insides: the GPU and CPU are on one chip, which basically makes processing incredibly fast compared to having to go from CPU to motherboard to bridge to graphics card to GPU. Thus it will not be underpowered by any means.

This can be done, but it is very limited; only some things can run on the GPU portion. Game engines have to be substantially reworked to use it, so I wouldn't count on 3rd-party devs using it much.



#136268 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by esrever on 22 November 2012 - 12:01 PM in Wii U Hardware

THANK YOU. I actually GAVE somebody the CPU size and transistor density information yesterday, hoping they had the knowledge to make a decent argument with it, and they COMPLETELY ignored it and continued circle-jerking baseless hearsay.

Nintendo's not keeping TOO many secrets this time. And what little they are keeping was given away a long time ago by partners like IBM.

So, we KNOW Nintendo's CPU is transistor-starved because of its size, right? Physics dictates you can only fit so many objects of the same size into a given space. So HOW are devs who are actually MAKING Wii U games getting the CPU performance out of the system that they are? Seriously, look at The Wonderful 101, that is a LOT. Nano Assault is only using one core and has thousands of characters and objects on screen simultaneously. HOW?

Clock speed, and even core count, is no longer the main impediment to performance (thanks to Amdahl, we can't really gain much more no matter HOW many cores we add); RAM latency is.

The Wii U CPU has embedded RAM. But not just any embedded RAM: as shown in IBM's original press release, it uses IBM's newest embedded RAM technology. NO, it's not L3, that's something else. It's the L1.

http://www-03.ibm.co...lease/34683.wss

'The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example'

Anybody can make eDRAM; only IBM can license this stuff. It's eDRAM that's bussed six ways per cell, meaning each cell performs like six-transistor SRAM (6T-SRAM; remember the Cube's blazing-fast 1T-SRAM back in the day? Yeah, 6T-SRAM). It's the technology that made POWER7 possible. POWER7 has only HALF the transistors of an i7, yet it doesn't just compete with the i7, it beats it, and this eDRAM is the reason why. It allows the transistors to effectively perform as over 2x their number. POWER7 has only 1.2 billion transistors (only, lol), but the unique eDRAM gives it the equivalent performance of a 2.7-billion-transistor chip. THAT'S how Nintendo managed such a small CPU.

Read it in the official POWER7 documentation I attached, which details IBM's (then) new and unique eDRAM technology.

From what I understand, the embedded RAM inside the Wii U CPU is there to make up for the slow memory as well as the lack of much cache. Although IBM's integration of RAM onto the CPU die allows for more performance, it does not allow anywhere near that much more performance per transistor. We have very little information on the architecture inside the Wii U, but it should be a PowerPC derivative like the 360's. It is very unlikely that the chip can perform better than the 360's given the die size limitations, even with the embedded RAM, because the RAM itself takes transistors and cannot be packed as densely as CPU logic or cache transistors; however much embedded RAM is in the chip therefore costs die area as well. It ends up roughly evening out from a chip-design perspective. Embedded RAM makes sense in servers, which need a lot of cache and buffering of main memory; its performance benefit in a console probably isn't significant.

If the third die is embedded RAM, then we are probably looking at <4 MB of RAM there just from the area. It could allow for optimizations, but it is not a complete game-changer.
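Here is the kind of area arithmetic behind a "<4 MB" estimate. IBM's published 45nm eDRAM cell is roughly 0.067 µm²; the 50% array efficiency (sense amps, decoders, redundancy) and the die areas are assumptions of mine:

# Capacity = usable area / cell size, converted to megabytes.
def edram_mbytes(area_mm2, cell_um2=0.067, array_efficiency=0.5):
    bits = area_mm2 * 1e6 * array_efficiency / cell_um2
    return bits / 8 / 2**20

print(round(edram_mbytes(3.0), 1))   # ~2.7 MB from a 3 mm^2 block
print(round(edram_mbytes(5.0), 1))   # ~4.4 MB from a 5 mm^2 block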

I've never heard of transistor count being a barometer of the performance of current CPUs, and to my knowledge no one knows the transistor count of the Wii U's CPU. Aren't you jumping the gun a bit by stating that the X360 has more? Secondly, I was looking at the list of current CPUs and noticed that AMD has a few desktop CPUs with way more transistors than Intel's Core i7, but the i7 still mops the floor with them in most benchmarks.

With that being said, a higher transistor count in itself can't reliably be used when comparing CPUs, right?

I'm no hardware expert, so my only ally is common sense.

It is true that transistor count isn't an exact way to judge performance, but it is a very good way to estimate it.

http://en.wikipedia.org/wiki/Transistor_count
You can see that newer chips have higher transistor counts, and if you go and check performance, it scales up pretty much accordingly. It is a rough estimate, but it's a good guess.

AMD's CPUs have a few flaws that make it very hard to extract the performance of their transistors. The peak performance of AMD's current CPUs is as much as, if not more than, Intel's for the given transistor budget, but software constraints, and the fact that most software is not optimized for AMD's platforms, result in a bad showing for those chips. Comparing chips from different companies is also much harder, as they do things in different ways. This is why the 360's CPU is such a good candidate for comparison: it was designed by IBM, as was the Wii U's CPU. That makes it a better comparison than Intel vs AMD.

Higher transistor count alone can't be 100% accurate, but it should estimate performance within a certain range. Task-specific optimizations often skew performance as well, but they don't change the picture completely. For a general-purpose processor, the transistor budget is one of the most important parts of designing the chip to hit its performance requirements.



#135938 Wii U 'Has A Horrible, Slow CPU' Says Metro Last Light Dev

Posted by esrever on 22 November 2012 - 01:35 AM in Wii U Hardware

4A doesn't have the money to port their game. The CPU is bad, but it probably isn't going to stop current-gen games from running well. That will change when the next gen comes.



#135937 Slow CPU Will Shorten Wii U's Life - DICE Dev

Posted by esrever on 22 November 2012 - 01:28 AM in Wii U Hardware

Whenever I hear this, I keep asking "Why doesn't anyone step forward and explain why this is the case?"  They say the Wii U CPU is underpowered, but underpowered compared to what?  I'm still waiting on a technical breakdown of the Wii U with an explanation of why they think the way they do.  Show us where Nintendo fell short with the CPU in relation to what's needed in developing today's console games so we can have a meaningful discussion.

Also, look at the first thing this guy says: "I don't actually know what makes it slow."

At that point, he should shut up.  If HE doesn't know then all he is doing is gossiping.  Correct?

I will explain some basics. These are rough estimations.

Inside any CPU or digital device are transistors, which make all the logic happen; generally, the more transistors you can fit into a CPU, the more performance it has, provided you use them sensibly. At any given fabrication process, you can only fit so many transistors into an area; this is called the transistor density. The Wii U CPU uses a 45nm fabrication process, very similar to the way the CPU in the 360 Slim is made. The smaller the process, the more transistors fit into the same area; current Intel CPUs are made on a 22nm process. Historically, as Moore's Law has doubled transistor counts, performance has increased almost linearly with the number of transistors you can put in.

The Wii U CPU is 32.76 mm² as measured by AnandTech, and the Xbox 360 CPU is now about 88 mm² on the same process as the Wii U's, so transistor density should be about the same. The 360's CPU is 2.7 times the size of the Wii U's, meaning there are probably more than 2x the transistors inside it. This does not mean that the Wii U's CPU has only 50% of the 360's performance, but it is a good indication that the performance is significantly lower. This estimate could be off by as much as 50%, but even then the CPU would be less powerful than the 360's. So the Wii U's CPU will be slower at general processing tasks: AI, physics, and memory access. I would guess the Wii U's CPU has only about 80% of the 360's performance in gaming. This makes it very hard to port a game without a lot of work; the good news is that the CPU isn't the limiting factor in the 360 most of the time, so it won't show up too much if the developers port well.
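The arithmetic in that paragraph, spelled out; the die sizes are the measured figures quoted above, and equating the area ratio with the transistor ratio only works because both chips are on the same 45nm process:

# Same process node, so die area stands in for transistor count.
wii_u_cpu_mm2 = 32.76    # AnandTech measurement
xenon_45nm_mm2 = 88.0    # Xbox 360 Slim CPU, same 45nm process

ratio = xenon_45nm_mm2 / wii_u_cpu_mm2
print(round(ratio, 2))   # ~2.69, i.e. roughly 2.7x the transistor budget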

The GPU inside the Wii U is about 2x as fast as the 360's. This will allow the Wii U to do games with almost 2x the graphical detail. GPGPU can't easily be added to existing game engines; it is very different from coding for just a CPU and a GPU, and it will take some time to get used to. The 360 also has GPGPU functions. It will not solve all the processing limitations in the long run.

Overall, the balance isn't great. Nintendo's console will probably be able to play current-gen games, but just that. Against the next generation of consoles, the Wii U will be like the Wii. I wouldn't be surprised if a Wii U emulator for the PC appears by the time the Nextbox and PS4 come out.

These are all my own estimates, from my experience as someone with a little CPU design experience. The overall result shouldn't be too far off, but it could still have a pretty large margin of error, as Nintendo is keeping all the details secret.



