Fundamental Differences Between the Wii U and Current Gen Consoles


11 replies to this topic

#1 Goodtwin


    Bullet Bill

  • Members
  • 356 posts

Posted 27 October 2013 - 07:28 PM

http://www.eurogamer...r-game-creators

 

This is a great article that really gives gamers some insight into what the new hardware really means, and into what developers did last generation to get the results they were able to achieve.  I will highlight a few areas that I thought really showcased some fundamental differences between current gen hardware and the Wii U.

 

All of the current-gen consoles have quite underpowered GPUs compared to PCs, so a lot of time and effort was spent trying to move tasks off the GPU and onto CPU (or SPUs in the case of PS3). This would then free up valuable time on the GPU to render the world. And I should point out that for all of Xbox 360's GPU advantage over PS3's RSX, at the end of the day, the balance of the hardware was still much the same at the global level. Occlusion culling, backface culling, shader patching, post-process effects - you've heard all about the process of moving graphics work from GPU to CPU on PlayStation 3, but the reality is that - yes - we did it on Xbox 360 too, despite its famously stronger graphics core.
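To picture what that offloading looks like, here is a minimal sketch of the kind of CPU-side backface culling the quote describes, so the GPU never sees triangles facing away from the camera. The names are illustrative, not anyone's actual engine code:

#include <cstddef>
#include <cstdint>
#include <vector>

// Minimal 3-component vector with just the operations culling needs.
struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Walk an indexed triangle list on the CPU and emit only the triangles that
// face the camera. 'eye' is the camera position in the same space as the
// vertices; back-facing triangles never reach the GPU at all.
std::vector<std::uint16_t> cullBackfaces(const std::vector<Vec3>& verts,
                                         const std::vector<std::uint16_t>& indices,
                                         const Vec3& eye) {
    std::vector<std::uint16_t> visible;
    visible.reserve(indices.size());
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3) {
        const Vec3& a = verts[indices[i]];
        const Vec3& b = verts[indices[i + 1]];
        const Vec3& c = verts[indices[i + 2]];
        Vec3 normal = cross(b - a, c - a);   // face normal from winding order
        if (dot(normal, eye - a) > 0.0f) {   // facing the camera: keep it
            visible.insert(visible.end(), {indices[i], indices[i + 1], indices[i + 2]});
        }
    }
    return visible;
}

int main() {
    std::vector<Vec3> verts = {{0, 0, 0}, {1, 0, 0}, {0, 1, 0}};
    std::vector<std::uint16_t> idx = {0, 1, 2};
    auto vis = cullBackfaces(verts, idx, {0, 0, 1});  // camera in front: kept
    return vis.size() == 3 ? 0 : 1;
}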

 

 

This previous generation was kind of an odd one.  The CPUs happened to be much better at working in tandem with the GPU on graphics calculations than they were at traditional CPU tasks.  Developers leaned very heavily on the FLOPS performance of the CPU to create better looking graphics.  This is most likely why a developer like Frozenbyte immediately had better results on the Wii U than on current gen systems: Trine 2 runs almost exclusively on the GPU and asks very little of the CPU.  A developer who has been offloading as much work as possible from the GPU to the CPU for the past 5-7 years, on the other hand, may not find Wii U development a walk in the park.

The Wii U's CPU is great at traditional CPU tasks, much better than the 360 and PS3, but that hasn't really mattered, because developers haven't pushed the boundaries in those areas over this previous generation.  They were using the CPU to aid the GPU in delivering the best possible graphics while the simulation aspects of games sat mostly idle, aspects the Wii U's CPU could handle much better.  Asking the Wii U's CPU to do a bunch of graphics processing is not something it's going to excel at, and that is probably another reason third party ports have been sketchy over the past year.  On Wii U, developers are going to have to go back to square one: treat the CPU like a CPU and the GPU like a GPU, and task each appropriately.  If you happen to have a CPU intensive game and aren't pushing the envelope in the graphics department, then perhaps learning to use the GPGPU capabilities of the GPU is in order.
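As a rough illustration of that "CPU does simulation, GPU does rendering" split, here is a minimal single-frame sketch. The subsystem names are hypothetical stand-ins, not any real engine's API:

#include <cstdio>
#include <thread>

// Stub subsystems; stand-ins for whatever a real engine exposes.
void updatePhysics(float dt) { std::printf("physics step %.4f\n", dt); }
void updateAI(float dt)      { std::printf("AI step %.4f\n", dt); }
void buildRenderCommands()   { std::printf("build render commands\n"); }
void submitToGpu()           { std::printf("submit to GPU\n"); }

// One frame organized the "traditional" way described above: simulation
// stays on CPU cores, and the GPU is fed rendering work and nothing else.
void runFrame(float dt) {
    std::thread physics(updatePhysics, dt);  // CPU core: branchy physics
    std::thread ai(updateAI, dt);            // CPU core: branchy AI
    buildRenderCommands();                   // main thread: graphics setup
    physics.join();
    ai.join();
    submitToGpu();                           // GPU: rendering only
}

int main() { runFrame(1.0f / 60.0f); }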

 

Although these GPUs are not as fast on paper as the top PC cards, we do get some benefit from being able to talk directly to the GPUs with ultra-quick interconnects. But in this console generation it appears that the CPUs haven't kept pace. While they are faster than the previous generation, they are not an order of magnitude faster, which means that we might have to make compromises again in the game design to maintain frame-rate.

 

 

Some members have said it before and I am starting to agree with them: porting a PS4 or X1 game might actually be easier than porting a 360/PS3 game.  These new consoles deliver far more powerful GPUs than current gen, but the CPUs have not advanced anywhere near the same amount.  So on these new consoles you're going to see developers move away from tasking the CPU with graphics work like they did the previous generation, and go back to a more traditional workload assignment.

 

To really get the most out of the Wii U, you need a fundamentally different approach than on the current gen consoles.  The Wii U may not be able to surpass the current gen consoles in the graphics department by a large margin; they were using both the CPU and GPU to get the results they achieved, something that just wouldn't work well with the Wii U architecture.  It's going to be in other areas over the next few years that we really see the Wii U outclass the current gen consoles by a large margin.



#2 Alex Atkin UK


    Boo

  • Members
  • 528 posts

Posted 27 October 2013 - 07:51 PM

That is also why I claimed a few times that "perhaps" some games on PS4/Xbone CAN'T be ported to Wii U.

 

For example, if they used 10% of those consoles' GPUs for GPGPU, that could potentially work out to as much as the entire Wii U GPU.
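As a rough back-of-envelope check: the PS4 figure below is Sony's announced 1.84 TFLOPS, while the Wii U figure was never disclosed by Nintendo and is only a widely circulated rumour (160 ALUs at 550 MHz, 2 ops per clock). On those assumptions, the numbers land in the same ballpark:

#include <cstdio>

int main() {
    const double ps4_gflops  = 1843.0;           // announced spec: 1.84 TFLOPS
    const double wiiu_gflops = 160 * 0.550 * 2;  // rumoured only: 176 GFLOPS
    std::printf("10%% of PS4 GPU:     %.0f GFLOPS\n", ps4_gflops * 0.10);
    std::printf("Rumoured Wii U GPU: %.0f GFLOPS\n", wiiu_gflops);
}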

 

Now, how likely it is that they would do that, I do not know, but it's certainly theoretically possible.

 

It's one thing to scale graphics down to fit a less powerful GPU, but once they start relying on GPGPU I doubt it will scale quite the same way, as AI, physics, etc. may be tied to how the whole game works.  It just wouldn't be worth hobbling the PC, PS4 and Xbox One versions of a game "just" to make it work on Wii U.  Of course, they MIGHT make a Wii U exclusive version from the ground up, but the above article suggests they would not want to do that.  Besides, the Wii multi-platform games which did that were largely panned for being completely different to the 360/PS3 versions.


Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#3 GAMER1984


    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 27 October 2013 - 08:32 PM

That is also why I claimed a few times that "perhaps" some games on PS4/Xbone CAN'T be ported to Wii U.
 
For example, if they used 10% of those consoles' GPUs for GPGPU, that could potentially work out to as much as the entire Wii U GPU.
 
Now, how likely it is that they would do that, I do not know, but it's certainly theoretically possible.
 
It's one thing to scale graphics down to fit a less powerful GPU, but once they start relying on GPGPU I doubt it will scale quite the same way, as AI, physics, etc. may be tied to how the whole game works.  It just wouldn't be worth hobbling the PC, PS4 and Xbox One versions of a game "just" to make it work on Wii U.  Of course, they MIGHT make a Wii U exclusive version from the ground up, but the above article suggests they would not want to do that.  Besides, the Wii multi-platform games which did that were largely panned for being completely different to the 360/PS3 versions.


The Wii situation and the Wii U situation are totally different. The Wii U can run anything the XB1 and PS4 can... Yes, some changes will have to be made, but not changes that make the game unplayable.

#4 Mewbot


    I'm batman

  • Members
  • 2,027 posts
  • NNID:R00bot
  • Fandom:
    Legend of Zelda and Super Smash Bros.

Posted 28 October 2013 - 12:35 AM

The Wii situation and the Wii U situation are totally different. The Wii U can run anything the XB1 and PS4 can... Yes, some changes will have to be made, but not changes that make the game unplayable.

So, this means there'll be third party support? OH WAIT! As I've said before, third party support has never been and will never be based on power.



It's bloody annoying, too, because if Nintendo had third party support then they would destroy.


Y U READ THIS?...WHY IS THERE TEXT HERE? LOL WTF
 

                                 Wii U ID : R00bot


#5 Nollog


    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 28 October 2013 - 02:34 AM

On Wii U, developers are going to have to go back to square one: treat the CPU like a CPU and the GPU like a GPU, and task each appropriately.  If you happen to have a CPU intensive game and aren't pushing the envelope in the graphics department, then perhaps learning to use the GPGPU capabilities of the GPU is in order.

Lernding how to code properly for a console?! In the 2000s?! Nope.

That is also why I claimed a few times that "perhaps" some games on PS4/Xbone CAN'T be ported to Wii U.
 
For example, if they used 10% of those consoles' GPUs for GPGPU, that could potentially work out to as much as the entire Wii U GPU.

10% eh?
I didn't realise the X1 and PS4 used the new Titan crossfired 3 times.
I thought they used an AMD 7770 or somewhere around that, which is about 20% more powerful than the 4800 series that Nintendo are rumoured to be basing their GPU on.

Edited by Nollog, 28 October 2013 - 02:36 AM.



#6 3Dude


    Whomp

  • Section Mods
  • 5,482 posts

Posted 28 October 2013 - 03:56 AM

lol. This is just getting pathetic now. Not that it's not a decent article, it actually is; it's just the media's changing behavior, and how transparently it's changed now that the Xbone and PS4 aren't true next gen leaps like their predecessors.

Suddenly, they are championing all the information they used to mock, because it's for the Xbone and PS4 instead of the Wii U....


 


#7 NintendoReport


    NintendoChitChat

  • Moderators
  • 5,907 posts
  • NNID:eddyray
  • Fandom:
    Nintendo Directs and Video Presentations

Posted 28 October 2013 - 05:24 AM

lol. This is just getting pathetic now. Not that it's not a decent article, it actually is; it's just the media's changing behavior, and how transparently it's changed now that the Xbone and PS4 aren't true next gen leaps like their predecessors.

Suddenly, they are championing all the information they used to mock, because it's for the Xbone and PS4 instead of the Wii U....

 

Yeah, I agree. The media has to change or twist things around a bit, as the next gen games don't lie.


Keep Smiling, It Makes People Wonder What You Are Up To!
PA Magician | Busiest PA Magician | Magician Reviewed | Certified Magic Professionals


#8 Goodtwin


    Bullet Bill

  • Members
  • 356 posts

Posted 28 October 2013 - 05:32 AM

lol. This is just getting pathetic now. Not that it's not a decent article, it actually is; it's just the media's changing behavior, and how transparently it's changed now that the Xbone and PS4 aren't true next gen leaps like their predecessors.

Suddenly, they are championing all the information they used to mock, because it's for the Xbone and PS4 instead of the Wii U....

 

To me the funniest part was that they were basically admitting that the simulation (gameplay) aspects took a back seat to creating better looking graphics.  There is a reason the CPU designs of the Cell and the tri core PPC in the 360 aren't the foundation for any of the latest and greatest CPUs today.  For CPUs, they were actually very good at graphics processing, and bad at what a CPU is supposed to do well.  That actually aided their ability to do graphics rendering this generation, but it left developers with very little headroom to really expand on the gameplay simulation aspect of games.
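A toy illustration of the difference: branchy, data-dependent game logic versus the straight-line streaming math those CPUs were built for. This is illustrative code, not from any shipped game:

#include <cstddef>
#include <cstdio>
#include <cstdlib>
#include <vector>

// The kind of code the post says these CPUs were bad at: a branchy,
// data-dependent AI state machine where each iteration's path depends on
// values the branch predictor can't guess ahead of time.
int updateAgents(std::vector<int>& states, const std::vector<int>& threats) {
    int alerts = 0;
    for (std::size_t i = 0; i < states.size(); ++i) {
        if (threats[i] > 50)      { states[i] = 2; ++alerts; }  // flee
        else if (threats[i] > 10) { states[i] = 1; }            // alert
        else                      { states[i] = 0; }            // idle
    }
    return alerts;
}

// The kind of code they were good at: predictable, branch-free math that
// streams through memory, much like the graphics work described above.
void scaleVelocities(std::vector<float>& v, float dt) {
    for (float& x : v) x *= dt;  // no branches; easy to pipeline and vectorize
}

int main() {
    std::vector<int> states(1000), threats(1000);
    for (int& t : threats) t = std::rand() % 100;  // unpredictable inputs
    std::vector<float> vel(1000, 1.0f);
    int alerts = updateAgents(states, threats);
    scaleVelocities(vel, 1.0f / 60.0f);
    std::printf("%d agents fleeing\n", alerts);
}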

 

In this console generation it appears that the CPUs haven't kept pace... which means that we might have to make compromises again in the game design to maintain frame-rate

 

 

It's not so much that CPUs haven't advanced (I think the i7 would like to say hi) as that developers have learned to create games that are far more demanding on the GPU than on the CPU, so it just makes sense these days to go with the best GPU you can afford.  Back in 2005 this wasn't really an option; the GPU in the 360 was pretty much cutting edge for the time.

 

As for the idea of next gen games using the GPGPU functionality on the PS4 and X1 to such a high degree that it could make a Wii U port impossible... sure, I suppose that could happen.  You really have to question whether a publisher is willing to fund a game that sacrifices graphics fidelity in order to really expand the gameplay simulation of a given genre.  Let's be real here: they are struggling to make 60fps 1080p happen without loading up the GPU with a bunch of general processing work, so how many developers are going to give up cycles on the GPU for that type of work?  I wouldn't expect to see a ton of cycles eaten up by general processing.  I'm not saying some developers won't use the functionality, but to think that you're going to use more than 10% of the GPU's cycles for those operations is pretty unrealistic.
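For a sense of scale, here is the frame-budget arithmetic behind that point, using the 10% figure from the discussion rather than any measurement:

#include <cstdio>

int main() {
    const double frame_ms    = 1000.0 / 60.0;  // 16.67 ms per frame at 60 fps
    const double gpgpu_share = 0.10;           // the 10% discussed above
    std::printf("frame budget:    %.2f ms\n", frame_ms);
    std::printf("GPGPU slice:     %.2f ms\n", frame_ms * gpgpu_share);
    std::printf("left for render: %.2f ms\n", frame_ms * (1.0 - gpgpu_share));
}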



#9 Alex Wolfers


    Thy Fur Consumed

  • Members
  • 2,768 posts
  • NNID:AxGamer
  • Fandom:
    Furry Fandom,gaming,trolling

Posted 28 October 2013 - 09:37 AM

The Wii situation and the Wii U situation are totally different. The Wii U can run anything the XB1 and PS4 can... Yes, some changes will have to be made, but not changes that make the game unplayable.

Ugh, crappy ports. At least the Wii U will have some neat original content.




#10 Goodtwin


    Bullet Bill

  • Members
  • 356 posts

Posted 28 October 2013 - 10:29 AM

http://arstechnica.c...php?f=2&t=13802

 

some game developer comments (on the record and off the record) have Xenon's performance on branch-intensive game control, AI, and physics code as ranging from mediocre to downright bad. Xenon will be a streaming media monster, but the parts of the game engine that have to do with making the game fun to play (and not just pretty to look at) are probably going to suffer. Even if the PPE's branch prediction is significantly better than I think it is, the relatively meager 1MB L2 cache that the game control, AI, and physics code will have to share with procedural synthesis and other graphics code will ensure that programmers have a hard time getting good performance out of non-graphics parts of the game.

 

 

It's like game developers spent the last 7 years identifying what the 360 and PS3 do well while actively avoiding the pitfalls.  Graphics are king in today's world, and although the Wii U does make a step up above the current gen consoles there, it's making a much bigger leap in its potential for better AI and physics.  The Espresso has tons of cache and short pipelines; it's basically really good at everything developers have worked around for years.  A huge cache with a short pipeline is going to make the Espresso a very predictable CPU.
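A toy model of why the short pipeline matters: every mispredicted branch throws away roughly a pipeline's worth of work. The stage counts below are commonly cited figures for the PPC 750 family and the Xenon PPE class, not official specs, and the branch statistics are pure assumptions:

#include <cstdio>

int main() {
    const double branches_per_1k = 200.0;  // assume branch-heavy game logic
    const double mispredict_rate = 0.10;   // assume 10% of branches mispredict
    const double short_pipeline  = 4.0;    // PPC 750 family (Espresso's roots)
    const double deep_pipeline   = 21.0;   // Xenon PPE class, commonly cited
    double flushes = branches_per_1k * mispredict_rate;  // 20 flushes
    std::printf("cycles lost per 1k instructions:\n");
    std::printf("  short pipeline: %.0f\n", flushes * short_pipeline);  // ~80
    std::printf("  deep pipeline:  %.0f\n", flushes * deep_pipeline);   // ~420
}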

 

The Cell has only one PPE to the Xenon's three, which means that developers will have to cram all their game control, AI, and physics code into at most two threads that are sharing a very narrow execution core with no instruction window. (Don't bother suggesting that the PS3 can use its SPEs for branch-intensive code, because the SPEs lack branch prediction entirely.) Furthermore, the PS3's L2 is only 512K, which is half the size of the Xenon's L2. So the PS3 doesn't get much help with branches in the cache department. In short, the PS3 may fare a bit worse than the Xenon on non-graphics code, but on the upside it will probably fare a bit better on graphics code because of the seven SPEs.

 

 

You can darn near come to the conclusion that both the CPU and GPU were there for graphics processing on these consoles.  Don't bother pushing the limits in any non graphics code, because they will fall flat on their faces.  The Wii U has so much potential in this area compared to current gen.


Edited by Goodtwin, 28 October 2013 - 10:33 AM.


#11 Alex Atkin UK


    Boo

  • Members
  • 528 posts

Posted 28 October 2013 - 10:58 AM

As for the idea of next gen games using the GPGPU functionality on the PS4 and X1 to such a high degree that it could make a Wii U port impossible... sure, I suppose that could happen.  You really have to question whether a publisher is willing to fund a game that sacrifices graphics fidelity in order to really expand the gameplay simulation of a given genre.  Let's be real here: they are struggling to make 60fps 1080p happen without loading up the GPU with a bunch of general processing work, so how many developers are going to give up cycles on the GPU for that type of work?  I wouldn't expect to see a ton of cycles eaten up by general processing.  I'm not saying some developers won't use the functionality, but to think that you're going to use more than 10% of the GPU's cycles for those operations is pretty unrealistic.

 

I will try to find the source, but I'm pretty sure I read that the reason for the PS4 having so many CUs was in fact that around 11 is the balance for the graphics; the other 7 were put there purely for GPGPU use.  The idea being that trying to use more for graphics would hit bottlenecks in the rest of the system, so they wouldn't really be much use.

 

Obviously they aren't using all that potential on games which come out on Xbox 360 and PS3 as well, but they would be crazy not to for games not targeting those consoles.  That is where I can see supporting the Wii U as well holding them back.


Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#12 Goodtwin


    Bullet Bill

  • Members
  • 356 posts

Posted 28 October 2013 - 11:22 AM

I will try to find the source, but I'm pretty sure I read that the reason for the PS4 having so many CUs was in fact that around 11 is the balance for the graphics; the other 7 were put there purely for GPGPU use.  The idea being that trying to use more for graphics would hit bottlenecks in the rest of the system, so they wouldn't really be much use.

 

Obviously they aren't using all that potential on games which come out on Xbox 360 and PS3 as well, but they would be crazy not to for games not targeting those consoles.  That is where I can see supporting the Wii U as well holding them back.

 

 

Sounds reasonable to me.  It would make sense, seeing as how no matter how you slice it, the Jaguar simply doesn't stack up to an i7 or even an i5 processor.  When you look at PC benchmarks for GPUs, they almost always use an i7 CPU to make sure they are not limiting the GPU in any possible way.  As it stands the GPU and CPU are pretty disproportionate, but if developers really take advantage of GPGPU, then the PS4 and X1 could be very well rounded machines.





