
Exactly how much of a downgrade would a PS4 game suffer on Wii U?



#41 3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 12 September 2013 - 10:20 PM

Well done, ogre.
I know what Bink is.
I want to know what makes Bink 2 such a super-duper feat.
It's just a video player/codec as I understand it.


Bink 2 is 75% SIMD instructions.

http://en.wikipedia.org/wiki/SIMD

Again, SIMD is like a factory of little CPUs all working on different little parts of the same thing.

This enabled Bink 2 to be (I think it's 12x now?) faster than Bink 1. And since it was so much faster, it allowed them to add real-time compression-artifact removal via pixel shaders, among other boons. All hugely SIMD-dependent. If you don't have the factory, you're stuck with Bink 1.

As long as you have modern SIMD on your processor, it's mostly easy as pie (although some people configure it wrong and it destroys their performance as it goes through the CPU).
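
To make the "factory" concrete, here is a minimal sketch in C of the same loop written one float at a time and then four at a time with SSE intrinsics. This is a generic SIMD illustration, not RAD's actual Bink code, and the alignment comment is one example of the "configure it wrong" trap:

    #include <xmmintrin.h>  /* SSE intrinsics: 4 x 32-bit float lanes */

    /* Scalar: one multiply-add per loop iteration. */
    void scale_add(float *dst, const float *a, const float *b, float k, int n)
    {
        for (int i = 0; i < n; i++)
            dst[i] = a[i] * k + b[i];
    }

    /* SIMD: four multiply-adds per iteration -- the "factory". */
    void scale_add_sse(float *dst, const float *a, const float *b, float k, int n)
    {
        __m128 vk = _mm_set1_ps(k);          /* broadcast k into all 4 lanes */
        int i = 0;
        for (; i + 4 <= n; i += 4) {
            /* _mm_loadu_ps tolerates unaligned pointers (at a cost on older
               chips); the faster _mm_load_ps requires 16-byte alignment, and
               getting the data layout wrong is a classic way to lose the
               speedup. */
            __m128 va = _mm_loadu_ps(a + i);
            __m128 vb = _mm_loadu_ps(b + i);
            _mm_storeu_ps(dst + i, _mm_add_ps(_mm_mul_ps(va, vk), vb));
        }
        for (; i < n; i++)                   /* scalar tail for the leftovers */
            dst[i] = a[i] * k + b[i];
    }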

Even ARM processors... provided you have NEON for SIMD.

Espresso has no 'modern' SIMD. The G3 750 series is a non-SIMD CPU. The G4 series was a 750 with AltiVec bolted on for SIMD... but Nintendo decided to go their own route.

Of course it has SIMD for Nintendo's systems; Nintendo added fifty-something new SIMD instructions... paired singles: IIRC 2x32-bit SIMD, around 2 GFLOPS for single precision.
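
As a rough sanity check on that figure (assuming Gekko's 486 MHz clock, and counting a fused multiply-add on each of the two packed singles as two ops):

    486 MHz × 2 singles × 2 ops (FMA) ≈ 1.9 GFLOPS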

It had to, to crunch geometry; it was a game system, so it was necessary. And it kicked butt pretty damn good back in the day. The Cube is still king of the hill for in-game polygon counts of its generation. By a lot.

But a lot has happened since then. Apple, IBM, and Motorola came up with AltiVec, then on the Intel side we had SSE2 and 3...

Yet Nintendo's CPUs' SIMD has remained the same: the 2x32-bit paired singles. The Wii's clock increase boosted the flop count a bit, as did Espresso's near-2x increase, the bigger caches, and now three cores (Bink 2 only uses two). But as far as we know, and RAD Game Tools seems to support this, it's still the same SIMD from the GameCube... not SSE3 or even SSE2, or even the AltiVec the rest of the world now uses. It's apparently so removed from modern SIMD parallel computing engines that RAD calls it a 'non-SIMD CPU'.
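
For scale, by the same arithmetic as above, and taking Espresso's commonly reported 1.24 GHz clock (an assumption, not something stated in this thread):

    1.24 GHz × 2 singles × 2 ops × 3 cores ≈ 14.9 GFLOPS peak, or ≈ 9.9 GFLOPS on the two cores Bink 2 uses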

I didn't see an issue with this, as the days of the CPU handling all or even the bulk of geometry crunching should be over, and it certainly looks like Nintendo intended that to be handled by the graphics unit on the Wii U (as opposed to the 360, and especially the PS3, using the flop power of their CPUs to generate geometry). So this 'weak' SIMD didn't seem like an issue to me. It was more than enough for what a CPU needed in that situation.

But apparently the performance has improved considerably via the upclock and the cache increases. Enough to surprise RAD Game Tools.


So here we have this CPU with 'no SIMD', as the people making the product see it, running a brand-new, very SIMD-demanding media application designed specifically to function via SIMD.

It's surprising that it packs such a punch in an area it's supposed to be very weak in. It's not 'supposed' to really pack any punch here. It's surprising that Nintendo and IBM's solution from back in the GameCube era is scaling up to be this competent today, in this particular area. It was really a fantastic design.


 


#42 GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 12 September 2013 - 10:39 PM

Bink 2 is 75% SIMD instructions. [...] It was really a fantastic design.

 

 

I just can't wait to see the game that really pushes the Wii U over the top. I think Zelda U is going to be the first "OMG, no way this is running on Wii U" game. Patience, I guess, because I just don't see a third-party publisher really trying to push out the best graphics possible for Nintendo gamers.



#43 3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 12 September 2013 - 10:51 PM

I just can't wait to see the game that really pushes the Wii U over the top. I think Zelda U is going to be the first "OMG, no way this is running on Wii U" game. Patience, I guess, because I just don't see a third-party publisher really trying to push out the best graphics possible for Nintendo gamers.


My guess is it's going to come from Monolith Soft with X. Probably a year or two before the next Zelda (which they'll help with again).

Reading that Iwata Asks from back in January really paints them as the best-prepared team at Nintendo right now.

Before Nintendo bought them, they had invested heavily in R&D for HD technology, as they were a third party.

They stated they were quite eager to show the fruits of that labor, excitedly talking about physics and shaders.

They are recruiting top-class talent in high-end fields by the bucketful... They are probably the largest team at Nintendo right now, by a considerable margin.

And looking at the night-and-day difference between the X trailers that were only a few months apart... I'm dying to see what it looks like now.

Although the new engine Shin'en is working on will most likely be pretty awesome too.


 


#44 GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 13 September 2013 - 03:41 AM

Yeah, it's just hard waiting. I know the Wii U has a lot more in store from a technical standpoint... and I can't wait to see and experience those games, the games that will make the gaming world take the Wii U seriously as a "NEXT-GEN" system from a graphical standpoint. It's more than just tech specs and polygon pushing... it's also art style, which Nintendo are masters at. Right now, in the so-called "HD" age, Pikmin 3 has been the most enjoyable game from a graphical standpoint I have played. The world just has so much life; it's a game I could play every day just because of the beauty and seamlessness of the world. It has basically gotten me so excited to see what Nintendo has in store for 2014 and beyond.



My guess is it's going to come from Monolith Soft with X. [...]

 

 

Question: what disadvantage does the Wii U have in RAM compared to the XB1 and PS4? Because right now that's the only problem I see with the console. I've done my homework... yes, they are more powerful, but the PS4 and XB1 aren't anything special tech-wise. All three consoles are going to have amazing-looking games when developed for properly... but we are hearing devs talk about struggling to get games working on the PS4 and XB1. The thought from gamers was that "ALL" XB1 and PS4 games were going to be 1080p native and 60fps. Those days are long gone; that isn't going to happen. And if the Wii U can hit 720p at 30 and 60 fps with most of its games, with "next-gen" features from its GPU, I think most people are going to fall in love with what it can do.


Edited by GAMER1984, 12 September 2013 - 11:38 PM.


#45 Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 13 September 2013 - 09:26 AM

I don't know that the Wii U has any advantage in the memory department over the PS4 or X1; both of those consoles have very high-performing memory systems. Sony went with a straightforward brute-force method by going with GDDR5. Microsoft went with DDR3, but with a large bus to maximize bandwidth; still not nearly as high as Sony's, but Microsoft also included 32MB of esram to help alleviate the bandwidth needs of the main memory, which I assume will be used in a similar way to the edram in the Wii U.

Nintendo's setup is simply better than it looks on paper. When you look at the 12.8GB/s to the main memory, you would think the Wii U is a giant turd compared to even the 360 and PS3, which have nearly double that amount. The reality is that a ton of the bandwidth requirements are eaten up by frame buffers, not just high-resolution textures. So in my opinion, Nintendo most likely came to the conclusion that their bandwidth-to-capacity ratio was correct, meaning that when factoring in how many high-resolution textures can even be stored in the main memory, the bandwidth would be sufficient to supply them to the GPU. It's all a ratio; there is no reason to have tons of bandwidth for textures if your memory capacity can't hold them.

I think sometimes people underestimate the amount of research and development Nintendo goes through with their hardware. So many times people look at the components and bash Nintendo for choosing lower-end parts. Do these people really believe that Nintendo would cripple the performance of the system they will be developing software for? No: through the R&D process, Nintendo tests components and comes up with cost-effective ways to get the performance they are targeting.
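
To put a rough number on the frame-buffer point, here is a back-of-the-envelope sketch in C; the buffer formats and the overdraw factor are assumptions picked purely for illustration:

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical 720p frame-buffer traffic estimate. */
        const double pixels   = 1280.0 * 720.0;
        const double fps      = 60.0;
        const double overdraw = 3.0;   /* average shaded fragments per pixel */
        const double color_wr = 4.0;   /* 32-bit color write, bytes          */
        const double depth_rw = 8.0;   /* 32-bit depth read + write, bytes   */

        double per_frame = pixels * overdraw * (color_wr + depth_rw);
        printf("~%.1f GB/s on the frame buffer alone\n",
               per_frame * fps / 1e9);  /* prints ~2.0 GB/s */
        /* Real scenes add texture fetches, alpha blending (color reads) and
           multiple render targets, so keeping this traffic in 32MB of edram
           instead of on the 12.8GB/s DDR3 bus is a big win. */
        return 0;
    }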

 

Like 3Dude said, it's damn impressive what an upgraded PPC750CL processor can do. People like to bash Nintendo for using an older processor design, but these criticisms aren't founded when it comes to actual game performance. We know the CPU isn't a beast when it comes to flops, no secret there, but what's surprising is that it performs better than expected even in that category, while in general processing it is exceptional. Not exceptional compared to an i7, but at least on par with the X1/PS4 CPUs.

 

So in the end, when Shin'en says the console is well balanced, I think people need to take that to heart. It's up to the developers to make sure their code is balanced. The CPU shouldn't be doing graphics processing while the GPU sits at 80%. The PS3 used the CPU for some graphics processing because the GPU was maxed out and the SPUs on the CPU could pick up the slack. Developers had to figure that out with the PS3; developers who didn't make good use of the SPUs were losing a huge amount of performance.

What developers may find with the Wii U is that while doing geometry on the CPU may have made sense on previous-gen consoles, moving it to the Wii U's GPU may only cost a couple percent of GPU performance while freeing up a ton of performance on the CPU. It's all about overall performance. Maybe the GPU is capable of running a given game at 50fps, but the CPU limits things to about 20fps. By moving some of the workload to the GPU, we bring things back into balance, so the game can run at a solid 30fps. They have to find the right balance, and the exclusives will show this, and are showing this. Multiplat games will improve, and I will be interested to see how Treyarch does with Ghosts. They knew how to work the Wii hardware; I would expect them to at the very least match the 360's level of performance, and hopefully exceed it.
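
That 50fps/20fps/30fps example works out cleanly in frame times. A minimal sketch, assuming the CPU and GPU run in parallel and the slower one sets the frame time:

    #include <stdio.h>

    /* Simplified model: CPU and GPU work in parallel each frame, so the
       slower of the two sets the frame time. */
    static double fps(double cpu_ms, double gpu_ms)
    {
        double frame_ms = (cpu_ms > gpu_ms) ? cpu_ms : gpu_ms;
        return 1000.0 / frame_ms;
    }

    int main(void)
    {
        printf("%.0f fps\n", fps(50.0, 20.0)); /* CPU-bound: 20 fps        */
        /* Shift ~17ms of work from the CPU to the GPU: 33ms vs 27ms...    */
        printf("%.0f fps\n", fps(33.3, 27.0)); /* balanced: roughly 30 fps */
        return 0;
    }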



#46 Dharmanator

    Thwomp

  • Members
  • 336 posts

Posted 13 September 2013 - 09:36 AM

Indeed.

Look at those screens: http://www.neogaf.co...t=676477&page=4

Compare that with the reveal trailer:

Also, diminishing returns.

I loved that thread. Some people say it looks great... others that it is awful. Some said PS3 graphics. That is laughable.

I liked the guy who said the pics were from two different builds, though.

The downgrade is obvious, though, and I do not like to be deceived. But let us see the shipped product before we bash it too much.

Still looks good... but yes... pretty devious as is.

Edited by Dharmanator, 13 September 2013 - 09:37 AM.


#47 namkotje

    Spiny

  • Members
  • 221 posts

Posted 13 September 2013 - 10:13 AM

Still looks good...

Indeed! And it will still look great! But compare the screen with God of War: Ascension on PS3:

 

God-of-War-Ascension-multiplayer.jpg

 

It pretty much shows that the Wii U can look good enough, since the Wii U is stronger than the PS3. :)


Edited by namkotje, 13 September 2013 - 10:14 AM.



#48 GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 13 September 2013 - 12:17 PM

I don't know that the Wii U has any advantage in the memory department over the PS4 or X1. [...]


Yeah, it's just sad that we are hoping the Wii U can get performance on the same level as a console that launched in 2005.

#49 Dharmanator

    Thwomp

  • Members
  • 336 posts

Posted 13 September 2013 - 01:25 PM

All this tech talk is pretty good. Nice to hear the Wii U may not be too far behind.

But what I don't like is the fact that, almost a year into the life of the Wii U, there is no "WOW" game.

Infamous looks pretty sweet. But it is a launch title, so we can't expect too much. Still, it is a good sign. Deep Down still looks promising. KZ too.

It just really sucks that it is taking so long. Of course, good stuff always takes time. But the stuff you see for the other guys really does take the wind out of the Wii U's sails.

Yes, X looks good... but when is it coming out? Maybe it will be too late. People will already have made their choices.

#50 Desert Punk

    Chain Chomp

  • Members
  • 656 posts

Posted 15 September 2013 - 02:31 AM

Another thread of fanboy crap with no connection to the real world.

The Wii U is in the same area of performance as the PS3/360, and they have as much claim to being next gen as the Wii U does.

Again, for people stupid enough to read these threads and actually believe them: you need to look elsewhere, like Eurogamer face-offs, Lens of Truth, etc., and comparison reviews in general.

As for the comment about the GameCube vs the PS2: again, you can't just say the GameCube is superior, full stop. Each format has advantages and disadvantages. The PS2 has 5.1 sound, not 2-channel sound; it has 32-bit colour, not 24-bit, enabling some good transparency effects, bloom lighting, etc.; and it has full-size DVDs, not mini-DVDs. Games like Gran Turismo on the PS2 are 1080i. Clearly the GameCube has other huge advantages over the PS2, like a massively more powerful CPU and a GPU that is more powerful in most areas. Certain games will have advantages on different hardware depending on what resources they need.

It is so sad to see the denial of reality in this forum. You are so out of sync with the real-world views of the Wii U, technically, that you see on multi-format forums.

The answer to the question posed by this thread is that the few PS4 or Xbox One games that get downported to the 360, PS3, or Wii U will be very similar across all three, with a greater chance that they will perform a little better on the 360 and PS3, as has been the situation so far. The only real chance of the Wii U version performing between 360/PS3 and PS4/Xbone is if it's a game that has lower CPU requirements but relies on more memory being available.

I hope when Watch Dogs comes out and we see how the Wii U performs (very close to or below the 360/PS3) we can stop half the unicorn tongue on these threads.



#51 Poptartboy

    Spiny

  • Members
  • 249 posts

Posted 15 September 2013 - 03:36 AM

Lol, more nonsense from you. Why do you still bother to post such rubbish all the time?

Talk about 'denial of reality'....

#52 3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 15 September 2013 - 08:19 AM

Another thread of fanboy crap with no connection to the real world. [...]


*Takes a break from Operation 1-02 of The Wonderful 101, reads Punk's desperate FUD*

Hmm....

*Looks back at Operation 1-02*

PPPFFFFFTTTTTT HA HA HA HA HA, HA HA HA HA HA HA!!!!

HHHAAAAA!!!!!! HHHHAAAAA!!!!!!

Oh wait you were serious.



 


#53 Nollog

    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 15 September 2013 - 08:30 AM

Another thread of fanboy crap with no connection to the real world.

The Wii U is in the same area of performance as the PS3/360, and they have as much claim to being next gen as the Wii U does.

How so?
The 360 and PS3 were released 6 or 7 years ago, a generation before the Wii U.
I'm confused.
Silly pony.

Edited by Nollog, 15 September 2013 - 08:31 AM.



#54 Zinix

    YA HOMIE.

  • Members
  • 4,410 posts
  • NNID:zinixzero
  • Fandom:
    The Twilight Zone Fandom

Posted 15 September 2013 - 08:34 AM

Hey, Desert Punk is back.


“Any state, any entity, any ideology that fails to recognize the worth, the dignity, the rights of man, that state is obsolete.”— Rod Serling, “The Twilight Zone” The Obsolete Man

Smoke meth. Hail Satan. Watch the yearly Twilight Zone marathons. Talk to dead people. Everyone is gay. Ignore people. Live life to the fullest.


#55 GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 15 September 2013 - 05:34 PM

Another thread of fanboy crap with no connection to the real world. [...]


Let me ask you a question, Punk... I'm trying to be serious with you. We have evidence from changelogs that developers are using the same "EFFECTS" for their game as the PS4 version. Yes, the Wii U isn't going to have the same "POWER" as the PS4... but it will be able to pull off the same type of (DX11-class) effects, as neither is a native DirectX machine; only Xbox machines and PCs are. I've read some of your posts on here from over a year ago, and you didn't always have this negative view of the Wii U's tech ability. Real talk: the Wii U will be able to compete. It's not the same spec for spec, but what does it matter, if the games look good (X, MK8, Bayo 2, all 2014 games)? I believe 2014 will be the year the Wii U flexes its muscles and shows how much more capable it is than the PS360. What happened to you? When did it all go to sh!* for you? The PS360 had a "combined" 512MB of memory; the Wii U has 1GB just for its games. That alone shows how much more capable it is, not even taking into account the modern GPU capable of the same effects as the PS4. Come on, man.

#56 Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 16 September 2013 - 05:15 AM

DP, nobody is saying that the Wii U is on par with the X1/PS4. We are only saying that it definitely has a modern GPU capable of more modern effects; a CPU that, while not as good in flops performance as the current-gen consoles, absolutely destroys them in general processing; twice the amount of memory; tons of cache and edram; and Nintendo is still working on its dev kits to improve access to these features.

So while the Wii U may only be a modest step up in raw performance, we are seeing the X1 and PS4 struggling to run a 360/PS3 game at 1080p 60fps. Let's face it, Battlefield 4 is showing the world that the performance it's going to take to make a generational leap is very costly; improving resolution, framerate, and graphical fidelity is not an easy task, even with straightforward hardware designs like the PS4 and, to a lesser extent, the X1. Bayonetta 2 shows the improvement over the previous gen, and that's the same developer working on a sequel.

The feature set may actually be worth more than a simple increase in flops performance. For example, a 350-gflop DX11 GPU may actually produce better visuals than an 800-gflop DX9 GPU, assuming the game is developed to make full use of those DX11 features. Flops performance isn't everything. Even with the original Xbox, it really was much more powerful than the GC or PS2, but its built-in shaders are what made it very easy to make some very nice-looking games. Just like polygons per second never told the whole story, flops don't tell the whole story either. It's just what your average enthusiast pays attention to, so Sony went and advertised it, because their machine is impressive in that category. There are most certainly some areas that aren't as impressive, and that's why even Sony doesn't release a full spec sheet anymore.



#57 GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 16 September 2013 - 08:24 AM

DP, nobody is saying that the Wii U is on par with the X1/PS4. [...]


It's sad; in the generation we live in, the core gamer has no idea. They are chasing photorealism and any big number developers or manufacturers throw out. They don't even know the ins and outs of developing a console and how all those parts fit together, but they know 1.8 TF is larger than 1.2 TF. On NeoGAF you have a whole group of people trying to discredit Bayonetta 2's visuals as not having anything to do with the Wii U, stating that it has more to do with the maturity of the game engine, and that if the game were released on the PS360 it would look and run the same. This is the one game I would love to see go multiplatform to the PS360, just so they could see what downgrades would need to take place and how superior the Wii U version would be. But the reality is the graphics whores and tech elite will never look at the Wii U and have respect for what it can do. I'm done defending the Wii U's capabilities to others... their minds are made up.

#58 Poptartboy

    Spiny

  • Members
  • 249 posts

Posted 16 September 2013 - 09:04 AM

That's the best way to be, really. Personally, I have already seen stuff that was better than last gen in Pikmin 3 and W101, so I'm happy.

True, there looks to be more detail in the games on the new consoles, but I don't feel like my Wii U is lacking too much, like the Wii was.

I'd always like better graphics, of course, but what can you do.

#59 GAMER1984

    Lakitu

  • Members
  • 2,036 posts
  • NNID:gamer1984
  • Fandom:
    Nintendo

Posted 17 September 2013 - 04:09 AM

This sums up our generation of gamers quite well...

http://m.youtube.com...eature=youtu.be

#60 Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 17 September 2013 - 05:35 AM

Your average gaming enthusiast only knows what their favorite news outlet tells them. If it's trendy to talk about polygons per second, then that's what the media will talk about, and lately flops are all the rage, so that's what has taken center stage.

If people were to do any research at all, they would see not only that 10x the performance doesn't yield a linear improvement in graphics, but that the jump from the PS2 to the PS3 in flops was far more significant than the jump from the PS3 to the PS4. The PS2 was about 6 gflops, compared to the PS3's Cell+RSX at 250-500 gflops. We were talking about a 50x jump in flops from the PS2 to the PS3, and now we have a PS4 that is about a 6x jump over the PS3. In outright performance, the new consoles simply aren't the same enormous jump that we had the previous two generations. So even the spec that has been singled out as the be-all indicator of performance isn't that much of a jump. It sounds like it is, until you realize the PS3 was 50x the PS2; all of a sudden 6x doesn't seem so significant. The feature set on the modern GPU will be more significant: all those new DX11 shaders and features will be more pronounced than the increase in flops performance.
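
Taking the post's own figures, plus the commonly quoted 1.84 TFLOPS for the PS4's GPU (an assumption, not stated above), the ratios come out roughly as described:

    250-500 GFLOPS (PS3) ÷ 6 GFLOPS (PS2) ≈ 40-80x, versus 1840 GFLOPS (PS4) ÷ 250-500 GFLOPS (PS3) ≈ 4-7x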





