
Activision says Wii U is next gen


123 replies to this topic

#81 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 07 May 2013 - 03:24 PM

I'll cave in. The Wii U is a technological achievement and is next generation. No sarcasm intended.

Do I agree with that? Not really. But people tend to dominate me into opinions in life so they might as well have their way here too.

#82 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 07 May 2013 - 03:47 PM

I'll cave in. The Wii U is a technological achievement and is next generation. No sarcasm intended.

Do I agree with that? Not really. But people tend to dominate me into opinions in life so they might as well have their way here too.

You seem to have missed the point.  No one wants your "cave in" or anything of the sort.  It's about getting the right information out there, not about winning arguments for the sake of winning.

 

And it isn't about domination or control, either.  I've been proven wrong and corrected here before as I have elsewhere.  Though I might not always succeed, I try to take it with grace and accept that I simply cannot know everything or be right about everything, but by God I will have an opinion about everything.

 

So in that vein, I try to frame my opinion as opinion, and fact as fact.  Representing opinion as fact is dangerous and can make a person feel foolish.  What I know and what I think are two separate things.


Edited by routerbad, 07 May 2013 - 03:49 PM.


#83 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 07 May 2013 - 04:03 PM

You seem to have missed the point.  No one wants your "cave in" or anything of the sort.  It's about getting the right information out there, not about winning arguments for the sake of winning.
 
And it isn't about domination or control, either.  I've been proven wrong and corrected here before as I have elsewhere.  Though I might not always succeed, I try to take it with grace and accept that I simply cannot know everything or be right about everything, but by God I will have an opinion about everything.
 
So in that vein, I try to frame my opinion as opinion, and fact as fact.  Representing opinion as fact is dangerous and can make a person feel foolish.  What I know and what I think are two separate things.


Well, I for one believe that the Wii U will not be very powerful compared to the PS4 and 720, because the GPU is significantly weaker, and believing that the CPU is extremely powerful requires trust. Until you are a developer who has tried all three systems, saying that anything is good requires trust.

Personally I believe that if I were to pay the $8000 I need to develop a Wii U game, which may happen some day, that the performance of the Wii U will be a bit disappointing and that I will find it only about twice as powerful as my iPad 4. This is just a hunch and I would be using Unity so it probably wouldn't be optimized well for the Wii U, but I have to think in terms of what may actually happen for me sometimes too.

Now back to talking about commercial games that are optimized for the Wii U. The Wii U GPU is about 1.5x as powerful as the 360 GPU, I hear. That is actually an incredibly generous figure, and it might even be less. I believe the GPU to be the biggest differentiator of the system, so if nothing else is more than 1.5x the 360, the system as a whole is only about 1.5x as powerful as the 360. That's not really next gen. You can point to various performance factors and technological improvements that have been made, but I have weighed each one carefully. I have even tried SDKs showing off various graphics effects when they were new and got to see the difference between them, probably even in CPU usage and the like.

#84 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 07 May 2013 - 06:38 PM

PotatoHog, on 07 May 2013 - 10:03 AM, said:
Well I for one believe that the Wii U will not be very powerful compared to the PS4 and 720 because the GPU is significantly weaker and believing that the CPU is extremely powerful requires trust. Until you are a developer who has tried all three systems, saying that anything is good requires trust.

Personally I believe that if I were to pay the $8000 I need to develop a Wii U game, which may happen some day, that the performance of the Wii U will be a bit disappointing and that I will find it only about twice as powerful as my iPad 4. This is just a hunch and I would be using Unity so it probably wouldn't be optimized well for the Wii U, but I have to think in terms of what may actually happen for me sometimes too.

Now back to talking about commercial games that are optimized for the Wii U. The Wii U GPU is about 1.5x as powerful as the 360 GPU, I hear. This is actually an incredibly generous figure and it might even be less. I believe the GPU to be the biggest difference of the system so if all things are not more than 1.5x times the 360, it's about 1.5x times or so as powerful as the 360. That's not really next gen. You can point to various performance factors and technological improvements that have been made, but I have weighed each one carefully. I have even tried sdks showing off various graphics effects when they were new and got to see the difference between them, probably even in CPU usage and the like.


1. NOT A SINGLE PERSON HAS SAID ANY OF THE CRAP YOU ARE TRYING TO PUT IN OUR MOUTHS.


2. Quit using the damn word generation. It does not mean what you are trying to make it mean; no definition of generation ever written anywhere means what you are trying to make it mean. You do not get to change the definition of a word to fit your imaginary viewpoint. WTF is wrong with you? The Xbox stomped the Dreamcast; does that mean the Dreamcast was part of the N64/PSX gen? No, it doesn't work that way. Quit making the public school systems look bad.

3. Wii U devkits cost $5,000, not $8,000, or are free if you can show Nintendo you have anything worth a crap.

 



 


#85 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 07 May 2013 - 07:31 PM

Well I for one believe that the Wii U will not be very powerful compared to the PS4 and 720 because the GPU is significantly weaker and believing that the CPU is extremely powerful requires trust. Until you are a developer who has tried all three systems, saying that anything is good requires trust.

Personally I believe that if I were to pay the $8000 I need to develop a Wii U game, which may happen some day, that the performance of the Wii U will be a bit disappointing and that I will find it only about twice as powerful as my iPad 4. This is just a hunch and I would be using Unity so it probably wouldn't be optimized well for the Wii U, but I have to think in terms of what may actually happen for me sometimes too.

Now back to talking about commercial games that are optimized for the Wii U. The Wii U GPU is about 1.5x as powerful as the 360 GPU, I hear. This is actually an incredibly generous figure and it might even be less. I believe the GPU to be the biggest difference of the system so if all things are not more than 1.5x times the 360, it's about 1.5x times or so as powerful as the 360. That's not really next gen. You can point to various performance factors and technological improvements that have been made, but I have weighed each one carefully. I have even tried sdks showing off various graphics effects when they were new and got to see the difference between them, probably even in CPU usage and the like.

That's actually completely inaccurate.  The Wii U GPU is ~4x more powerful than the 360 CPU and GPU combined.  The Wii U CPU doesn't require trust at all; we know what its design is based on by IBM's own admission, and even Watson himself.  Put one of the Wii U cores against a Jaguar core and it will trounce it.  It isn't as parallel as the PS4 CPU, but given similar clock speeds each core has more cache to work with and can execute more instructions per clock.  This is basic.

 

Again, next gen isn't some performance threshold that needs to be met to break a barrier, and even if it were, the Wii U would certainly break through it.  Generation refers simply to when a console came out, period.

 

And I'm sorry, but with your obviously weak grasp of computer architecture I would have to willingly suspend disbelief to accept that you have had any experience with computer systems at all, much less hardware development kits from Nintendo, Microsoft, or Sony.  Someone may have shown you something on them once, but they aren't toys.


Edited by routerbad, 07 May 2013 - 07:35 PM.


#86 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 08 May 2013 - 04:46 AM

I don't see how the Wii U GPU is 4x as powerful as the 360 GPU. They have the same amount of ROPs and TMUs, and the Wii U has only a few more shader units. The clock speed of the GPU is similar as well.

I was also talking about PC SDKs.

And 3Dude, you are right about the dev kit costing $5000. The fact that you know this shows you know a lot. But then there's $500 for registering a company and the development costs of the game.

#87 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 08 May 2013 - 08:26 AM

I don't see how the Wii U GPU is 4x as powerful as the 360 GPU. They have the same amount of ROPs and TMUs, and the Wii U has only a few more shader units. The clock speed of the GPU is similar as well.

I was also talking about PC sdks.

And 3Dude, you are right about the dev kit costing $5000. The fact that you know this shows you know a lot. But then there's $500 for registering a company and the development costs of the game.

You are referring to only the 50% of the GPU that we know about; the other half is still doing something.  Also, no one knows for sure how many TMUs and ROPs there are.  There are educated guesses based on what is known of other AMD GPUs, but those guesses fail to take into account that the entire GPU is custom; the only thing that looks to have even been derived from an AMD GPU is the SIMD engines.  Everything else is unique.

 

Also, at 90nm, Xenos was 180mm², while at 40nm, Latte is 450mm²; there is a lot more going on in there than anyone knows. It also has much more modern logic that supports modern engines, APIs, and effects.
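As a rough sanity check on the die-size comparison in this post, the two dies can be normalized to the same process node; under ideal scaling, area shrinks with the square of the feature size. A minimal sketch, using the figures quoted above (forum numbers, not confirmed specs):

```python
# Rough die-area normalization across process nodes.
# 180 mm^2 @ 90 nm (Xenos) and 450 mm^2 @ 40 nm (Latte) are the figures
# quoted in the post above -- forum claims, not confirmed specs.
def scaled_area(area_mm2, from_nm, to_nm):
    """Ideal-shrink approximation: area scales with (feature size)^2."""
    return area_mm2 * (to_nm / from_nm) ** 2

xenos_at_40nm = scaled_area(180.0, 90, 40)  # Xenos shrunk to 40 nm: ~35.6 mm^2
ratio = 450.0 / xenos_at_40nm               # ~12.7x the area budget, by this estimate
print(xenos_at_40nm, ratio)
```

Real shrinks never achieve ideal scaling (SRAM and I/O scale worse than logic), so this only puts a loose upper bound on the transistor-budget gap between the two chips.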

 

Okay, you've played with SDKs; I don't get your point.  Anyone can download an SDK and run the test code.


Edited by routerbad, 08 May 2013 - 08:27 AM.


#88 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 08 May 2013 - 05:38 PM

You are referring to only the 50% of the GPU that we know about, the other half is still doing something.  Also, no one knows for sure how many TMUs and ROPs there are.  There are educated guesses based on what is known of other AMD GPU's, but those guesses fail to take into account that the entire GPU is custom, the only thing that looks to have been even derived from an AMD GPU is the SIMD engines.  Everything else is unique.
 
Also, at 90nm, Xenos was 180mm2, while at 40nm, Latte is 450mm2, there is a lot more going on in there than anyone knows. It also has much more modern logic that supports modern engines, API's, and effects.   
 
Okay, you've played with SDK's, I don't get your point.  Anyone can download an SDK and run the test code.


There was a picture or pictures taken of the insides of the Wii U, and the experts now believe it has 8 ROPs, 16 TMUs, and 320 shader units. And we know the clock speed: 550MHz.

http://www.eurogamer...inally-revealed

The PS4 is said to have 8 ROPs, 18 TMUs, and 1152 shader units, and a clock speed of 800MHz.

I suppose you're going to say that the experts are wrong and don't know how to properly understand the Wii U GPU though?
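Taking the figures in this post at face value (they are forum-reported, and later posts in the thread dispute the PS4 ones), the headline fillrates they imply are easy to work out: a ROP writes roughly one pixel per clock and a TMU samples roughly one texel per clock. A sketch under those assumptions:

```python
# Theoretical fillrates from the figures quoted in this post.
# These are forum-reported numbers, not official specs.
def pixel_fillrate_gpix(rops, clock_mhz):
    return rops * clock_mhz / 1000.0    # assumes one pixel per ROP per clock

def texel_fillrate_gtex(tmus, clock_mhz):
    return tmus * clock_mhz / 1000.0    # assumes one texel per TMU per clock

print(pixel_fillrate_gpix(8, 550))   # Wii U:          4.4 GPixel/s
print(pixel_fillrate_gpix(8, 800))   # PS4 (as quoted): 6.4 GPixel/s
print(texel_fillrate_gtex(16, 550))  # Wii U:          8.8 GTexel/s
print(texel_fillrate_gtex(18, 800))  # PS4 (as quoted): 14.4 GTexel/s
```

These peak rates assume every unit retires work every clock, which real workloads never achieve; they are only useful for comparing the quoted specs against each other.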

#89 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 08 May 2013 - 07:45 PM

There was a picture or pictures taken of the insides of the Wii U and the experts now believe it has 8 rops, 16 tmus, and 320 shader units. And we know the clock speed, 550MHz.

http://www.eurogamer...inally-revealed

The PS4 is said to have 8 rops, 18 tmus and 1152 shader units, and a clock speed of 800MHz.

I suppose you're going to say that the experts are wrong and don't know how to properly understand the Wii U GPU though?

Funny you should reference the Eurogamer article that pissed off the GAF users who actually made the die shot happen.  The article you referenced never credited NeoGAF as the source of the information, and it jumped to conclusions that were still being actively explored at the time in order to get quick clicks.  Everything in that article is garbage.

 

If you want to see the real thing, go to GAF or the thread here. 

 

I am going to say that the "experts" who actually analyze DIE SHOTS for a living, the ones who took the microlithographs, have said quite a bit about the chip.  First, that it is completely custom; the only bit of logic that resembles anything from AMD is the SIMD engines.  Everything else: totally custom.  They placed the price of the chip at ~$100 just for the GPU die.  Another thing: it's massive, more massive than most GPUs, and the eDRAM doesn't account for much of the size.  Here's the most important bit: the "experts" can't figure out what about 50% of the Wii U GPU logic is there for.

 

 

I'll say it again, the "experts" have said that they can't figure out what 50% of the GPU logic is there for.  THEY HAVE SAID THIS.

 

Since you used Eurogamer as your source, I'll assume you haven't read through any of the GAF thread, where all of the actual info is, and where all of the leaks happened.  There are people posting in that thread who have inside info from both developer and engineer perspectives.  Last I saw, there was no consensus on exactly how many ROPs and TMUs there were, because they were hard to physically pinpoint on the die.  There were also suggestions that there were asymmetric ALUs, meaning they weren't lined up like they are on typical GPUs, and that the 8 ALUs we saw were only a portion of the total number.

 

There was also a leak, that the other 50% of the chip was dedicated to effects that typically require a lot of SIMD performance and put a lot of strain on the SIMD engines.

 

It isn't all clock speeds, FLOPS, and ALUs, son.  There is much more going on in a GPU than that, which is why I keep saying those numbers only tell part of the story.

 

Also, no one, not one single person on this board, ever, has argued that the PS4 is less powerful than the Wii U, so stop trying to argue against a strawman.  Putting words in our mouths will get you nowhere.

 

What we have said is that the power difference won't be enough to make a huge difference visually.



#90 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 09 May 2013 - 04:27 AM

Amnesia, on 08 May 2013 - 11:38 AM, said:
There was a picture or pictures taken of the insides of the Wii U and the experts now believe it has 8 rops, 16 tmus, and 320 shader units. And we know the clock speed, 550MHz.

http://www.eurogamer...inally-revealed

The PS4 is said to have 8 rops, 18 tmus and 1152 shader units, and a clock speed of 800MHz.

I suppose you're going to say that the experts are wrong and don't know how to properly understand the Wii U GPU though?




You might want to double-check your PS4 info, because if that's true, the system is laughably weaker than I expected it to be, and will only have a real advantage in shader work; 99% of you here can't even recognize the difference between prebaked and realtime.

Although that's not enough power and shaders to keep consoles from ruining UE4 for everybody.



 


#91 BKSmash

BKSmash

    Spear Guy

  • Members
  • 94 posts
  • Fandom:
    The Greatest Of The Greatest

Posted 09 May 2013 - 04:49 AM

Finally, someone that is on the right side of "history".

 

Now they only need to stop talking down to us Nintendo players, and release COD Ghosts for Wii U.


Edited by BKSmash, 09 May 2013 - 04:51 AM.


#92 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 09 May 2013 - 05:06 AM

Because it keeps getting mentioned over and over: I never thought that anyone was saying the PS4 is weaker. But I realize now that they are saying the Wii U is just as good. I was of the view that the PS4 was multiple times more powerful than the Wii U.

#93 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 09 May 2013 - 05:37 AM

Because it keeps getting mentioned over and over, I never thought that anyone was saying the PS4 is weaker. But I realize that they are saying that the Wii U is just as good. I was of the side that the PS4 was multiple times more powerful than the Wii U.


No dude, it hasn't been mentioned anywhere that the Wii U will be as powerful as the PS4. I personally have said over and over and over again that the PS4 will be around 4x more powerful... which incidentally is practically nothing compared to past generations (Xbox vs Dreamcast was a much larger gap, hell, even Xbox vs PS2).

The Wii U will have everything the PS4 will have, just less of it and at smaller scale. That is a first in the history of game consoles. There has NEVER been a generation where all consoles have the same feature set.

The NES had nowhere near the color capabilities of the Master System, the Genesis could not perform Mode 7 panning, rotating, and zooming like the SNES, and the Xbox had programmable shaders the PS2/Cube couldn't match. The Wii lacked HD and programmable shaders.

This gen is just the same stuff but with more power. It's the weakest jump in history.


People have said the PS4 is going to be a weak jump. But no one said it's going to be only as powerful as the Wii U.

What we HAVE said is that the Wii U is more powerful than the PS360, which certain people try to deny OVER AND OVER AND OVER again.

For example, you said the Wii U has 320 shader units?

The PS3 had 24, and the 360 had 48.

Do you NOW get what we are saying?


 


#94 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 09 May 2013 - 05:59 AM

There are multiple ways of figuring shader units. The Wii U has 320 and the 360 has 240. I'm not sure how to figure the PS3's since it's been a while, but it's probably on a similar level. It has split pixel and vertex shaders as well.

I figure at this point the PS4 is 2-3x more powerful than the Wii U. I don't remember anyone saying that the Wii U was worse than the 360/PS3, but oh well.

Edited by Amnesia, 09 May 2013 - 08:10 AM.


#95 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 09 May 2013 - 06:24 AM

When you figure shader units that way, the Wii U has 80. There are multiple ways of figuring shader units. Or the Wii U has 320 and the 360 has 240. I'm not sure how to figure the PS3's since it's been awhile but it's probably on a similar level. It has split pixel and vertex shaders as well.
I figure at this point the PS4 is 2-3x more powerful than the Wii U. I don't remember anyone saying that Wii U was worse than 360/PS3, but oh well.


I'm talking shader pipelines, not operations per cycle. You said shaders; that implies shader pipelines, as in hardware, not a measurement of operations. I'm simply going off what you said. This is why it's important to use correct terminology, as in the correct definition of words. The number of shader units and shader operations per cycle do not scale linearly across different architectures. Although that number would be a good fit for the units of a standard decent GPU from around 2008-2010.

Although, looking back at your PS4 numbers, it looks like your source confused ops per cycle with shader units.

The PS3 has 136 shader ops per cycle, by the way; RSX was really weak.

If you are going off of what I think you are going off of (forum board posters), you will also notice they aren't sure these are what they are, and that they account for less than 50% of the logic circuits on the GPU.

You will also notice that what they believe are the shader units have WAYYYYYYY more SRAM attached to them than comparable units. This means they get more shader operations per unit than the units they are being compared to. Reminding us that shader operations per cycle do not scale linearly across different architectures.


 


#96 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 09 May 2013 - 06:34 AM

Shader pipelines were the correct way of counting shaders up until the HD 2900, when ATI called 64 vec5 shader pipelines 320 shaders. After that, everyone seemed to adjust. When the 360 was first released, people probably said it had 48 shader pipelines. But now it seems more correct to say 240 shaders (48 vec5 shader pipelines).
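The two counting conventions described here differ only by the vector width of each pipeline, so converting between them is one multiplication. A sketch using the counts given in this thread (forum estimates, not official figures):

```python
# Two conventions for counting "shaders": hardware pipelines vs. ALU lanes.
# Pipeline counts below are the ones used in this thread (forum estimates).
def lanes(pipelines, vector_width):
    """Marketing-style lane count: pipelines x lanes per pipeline."""
    return pipelines * vector_width

print(lanes(64, 5))  # HD 2900: 64 vec5 pipelines, marketed as 320 shaders
print(lanes(48, 5))  # Xbox 360: 48 vec5 pipelines -> 240 by the same convention
print(lanes(80, 4))  # Wii U estimate cited later in the thread: 80 vec4 -> 320
```

The conversion is trivial in one direction, but comparing lane counts across architectures still says nothing about per-lane throughput, which is the point made in the replies below.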

Edited by Amnesia, 09 May 2013 - 08:10 AM.


#97 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 09 May 2013 - 06:45 AM

Shader pipelines was the correct way of figuring shaders up until the HD 2900, when ATI called 64 vec5 shader pipelines 320 shaders. After that, everyone seemed to adjust. When the 360 was first released, people probably said it had 48 shader pipelines. But now it seems more correct to say 240 shaders (48 vec5 shader pipelines). People seem to believe the Wii U to have 80 vec4 shader pipelines.

That's marketing speak, dude. Don't fall into that trap; marketers will always find unscrupulous ways to put up a larger number. It doesn't mean it's correct, and it doesn't mean it won't make you look bad.

For years Apple posted fake clock speeds because they couldn't get people to understand that their short-pipelined processors were faster at computing than higher-clocked processors.

So they put what they called "comparable" clock speeds on their boxes, which were, well, fake.

But that doesn't change the argument at hand, which is that those much beefier units with a lot more RAM aren't going to have the same shader operations per unit per clock as the smaller, weaker ones.

Shader operations per unit per clock don't scale linearly across different architectures. Which, ironically, is why they switched from posting hardware units to ops per cycle in the first place.

Edited by 3Dude, 09 May 2013 - 06:50 AM.


 


#98 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 09 May 2013 - 07:16 AM

Shader pipelines was the correct way of figuring shaders up until the HD 2900, when ATI called 64 vec5 shader pipelines 320 shaders. After that, everyone seemed to adjust. When the 360 was first released, people probably said it had 48 shader pipelines. But now it seems more correct to say 240 shaders (48 vec5 shader pipelines). People seem to believe the Wii U to have 80 vec4 shader pipelines.


That's actually not what happened at all. 64 VLIW5 shaders are 64 shaders, capable of 320 shader operations. That was 64 shaders @ 5 shader ops per cycle.

Each VLIW5 shader in the unified shader architecture introduced with the HD 5xxx series is capable of 10 shader ops per cycle.

Wii U is VLIW5.

#99 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 09 May 2013 - 08:08 AM

That's actually not what happened at all. 64 VLIW5 shaders is 64 shaders, capable of 320 shader operations. That was 64 shaders @ 5 sh ops per cycle.
Each VLIW5 shader in the unified shader architecture introduced with HD5xxx is capable of 10 shader ops per cycle
WiiU is VLIW5


You're right, it seems to be VLIW5. They are saying that 320 shader ops is most likely, which would be 352 GFLOPS.
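The 352 GFLOPS figure follows from 320 ALU lanes at 550 MHz with one fused multiply-add (2 FLOPs) per lane per clock. A quick sketch of that arithmetic, assuming the thread's 320-lane estimate:

```python
# Peak single-precision throughput: lanes x FLOPs-per-lane-per-clock x clock.
# 320 lanes is this thread's estimate for Latte (not an official spec);
# 2 FLOPs/lane/clock assumes one fused multiply-add per cycle.
def peak_gflops(lanes, clock_mhz, flops_per_lane=2):
    return lanes * flops_per_lane * clock_mhz / 1000.0

print(peak_gflops(320, 550))  # -> 352.0
```

This is a theoretical peak; VLIW5 designs rarely keep all five lanes of every unit busy, so sustained throughput is lower in practice.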

#100 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 09 May 2013 - 08:16 AM

You're right, it seems to be VLIW5. They are saying that 320 shader ops is most likely, which would be 352 GFLOPS.


Right, which accounts for less than half of the logic on the chip.

 

 

Does anyone remember what Sony claimed the performance of RSX was?  Anyone?

 



Edited by routerbad, 09 May 2013 - 08:26 AM.




