
Is the Wii U THAT underpowered?


160 replies to this topic

#81 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 31 May 2013 - 06:35 AM

I'm going to go ahead and bring up the argument I used against a friend. We are not at the point of marginal returns. Console makers make their consoles less powerful so they don't lose as much money, and then want you to believe that we are. Until games are rendered at the molecular level, there will always be room for improvement.


Actually, things are getting kind of rough for silicon-on-insulator technology. We are reaching the physical limits of Dennard scaling.

You can't scale to the molecular level with current material technology.

That's why the GHz race ended and we went multicore. And thanks to Amdahl's law, multicore also has a cap. We need more powerful individual cores again....

We need a new material technology to get back on track to the rate of improvement enjoyed while Moore's law was still in effect.

It will come.
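
To put rough numbers on the Amdahl's law point: here is a minimal sketch (Python, with an invented 90%-parallel workload, not a measurement of any real chip) of how the serial part of a task caps what extra cores can buy.

    # Amdahl's law: speedup = 1 / ((1 - p) + p / n)
    # p = fraction of the work that can run in parallel, n = number of cores.
    def amdahl_speedup(parallel_fraction, cores):
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    # Illustrative only: a task that is 90% parallelisable.
    for cores in (1, 2, 4, 8, 16, 1000):
        print(cores, round(amdahl_speedup(0.9, cores), 2))

    # Even with 1000 cores the speedup tops out near 1 / (1 - 0.9) = 10x,
    # which is why faster individual cores still matter.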


 


#82 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 31 May 2013 - 06:48 AM

I'm going to go ahead and bring up the argument I used against a friend. We are not at the point of marginal returns. Console makers make their consoles less powerful so they don't lose as much money, and then want you to believe that we are. Until games are rendered at the molecular level, there will always be room for improvement.

It has nothing to do with losing money. Their silicon budgets for the internals of the consoles are a determining factor for the price point, and they aim to make their money back one way or another. The gaming market expanded by a factor of three last generation thanks to "casual gaming", so once the software library is up to snuff, console makers will make their money back. The exception is Sony; they still haven't made their money back on the PS3.

 

Diminishing returns isn't something that goes away because you don't believe in it. GPU makers put out high-end cards, in some cases for thousands of dollars, that have more shaders and more RAM than mainstream or budget cards, but all they deliver is a few more frames or a higher benchmark score.

 

Similar to the CPU story of "add more GHz", then "add more cores", GPU makers are saying "add more shader cores", then "add more RAM", then "wider memory bus". The problem is that more shader cores and more RAM aren't getting the same performance gains they used to, because as these systems get more complex and powerful, the graphics engines that run on them become more efficient and scalable. The engines can do more with less hardware, so having more hardware is of no real benefit to anyone.
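
To illustrate that with a toy model (all numbers invented, not benchmarks of any real card): if part of every frame is a fixed cost that extra shaders can't touch, each doubling of shader hardware buys a smaller fps gain than the last.

    # Toy frame-time model: a fixed per-frame cost plus shader work split across cores.
    # Every number here is made up purely for illustration.
    def fps(shader_cores, fixed_ms=8.0, shader_work_ms=64.0):
        frame_ms = fixed_ms + shader_work_ms / shader_cores
        return 1000.0 / frame_ms

    for cores in (8, 16, 32, 64):
        print(cores, round(fps(cores), 1))

    # 8 -> ~62 fps, 16 -> ~83 fps, 32 -> ~100 fps, 64 -> ~111 fps:
    # each doubling of the shader hardware adds less than the one before it.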



#83 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 31 May 2013 - 06:50 AM

The problem is that more shader cores and more RAM aren't getting the same performance gains they used to, because as these systems get more complex and powerful, the graphics engines that run on them become more efficient and scalable. The engines can do more with less hardware, so having more hardware is of no real benefit to anyone.


Makes sense. :D

#84 Cloud Windfoot Omega

Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 31 May 2013 - 06:58 AM

Actually, things are getting kind of rough for silicon-on-insulator technology. We are reaching the physical limits of Dennard scaling.

You can't scale to the molecular level with current material technology.

That's why the GHz race ended and we went multicore. And thanks to Amdahl's law, multicore also has a cap. We need more powerful individual cores again....

We need a new material technology to get back on track to the rate of improvement enjoyed while Moore's law was still in effect.

It will come.

Yeah, once they start working with graphene we will start reaching higher. But we would also start reaching the maximum possible; we probably won't start at one atom thick right off the bat with it, though :P
 
 
Also, I would like to note something about DDR3 vs GDDR5. Not many people know this, but when DDR3 first came out, even with its much higher speeds it actually performed about the same as its predecessor (sometimes with an unnoticeable speedup), because the timings went up as well. Sure, the transfer speed went up, but with the looser timings, the ability to move small pieces of data in a timely manner didn't improve; luckily, since the timing changes were only small, they didn't cripple CPUs. CPUs are now designed to work with these slightly higher timings, although it's not perfect.
 
Basically, by increasing the transfer speed of the data, we give up on the timings, also known as latency. Higher speeds are great for moving large single packages of data: sure, the transfer starts more slowly, but since it's a large package, that doesn't really matter, because it finishes that much quicker overall.
 
 
Now in comes GDDR5, so named because it is used almost exclusively in graphics cards. It's great for moving textures around, since those are generally large files, but it comes at the price of high latency, which can cause hiccups that cripple CPUs and game mechanics. Unless the CPU has extensive changes to overcome the high latency, I personally would go with DDR3 at the moment. Sony just wants some numbers to flash in people's faces.
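
A rough way to picture that trade-off (the latency and bandwidth figures below are invented purely for illustration, not real DDR3/GDDR5 specs): transfer time is roughly access latency plus size divided by bandwidth, so the high-bandwidth, high-latency option wins on big texture-sized copies and loses on lots of tiny CPU-style accesses.

    # Toy model: time = fixed access latency + bytes / bandwidth.
    def transfer_us(size_bytes, latency_ns, bandwidth_gb_s):
        # 1 GB/s moves 1000 bytes per microsecond.
        return latency_ns / 1000.0 + size_bytes / (bandwidth_gb_s * 1000.0)

    ddr3_like  = dict(latency_ns=50,  bandwidth_gb_s=25)    # lower latency, lower bandwidth (hypothetical)
    gddr5_like = dict(latency_ns=150, bandwidth_gb_s=170)   # higher latency, higher bandwidth (hypothetical)

    for size in (64, 4 * 1024 * 1024):   # a single cache line vs a 4 MB texture
        print(size, round(transfer_us(size, **ddr3_like), 2), round(transfer_us(size, **gddr5_like), 2))

    # The 64-byte access is dominated by latency (the DDR3-like numbers win);
    # the 4 MB transfer is dominated by bandwidth (the GDDR5-like numbers win).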

It has nothing to do with losing money. Their silicon budgets for the internals of the consoles are a determining factor for the price point, and they aim to make their money back one way or another. The gaming market expanded by a factor of three last generation thanks to "casual gaming", so once the software library is up to snuff, console makers will make their money back. The exception is Sony; they still haven't made their money back on the PS3.

Diminishing returns isn't something that goes away because you don't believe in it. GPU makers put out high-end cards, in some cases for thousands of dollars, that have more shaders and more RAM than mainstream or budget cards, but all they deliver is a few more frames or a higher benchmark score.

Similar to the CPU story of "add more GHz", then "add more cores", GPU makers are saying "add more shader cores", then "add more RAM", then "wider memory bus". The problem is that more shader cores and more RAM aren't getting the same performance gains they used to, because as these systems get more complex and powerful, the graphics engines that run on them become more efficient and scalable. The engines can do more with less hardware, so having more hardware is of no real benefit to anyone.

Every generation would need to scale by a larger factor than the last. This generation would need about 100 times the power of the last one to show an eye-popping difference, then the next would need 10 times that to top it, meaning about 1,000 times its predecessor. The next few generations will be funny to watch, thanks to the internet and its infinite "wisdom".


Edited by Cloud Windfoot Omega, 31 May 2013 - 07:04 AM.


#85 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 31 May 2013 - 10:25 AM

The GPU will not be a 7970, for starters; at the very most we're looking at a 7770. Sony is big on putting out numbers that tend not to be true in real-life situations, so I wouldn't even count on that at this point. They claimed RSX had 1.8 TFLOPS as well.

You are right in assuming that they'll run into hardware bottlenecks long before memory bottlenecks; where those hardware bottlenecks are is anyone's guess at this point.

The reason MS went with a lower spec, and even Nintendo went with a lower spec, is that much of that additional power would be wasted. The PS4 having more power will not show on screen, due to the diminishing returns we're seeing in the GPU space, which is why GPU makers just add more shaders linearly, along with more RAM, for their more powerful cards. Those cards are MUCH more power hungry and run MUCH hotter, yet produce maybe a few more frames than a midrange or budget card. The benefit of too much power in a closed system like this is limited. You have to find a balance that works.


Exactly, Microsoft is an OS company, and that shows through brilliantly on the XBONE.

Nintendo is a game company; OSes and online infrastructure aren't in their natural wheelhouse, though I'm sure they're learning very quickly to make them their wheelhouse.


I hope they work quickly. Netflix no longer works on my Wii U :(

#86 Rockodoodle

Rockodoodle

    Chain Chomp

  • Members
  • 677 posts

Posted 31 May 2013 - 10:42 AM

Excellent post. It sounds like the old maxim in amateur astronomy, on steroids. Most scopes go up in increments of 2 inches; if you have an 8-inch scope, they say to skip 10 and go straight to 12 to see an improvement. I think we'll have to wait five more years to really see a difference from the last generation.

 

 

 

Yeah, once they start working with graphene we will start reaching higher. But we would also start reaching the maximum possible; we probably won't start at one atom thick right off the bat with it, though :P
 
 
Also, I would like to note something about DDR3 vs GDDR5. Not many people know this, but when DDR3 first came out, even with its much higher speeds it actually performed about the same as its predecessor (sometimes with an unnoticeable speedup), because the timings went up as well. Sure, the transfer speed went up, but with the looser timings, the ability to move small pieces of data in a timely manner didn't improve; luckily, since the timing changes were only small, they didn't cripple CPUs. CPUs are now designed to work with these slightly higher timings, although it's not perfect.
 
Basically, by increasing the transfer speed of the data, we give up on the timings, also known as latency. Higher speeds are great for moving large single packages of data: sure, the transfer starts more slowly, but since it's a large package, that doesn't really matter, because it finishes that much quicker overall.
 
 
Now in comes GDDR5, so named because it is used almost exclusively in graphics cards. It's great for moving textures around, since those are generally large files, but it comes at the price of high latency, which can cause hiccups that cripple CPUs and game mechanics. Unless the CPU has extensive changes to overcome the high latency, I personally would go with DDR3 at the moment. Sony just wants some numbers to flash in people's faces.

Every generation would need to scale by a larger factor than the last. This generation would need about 100 times the power of the last one to show an eye-popping difference, then the next would need 10 times that to top it, meaning about 1,000 times its predecessor. The next few generations will be funny to watch, thanks to the internet and its infinite "wisdom".



#87 Nintyfan86

Nintyfan86

    Bob-omb

  • Members
  • 262 posts

Posted 31 May 2013 - 06:32 PM

The GPU will not be a 7970, for starters; at the very most we're looking at a 7770. Sony is big on putting out numbers that tend not to be true in real-life situations, so I wouldn't even count on that at this point. They claimed RSX had 1.8 TFLOPS as well.

You are right in assuming that they'll run into hardware bottlenecks long before memory bottlenecks; where those hardware bottlenecks are is anyone's guess at this point.

The reason MS went with a lower spec, and even Nintendo went with a lower spec, is that much of that additional power would be wasted. The PS4 having more power will not show on screen, due to the diminishing returns we're seeing in the GPU space, which is why GPU makers just add more shaders linearly, along with more RAM, for their more powerful cards. Those cards are MUCH more power hungry and run MUCH hotter, yet produce maybe a few more frames than a midrange or budget card. The benefit of too much power in a closed system like this is limited. You have to find a balance that works.

 

Exactly, Microsoft is an OS company, and that shows through brilliantly on the XBONE.

 

Nintendo is a game company; OSes and online infrastructure aren't in their natural wheelhouse, though I'm sure they're learning very quickly to make them their wheelhouse.

 

That is what perplexed me about the PS4 and GAF proclaiming this 7970M theory. Of course, the rumors of both machines using 7770 variants were out there for a long time before this, which had me believing that the Wii U would be a no-brainer with regard to multiplats. However, GAF, as you suggested earlier, appears to be something like GameFAQs for wannabe tech heads, with a microscopic percentage of actually knowledgeable people.

 

 

Your GPU comment appears accurate, especially given AMD's roadmap and Nvidia's releases. We are not really seeing quantum leaps anymore, only newer, slightly more efficient ways to do the same thing slightly faster. The power draw may drop slightly on some cards yet increase on others; it depends on whether it is a rebrand or an actual reconfiguration. In either event, it is as though a new GPU in CrossFire or SLI is good for three years before there is even a reason to upgrade.

I'm going to go ahead and bring up the argument I used against a friend. We are not at the point of marginal returns. Console makers make their consoles less powerful so they don't lose as much money, and then want you to believe that we are. Until games are rendered at the molecular level, there will always be room for improvement.

But, the TFLOPS and cores! Honestly, the scary thing about the path of linearly increasing graphics will be production costs, and an inevitable situation where you get a few clones of three genres. To EA's credit, they tried some new things with Mirror's Edge, Dead Space, and a few others in 2009, but that was probably not the best time to do it.

 

If anything, those iOS games and Nintendo's indie initiative are probably a very good thing. Looking at tech as a prop is, quite possibly, the best thing the industry can do. After all, when you watch a movie, do you go on a forum and argue about the technicalities of one movie's special effects versus another's? Games and systems should, hopefully, reach that same standard. Art.

 

 

 

 

First question: the OS. Why was it in such bad shape? Honestly, Nintendo sucks at OS creation. The Wii was very basic, and the Wii U was an upgrade of its design. I think they should use Linux, but they haven't listened yet.

As for the developer tools, Nintendo had no excuse. I don't know why they didn't give proper help to the developers. They learned from that mistake, but it was too late.

Hopefully that helps.

After seeing routerbad's comments, I am not sure MS went with a lower spec. If one machine is a 7770 with the same CPU and GDDR5 memory, versus embedded RAM, a similar GPU/CPU setup, and DDR3 on the other, I am not so sure how the PS4 has a real advantage. It would be like giving a budget card 8 GB of RAM when it could not possibly use it. I do agree about diminishing returns. Sitting on a couch, 4-10 feet away, with a controller, will limit the need for a constant 60 fps.

 

I can see the potential with the OS. I really want to see the whole thing get as fast as it is when going from in game to the browser. I think this is possible. Do you? 



#88 Desert Punk

Desert Punk

    Chain Chomp

  • Members
  • 656 posts

Posted 31 May 2013 - 11:17 PM

If you look at some of the analysis going on, you will see many are now viewing the PS4 as even more powerful relative to the Xbox One than it first appeared. Some are predicting that two of the Jaguar cores are locked for the operating system and Kinect on the Xbox One, but only one is locked on the PS4. The Xbox One needs far more memory for its operating systems. Also, the Xbox One has half the ROPs of the PS4.

 

For those criticising the PS4's GDDR memory, you have to understand that the PS4 and Xbox One are GPU-centric designs, and GDDR is ideal for this. The AMD Jaguar cores are not super powerful; combined, at 1.6 GHz, they are only about 40-70,000 DMIPS. If it's as low as 40,000 and the Xbox One reserves 2 cores, the Xbox One only has about 30,000 for games. The Cell processor was up to about 30,000 DMIPS, but with one SPU used for the operating system, that means about 24,000 for games. Clearly the Xbox One is still massively more powerful than the PS3, as the Cell is hard to fully utilise and the Xbox One has huge assistance from its GPU, but it's still surprisingly close.
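
Just to show the arithmetic behind that 30,000 figure (the 40,000 DMIPS starting point is only a rough estimate, as above):

    # Proportional estimate: reserve 2 of 8 Jaguar cores for the OS and Kinect.
    # The 40,000 DMIPS total is a rough guess, not an official spec.
    total_dmips = 40000
    total_cores = 8
    reserved_cores = 2
    print(total_dmips * (total_cores - reserved_cores) / total_cores)   # 30000.0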

 

I think the main reason the Xbox One's DDR is being criticised is the insignificance of the CPUs in these consoles.

 

There are quite a few articles about Microsoft's failure with the design of the Xbox One. Ages ago there was a document on the web saying the next Xbox would still have a PowerPC chip in addition to an x86 CPU. It was expected that the PowerPC chip would run 360 games and also be used to do Kinect processing in Xbox One mode. It's almost like Microsoft had planned to offer backwards compatibility but cost-cut the PowerPC part out.



#89 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 01 June 2013 - 02:02 AM

Remember the PS4 presentation? They did say that the PS4 has nearly a 2 TFLOP theoretical peak... they did say that.

 

In 2005, Sony said the PS3 is a 2 TFLOP machine (stronger than the PS4)... so yes, don't throw big words around! The following link is from 2005: http://uk.ign.com/ar...s3-tech-specs-2

 

Let's see how they perform face to face, because the Wii U is stronger than the PS3, after all.


Edited by Plutonas, 01 June 2013 - 02:04 AM.


#90 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 01 June 2013 - 06:27 AM

If you look at some of the analysis going on, you will see many are now viewing the PS4 as even more powerful relative to the Xbox One than it first appeared. Some are predicting that two of the Jaguar cores are locked for the operating system and Kinect on the Xbox One, but only one is locked on the PS4. The Xbox One needs far more memory for its operating systems. Also, the Xbox One has half the ROPs of the PS4.

For those criticising the PS4's GDDR memory, you have to understand that the PS4 and Xbox One are GPU-centric designs, and GDDR is ideal for this. The AMD Jaguar cores are not super powerful; combined, at 1.6 GHz, they are only about 40-70,000 DMIPS. If it's as low as 40,000 and the Xbox One reserves 2 cores, the Xbox One only has about 30,000 for games. The Cell processor was up to about 30,000 DMIPS, but with one SPU used for the operating system, that means about 24,000 for games. Clearly the Xbox One is still massively more powerful than the PS3, as the Cell is hard to fully utilise and the Xbox One has huge assistance from its GPU, but it's still surprisingly close.

I think the main reason the Xbox One's DDR is being criticised is the insignificance of the CPUs in these consoles.

There are quite a few articles about Microsoft's failure with the design of the Xbox One. Ages ago there was a document on the web saying the next Xbox would still have a PowerPC chip in addition to an x86 CPU. It was expected that the PowerPC chip would run 360 games and also be used to do Kinect processing in Xbox One mode. It's almost like Microsoft had planned to offer backwards compatibility but cost-cut the PowerPC part out.


Dude, stop. Stop throwing fake numbers around. This was already destroyed in the other thread you abandoned. Hell, I posted IBM's official Cell documentation and programming instructions as a downloadable attachment to that thread.

The PPE gets 0.59 DMIPS per MHz; there is no way in hell you are getting 30,000 DMIPS out of that. The SPEs won't help very much with Dhrystones, and sure as hell not 6,000 per SPU. Do you understand the difference between Dhrystones and Whetstones?

PS3 Cell @ 3.2 GHz: 1879.630 DMIPS (Dhrystone v2.1)

That's it, guy. 1879 DMIPS; no matter how many times you run the benchmark, you are not going to get up to 30,000.

You do not get 6,000 DMIPS per SPU. Your own made-up math doesn't even add up: you said taking one SPU away drops the total by 6,000, so you're claiming each SPU gets 6,000 DMIPS. With seven SPUs that would be 42,000 DMIPS, not 30,000, and that's not even including the PPE, you know, the ACTUAL PROCESSOR. Stop. Holy crap, stop.

Just. Stop. Stop making up random numbers.
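
Do the arithmetic yourself; the per-MHz figure and the benchmark line up (simple multiplication, nothing more):

    # 0.59 DMIPS per MHz on the PPE, at 3,200 MHz:
    print(0.59 * 3200)   # 1888.0 -- right in line with the ~1879 DMIPS benchmark quoted above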

Edited by 3Dude, 01 June 2013 - 06:31 AM.


 


#91 Desert Punk

Desert Punk

    Chain Chomp

  • Members
  • 656 posts

Posted 02 June 2013 - 02:59 AM

The reason I don't respond to many of your threads is that you simply have no connection with reality; you are the nutter on the bus that most people avoid.

 

However, just to make this super clear, I will try to make this easy for even someone like yourself to follow:

 

First of all, integer performance deals with 0s and 1s only; it's much easier for a CPU to deal with integer work than floating point, so it's something the SPEs do to a higher level than floating point. This is nothing hard to understand. A nice summary of integer performance for various CPUs is here:

 

http://en.wikipedia....ions_per_second

 

For the PS3 they state:

 

PS3 Cell BE (PPE only): 10,240 MIPS at 3.2 GHz (not including the seven SPEs, six of which are available for games)

 

Each SPE has a theoretical maximum of 25.4 GFLOPS, and an SPE can do two integer operations for every single-precision floating-point operation. However, the real-world figure given for the Cell processor in total is around 30,000 for integer operations. You only have to look below to see how much of the Cell is dedicated to SPEs to see it's not going to be a low figure. Look at the size of it. One is mapped out to enable high yields. Considering just a small area of the Cell is generating over 10,000 DMIPS, who in their right mind wouldn't believe 30,000 DMIPS in total as a bare minimum? This is a figure given for real-world performance, I believe. I'm sure the theoretical integer performance would be much higher. The reality with the Cell, though, is that there is a large divide between theoretical and real-world figures, but I have not used any theoretical figures. When the Cell is fully optimised, even with the PS3's weaker GPU it can surpass the 360 by some margin. How would it do this if the Cell wasn't significantly more powerful, when its GPU and memory configuration are weaker? Clearly, though, the SPEs can be used for either floating-point or integer calculations, and most of the time floating point would be used to assist the PS3 graphically. Integer calculations may be used for logic and sound, and of course the PS3's background operating system, which is mainly integer.

 

Many games don't make use of the SPEs at all, so SPE use is mainly integer work for the operating system. Many just use them for sound processing, which is often integer work, in addition to the operating system. I remember reading that The Orange Box made no use of the SPEs at all, but then the PC engine was a bit dated anyway and ran well enough on the two threads of the PPE.

 

[Image: Cell processor die shot]

 


 

Xbox360 IBM "Xenon" (Triple core) 19,200 MIPS at 3.2 GHz

 

(that's full Xenon integer performance)

 

AMD FX-8150 (Eight core) 108,890 MIPS at 3.6 GHz

 

These are not Jaguar cores but earlier cores; Jaguar performance is in the range of 10-40% better.

 

So this would be about 48,000 DMIPS at 1.6 GHz, and then it depends how much you add for Jaguar performance. It's at least 10% more, I believe, so you end up with a figure of a minimum of 50,000 plus. Obviously there are some sources with slight variations on these figures. The figures I gave before actually seem a bit conservative, but they are in the ballpark.

 

Wii U performance is easy: it's already been discovered that core 0 operates only for Wii mode, and all cores are the same as the original GameCube and Wii CPUs but with more cache (needed because of the low memory bandwidth). So you have a figure of about 8,400 if you use the PowerPC 750 from that list, which is basically the architecture of Gekko, Broadway and Espresso. This figure may be conservative, but it will be well within 10,000 DMIPS. I think 8,800 was given on NeoGAF by someone.

 

Can you please stop writing your utter tripe about Wii U CPU performance; it has absolutely no connection with reality. The AMD Jaguar's CPU performance may not be amazing, but properly written multi-threaded games optimised for the CPU, plus the fact that the GPUs in the PS4/Xbone massively assist the CPU anyway, create a serious jump in performance over the Wii U/360 and PS3. The Wii U's CPU is inferior to the 360's and PS3's, and only because of the Wii U's GPU does it offer comparable performance to those consoles, although currently the Wii U is weaker on average most of the time.

 

Again, what is the point of your posts? They bear absolutely no relationship to how the Wii U is performing. There's no point arguing with me until the Wii U starts performing to a higher level than it currently is. Even with its more advanced GPU assisting the CPU, the Wii U is struggling not to drop frames. The idea that the Wii U CPU is competitive with the PS4/Xbone is utterly ridiculous when it's struggling against near-end-of-life consoles.


Edited by Desert Punk, 02 June 2013 - 03:11 AM.


#92 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 02 June 2013 - 03:38 AM

MIPS and DMIPS are not the same thing. Dhrystones are not the number of integer operations performed. Dhrystone measures integer performance, but the Dhrystone unit of measurement is how many times the Dhrystone routine is completed, not how many integer operations are completed. Holy crap, educate yourself. It takes many more than a single integer instruction to complete a Dhrystone, and different architectures and different compilers take more or fewer integer instructions to complete the routine. That's why you have to actually run the benchmark instead of making up numbers.
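
In other words, a DMIPS score comes from how many complete Dhrystone iterations per second a chip manages, normalised against the VAX 11/780 reference machine (1,757 Dhrystones per second = 1 DMIPS), not from counting raw integer instructions. A minimal sketch of that conversion, with a made-up run:

    # DMIPS = (Dhrystone iterations per second) / 1757,
    # where 1757/s is the score of the VAX 11/780 reference machine.
    VAX_DHRYSTONES_PER_SEC = 1757.0

    def dmips(iterations, elapsed_seconds):
        return (iterations / elapsed_seconds) / VAX_DHRYSTONES_PER_SEC

    # Hypothetical run: 50 million complete Dhrystone loops in 10 seconds.
    print(round(dmips(50_000_000, 10.0)))   # ~2846 DMIPS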


Stop posting your nonsense. Posting a picture of Cell and then making up numbers proves nothing.

I posted the actual IBM documentation of Cell, with its actual performance metrics, and it says you are a moron.

You are wrong; you have provided nothing to support your blathering, while I have the guide to Cell in my hands.

Oh, and your idiotic made-up Dhrystone figures for the Xbone/PS4 are 5x higher than an actual Kabini Jaguar (8 cores at 2.0 GHz). Probably because you had no idea what Dhrystones actually were.

Oh HA HA HA HA HA HA HA HA!!!!!! HA HA HA!!!!! AHHHH!!!!! HA HA HA HA HA!!!!!

YOU THINK JAGUAR IS THE SAME LINE AS BULLDOZER!!!!!!

AH HA HA HA HA HA HA !!!!!!!! HA HA!!!! COUGH!!! HA HA HA HA!!!!!!

No, you tool, Piledriver came after Bulldozer. Jaguar isn't the successor to AMD's attempt at the biggest, baddest desktop processor. It's the successor to Bobcat, AMD's netbook/mobile/laptop architecture. It's about 15% more powerful than Bobcat.

http://en.wikipedia.org/wiki/Bobcat_(microarchitecture)
THIS is the page you should be linking to, you crazy fanboy you.

[Image: AMD Jaguar architecture overview]

All of the improvements made to Jaguar only add up to about a 15% performance increase over Bobcat. Well, assuming you also raise the clock 10% over 1.7 GHz, which... didn't happen.

But hey, AMD are delusional fanboys, right? Of COURSE Jaguar blows Bulldozer out of the water... *snicker*.

You are so done.

So, has reality hit yet? What are your thoughts on the PS4/Xbone CPU being only a tiny fraction as powerful as Piledriver... because it's just the next processor in the Bobcat line. What are you going to make up to make this go away?


Edited by 3Dude, 02 June 2013 - 04:56 AM.


 


#93 gronik

gronik

    Spiked Goomba

  • Members
  • 19 posts
  • Fandom:
Simon the Sorcerer, Tekken, Quintet, Zelda

Posted 02 June 2013 - 04:52 AM

MIPS and DMIPS are not the same thing. Stop posting your nonsense. Posting a picture of Cell and then making up numbers proves nothing.

I posted the actual IBM documentation of Cell, with its actual performance metrics, and it says you are a moron.

You are wrong; you have provided nothing to support your blathering, while I have the guide to Cell in my hands.

Oh, and your idiotic made-up Dhrystone figures for the Xbone/PS4 are 5x higher than an actual Kabini Jaguar (8 cores at 2.0 GHz).

I really can't see the logic in discussing the performance of consoles at the beginning of their lives; I mean, the PS4 and Xbox One haven't even been officially released yet!
I think I remember when Sony originally unleashed details on the PS3, stating all kinds of numbers which didn't quite make it to release.

But what did we get from the last generation of consoles?
Did we get nothing but rubbish games and wasted memories? Pfft, only if you went looking for them!
PS3, Xbox 360 and Wii all had games which proved their respective worth.
They all had games which stood out and showed that the consoles could perform above our expectations (or at least mine!).
There is no doubt in my mind that the Wii U, PS4 and Xbox One will all deliver unique content which proves that they perform well at what they do.

PC performance will always be potentially higher than consoles', but there is no guarantee that your machine will perform above your expectations. You know what you're getting into, and it will probably only ever perform to your expectations, or eventually worse, as your hardware becomes less mainstream and developers begin to code for newer hardware.


Edited by gronik, 02 June 2013 - 04:53 AM.


#94 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 02 June 2013 - 05:08 AM

I really can't see the logic in discussing the performance of consoles at the beginning of their lives; I mean, the PS4 and Xbox One haven't even been officially released yet!
I think I remember when Sony originally unleashed details on the PS3, stating all kinds of numbers which didn't quite make it to release.

But what did we get from the last generation of consoles?
Did we get nothing but rubbish games and wasted memories? Pfft, only if you went looking for them!
PS3, Xbox 360 and Wii all had games which proved their respective worth.
They all had games which stood out and showed that the consoles could perform above our expectations (or at least mine!).
There is no doubt in my mind that the Wii U, PS4 and Xbox One will all deliver unique content which proves that they perform well at what they do.

PC performance will always be potentially higher than consoles', but there is no guarantee that your machine will perform above your expectations. You know what you're getting into, and it will probably only ever perform to your expectations, or eventually worse, as your hardware becomes less mainstream and developers begin to code for newer hardware.


I suggest you read past the first sentence or two if you intend to post a response. And you REALLY don't want to talk about the merits of this steaming pile of homogeneous crap that passed for a generation, which I had to suffer through for the past 7 years. But that's a different subject.

Anyway, you are right: power has nothing to do with game quality or fun. But that's not what this thread is about. It's about power.

And if you continued reading, you'd find out Punk mistakenly thought the CPU in the PS4/Xbone was an 8-core Piledriver, AMD's most powerful line of CPUs, meant to compete with i7s.

When in reality the CPU is a Jaguar, AMD's netbook CPU line, which gets spanked by low-end i5s and even some i3s.

You don't need the system to be released to understand the implications of that much of a difference in power.


 


#95 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 02 June 2013 - 08:54 AM

The reason I don't respond to many of your threads is that you simply have no connection with reality; you are the nutter on the bus that most people avoid.

However, just to make this super clear, I will try to make this easy for even someone like yourself to follow:

First of all, integer performance deals with 0s and 1s only; it's much easier for a CPU to deal with integer work than floating point, so it's something the SPEs do to a higher level than floating point. This is nothing hard to understand. A nice summary of integer performance for various CPUs is here:

http://en.wikipedia....ions_per_second

For the PS3 they state:

PS3 Cell BE (PPE only): 10,240 MIPS at 3.2 GHz (not including the seven SPEs, six of which are available for games)

Each SPE has a theoretical maximum of 25.4 GFLOPS, and an SPE can do two integer operations for every single-precision floating-point operation. However, the real-world figure given for the Cell processor in total is around 30,000 for integer operations. You only have to look below to see how much of the Cell is dedicated to SPEs to see it's not going to be a low figure. Look at the size of it. One is mapped out to enable high yields. Considering just a small area of the Cell is generating over 10,000 DMIPS, who in their right mind wouldn't believe 30,000 DMIPS in total as a bare minimum? This is a figure given for real-world performance, I believe. I'm sure the theoretical integer performance would be much higher. The reality with the Cell, though, is that there is a large divide between theoretical and real-world figures, but I have not used any theoretical figures. When the Cell is fully optimised, even with the PS3's weaker GPU it can surpass the 360 by some margin. How would it do this if the Cell wasn't significantly more powerful, when its GPU and memory configuration are weaker? Clearly, though, the SPEs can be used for either floating-point or integer calculations, and most of the time floating point would be used to assist the PS3 graphically. Integer calculations may be used for logic and sound, and of course the PS3's background operating system, which is mainly integer.

Many games don't make use of the SPEs at all, so SPE use is mainly integer work for the operating system. Many just use them for sound processing, which is often integer work, in addition to the operating system. I remember reading that The Orange Box made no use of the SPEs at all, but then the PC engine was a bit dated anyway and ran well enough on the two threads of the PPE.

[Image: Cell processor die shot]

Xbox360 IBM "Xenon" (Triple core) 19,200 MIPS at 3.2 GHz

(that's full Xenon integer performance)

AMD FX-8150 (Eight core) 108,890 MIPS at 3.6 GHz

These are not Jaguar cores but earlier cores; Jaguar performance is in the range of 10-40% better.

So this would be about 48,000 DMIPS at 1.6 GHz, and then it depends how much you add for Jaguar performance. It's at least 10% more, I believe, so you end up with a figure of a minimum of 50,000 plus. Obviously there are some sources with slight variations on these figures. The figures I gave before actually seem a bit conservative, but they are in the ballpark.

Wii U performance is easy: it's already been discovered that core 0 operates only for Wii mode, and all cores are the same as the original GameCube and Wii CPUs but with more cache (needed because of the low memory bandwidth). So you have a figure of about 8,400 if you use the PowerPC 750 from that list, which is basically the architecture of Gekko, Broadway and Espresso. This figure may be conservative, but it will be well within 10,000 DMIPS. I think 8,800 was given on NeoGAF by someone.

Can you please stop writing your utter tripe about Wii U CPU performance; it has absolutely no connection with reality. The AMD Jaguar's CPU performance may not be amazing, but properly written multi-threaded games optimised for the CPU, plus the fact that the GPUs in the PS4/Xbone massively assist the CPU anyway, create a serious jump in performance over the Wii U/360 and PS3. The Wii U's CPU is inferior to the 360's and PS3's, and only because of the Wii U's GPU does it offer comparable performance to those consoles, although currently the Wii U is weaker on average most of the time.

Again, what is the point of your posts? They bear absolutely no relationship to how the Wii U is performing. There's no point arguing with me until the Wii U starts performing to a higher level than it currently is. Even with its more advanced GPU assisting the CPU, the Wii U is struggling not to drop frames. The idea that the Wii U CPU is competitive with the PS4/Xbone is utterly ridiculous when it's struggling against near-end-of-life consoles.


You do realize you are using Wikipedia, a source that is so unreliable it can't be cited at any university, while he used official documentation, right? I'm just saying.

#96 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 02 June 2013 - 09:07 AM

You do realize you are using Wikipedia, a source that is so unreliable it can't be cited at any university, while he used official documentation, right? I'm just saying.


That doesn't even matter. He can't even use Wikipedia correctly, because his knowledge base is so lacking.

He thinks raw integer performance = DMIPS, not understanding that it takes thousands of integer operations to complete a single Dhrystone. It's a synthetic benchmark that simulates a real-world program. He thinks one integer operation = one DMIP... hence his wildly inflated, made-up numbers.

But more hilarious than that, he thinks Jaguar (the PS4/Xbone's CPU) is AMD's Piledriver, the successor to Bulldozer, AMD's highest-end CPU line meant to compete with the i7, which is the processor he linked to on Wikipedia. When in actuality, Jaguar is the successor to BOBCAT, AMD's low-end/low-power processor line used for netbooks and laptops.


 


#97 Brando67854321

Brando67854321

    Cheep-Cheep

  • Banned
  • 126 posts

Posted 02 June 2013 - 09:35 AM

No one's going to port Unreal Engine 4 to it, so yes, it is underpowered.



#98 NintendoReport

NintendoReport

    NintendoChitChat

  • Moderators
  • 5,907 posts
  • NNID:eddyray
  • Fandom:
    Nintendo Directs and Video Presentations

Posted 02 June 2013 - 09:45 AM

No one's going to port Unreal Engine 4 to it, so yes, it is underpowered.

 

LOL, yes and 2+2 = 3



#99 Brando67854321

Brando67854321

    Cheep-Cheep

  • Banned
  • 126 posts

Posted 02 June 2013 - 09:46 AM

LOL, yes and 2+2 = 3

Tell me, WHO is going to port that to the Wii U? HUH?



#100 Brando67854321

Brando67854321

    Cheep-Cheep

  • Banned
  • 126 posts

Posted 02 June 2013 - 09:48 AM

LOL, yes and 2+2 = 3

Unreal Engine 3 is too old, and the Xbox One and PS4 will be the only 8th-gen consoles to support UE4.





