
I think the PS4 specs are a lie


152 replies to this topic

#81 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 10 March 2013 - 04:01 PM

I agree... the SRAM is registered as MEM1 and the 2 GB are registered as MEM2. The system runs through MEM1, right? And pulls from MEM2 if it needs to.

 

The Xbox 720 is the same thing!!! 32 MB of SRAM and 8 GB of DDR3... they can't go much past DDR3-1800 (the Wii U is 1600?).


Edited by Plutonas, 10 March 2013 - 04:03 PM.


#82 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 10 March 2013 - 04:02 PM


Plutonas, on 09 Mar 2013 - 09:04, said:Another problem pops up reading the PS4 specs...
Everyone says: wow, 8 cores, they run like cheetahs in the jungle, with 8 GB of GDDR5... lol.
But there are basic differences between DDR3 and GDDR5: GDDR5 cannot be used (well) for CPU tasks at all!!! DDR3 is WAY better for that, because GDDR5 is designed for GPUs, where specific tasks run much faster, while a CPU must always be tightly linked to its RAM for execution, something GDDR5 is not good at... hmmm.
So yes, when people talk about bottlenecks, it's not only the 8 cores but also the GDDR5. Maybe that's why MS went with DDR3 instead.


--------------------------------

This is because GDDR5 has very high latency compared to DDR3. It's typically mitigated by the fact that GDDR5 can be clocked much, much higher.

Hmmm... Do we know what the GDDR5 is clocked at yet?

I've been trying to point this out; no one seems to understand what GDDR5 is and why you don't use that crap for general-purpose processing, which constantly accesses RAM to drop and fetch minuscule amounts of data. The latency penalty you get with GDDR5 is staggering compared to DDR3, and if it were so good it would be available on a DIMM to drop into your PC.

 

Sony went the easy route: rather than spend the money and R&D time on an efficient memory hierarchy, they just put in the fastest unified RAM they could find. Sure, the bandwidth is going to be great, but at the cost of ridiculous latency for everything except GPU tasks.



#83 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 10 March 2013 - 04:06 PM

That's why they gave the PS4's GPU so much compute... they say 1.8 TFLOPS in theory, but if the GPU spends half its power on CPU tasks, it's going to be more like 1 TFLOP of raw power, maybe less.

 

Maybe that's why Sony gave it 8 GB of the stuff: 2-4 GB for the GPU and the rest for easy background tasks... I can't explain it otherwise. Remember the demo with the falling objects on PS4? They said the PS4 uses 0% CPU power there...

 

Maybe because it has none.
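The 1.8 TFLOPS number is a theoretical peak that falls straight out of counting shader ALUs, so any ALUs diverted to compute jobs come out of it, which is the point above. A quick sketch of the arithmetic, using the reveal-era figures of 18 GCN compute units at 800 MHz (reported numbers, treated here as assumptions):

```python
# Peak FLOPS = compute units * ALUs per CU * ops per ALU * clock.
# The CU count and clock are the figures reported at the PS4 reveal,
# taken here as assumptions, not confirmed silicon.

CUS = 18            # reported GCN compute units
ALUS_PER_CU = 64    # shader ALUs per GCN compute unit
CLOCK_HZ = 800e6    # reported GPU clock
OPS_PER_ALU = 2     # a fused multiply-add counts as 2 ops per cycle

tflops = CUS * ALUS_PER_CU * OPS_PER_ALU * CLOCK_HZ / 1e12
print(f"theoretical peak: {tflops:.2f} TFLOPS")   # -> 1.84
```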


Edited by Plutonas, 10 March 2013 - 04:07 PM.


#84 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 10 March 2013 - 04:08 PM


routerbad, on 10 Mar 2013 - 10:16, said:I've been trying to point this out; no one seems to understand what GDDR5 is and why you don't use that crap for general-purpose processing, which constantly accesses RAM to drop and fetch minuscule amounts of data. The latency penalty you get with GDDR5 is staggering compared to DDR3, and if it were so good it would be available on a DIMM to drop into your PC.
Sony went the easy route: rather than spend the money and R&D time on an efficient memory hierarchy, they just put in the fastest unified RAM they could find. Sure, the bandwidth is going to be great, but at the cost of ridiculous latency for everything except GPU tasks.



Perhaps... I mean, they definitely picked GDDR5 for the high bandwidth, so as to avoid pricey eDRAM pools, but we need to find out whether the RAM is clocked high enough to offset GDDR5's latency, and if so by what factor, before coming to conclusions.

The info should present itself before too long.
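On the "by what factor" question: cycle counts only compare once they are converted to nanoseconds, because each memory type counts CAS latency against its own clock. A minimal sketch with made-up timings (the real PS4 timings were not public at this point):

```python
# Absolute latency = cycles / clock. Both CL values below are
# illustrative assumptions, not real DDR3 or PS4 GDDR5 timings.

def cas_ns(cl_cycles, clock_mhz):
    """CAS latency in nanoseconds at a given clock."""
    return cl_cycles / clock_mhz * 1e3

ddr3  = cas_ns(11, 800)    # hypothetical DDR3-1600, CL11 vs 800 MHz I/O clock
gddr5 = cas_ns(15, 1375)   # hypothetical GDDR5, CL15 vs 1375 MHz command clock

print(f"DDR3: {ddr3:.1f} ns  GDDR5: {gddr5:.1f} ns  "
      f"factor: {gddr5 / ddr3:.2f}x")
```

With these invented numbers the higher clock more than offsets the extra cycles; with a looser CL it flips the other way, which is exactly the unknown being flagged here.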



 


#85 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 10 March 2013 - 04:10 PM

Interesting... if you type "gddr5 175gb/s" into Google and press enter...

We get results for AMD GPU RAM: 2 GB of GDDR5 at 1250 MHz giving 175 GB/s, on a card rated at up to 2.2 TFLOPS overall... so it's going to be around that.

 

http://www.legitrevi...article/1488/1/

 

 

So I believe the 175 GB/s figure can give us the exact RAM clocks; see the sketch below.
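That back-calculation works because peak bandwidth is just transfers per second times bus width. A sketch, assuming a 256-bit bus (the common guess at the time, not a confirmed spec):

```python
# GDDR5 moves 4 bits per pin per command-clock cycle (quad data rate),
# so bandwidth = clock * 4 * (bus_bits / 8). Invert to find the clock.

def gddr5_clock_mhz(bandwidth_gb_s, bus_bits=256):   # bus width is assumed
    bytes_per_transfer = bus_bits / 8
    transfers_per_s = bandwidth_gb_s * 1e9 / bytes_per_transfer
    return transfers_per_s / 4 / 1e6

print(f"{gddr5_clock_mhz(176):.0f} MHz")   # -> 1375 MHz on a 256-bit bus
```

176 GB/s on a 256-bit bus works out to a 1375 MHz command clock, which matches the Reddit figure quoted a few posts below.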


Edited by Plutonas, 10 March 2013 - 04:15 PM.


#86 kingdomcode

kingdomcode

    Shy Guy

  • Banned
  • 34 posts

Posted 10 March 2013 - 04:21 PM


Does this mean we will see more shared games between the Wii U and PS4, since the PS4 isn't that much more powerful than the Wii U?



#87 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 10 March 2013 - 04:27 PM

I doubt Sony would throw GDDR5 in there if the latency was going to pose a problem.

 

As for why PCs don't use GDDR5 for system memory: I don't think latency is the big issue, it's more the fact that a PC has no use for system RAM with that kind of bandwidth. It's generally the GPU that needs high bandwidth, and as all the good GPUs hook up over PCIe, they wouldn't have access to system RAM at full speed anyway, which is why they have their own pool of GDDR5 memory on the card itself.

 

As the PS4 is a fixed hardware design that doesn't need to cater to wildly varied PC workloads, it can come at the problem from a completely different angle than PCs do. Both the CPU and GPU are able to access that single pool of memory directly in a useful fashion, and as you don't have to worry about a bloated desktop OS running in the background, any latency issues can be mitigated.

 

A big bottleneck on PC has always been having to copy data from system RAM into GPU RAM before the GPU can do anything useful with it.  The PS4 would seem to eradicate that problem entirely.
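To put that copy in frame-budget terms, here is a back-of-envelope sketch; the PCIe figure is a theoretical peak and the batch sizes are invented:

```python
# One-way copy time from system RAM to GPU RAM over PCIe, which a
# unified pool avoids entirely. Illustrative numbers only.

PCIE3_X16_GB_S = 15.75   # PCIe 3.0 x16 theoretical peak; sustained is lower

def copy_ms(size_gb, bw_gb_s=PCIE3_X16_GB_S):
    return size_gb / bw_gb_s * 1e3

for size_gb in (0.25, 0.5, 1.0):    # hypothetical streaming batches
    print(f"{size_gb:5} GB: {copy_ms(size_gb):5.1f} ms "
          "(a 60 fps frame is ~16.7 ms)")
```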

 

There is every possibility that PCs actually WOULD be better designed like the PS4. But in doing so you limit the user to the graphics chip and RAM that came with their PC, when the whole point of buying a PC is that you can pick and choose.

 

Yes, the RAM could be modular, but the standards for DIMMs are set by what is logical for the market as a whole, not just the small segment that is gaming. Can you imagine how much it would cost if they had to make gaming-specific chipsets that support GDDR5 DIMMs? And then the DIMMs themselves, not being sold in the same quantities as DDR3, would cost many times more.


Edited by Alex Atkin UK, 10 March 2013 - 04:31 PM.

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#88 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 10 March 2013 - 04:30 PM

Sony cannot surpass what PCs can do... they just borrowed some PC tech. If GDDR5 could handle CPUs, the PC would have been the first to get it.

 

Maybe the PS4 is an entirely GPU-centric machine... the CPU is there for some physics, because PhysX is NVIDIA property and AMD GPUs aren't allowed to run it.

 

About what you said in your message (Alex): you mean Sony invested money to eliminate GDDR5's latency problem? Sounds crazy.

 

AMD is known to produce its own GDDR5, which it originally used with the 4xxx series of cards... And the PS4 will have lots of things going on:

 

live video decoding and streaming to your PSN profile, video decoding and streaming for the game, Facebook and Twitter profiles, the OS, downloads, the live camera (two of them), etc.

 

@3Dude I found a conversation about the PS4's RAM on Reddit... they give 1.375 GHz for it, capable of 176 GB/s,

 

but they say that if the PS4 has the GPU and CPU on the same die, it will have problems... http://www.reddit.co...pecs_from_sony/


Edited by Plutonas, 10 March 2013 - 04:51 PM.


#89 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 10 March 2013 - 04:43 PM

All I said was that Sony wouldn't have used GDDR5 if it was going to cause performance problems.

 

In fact it looks like the PC WILL be getting GDDR5 support; it's something AMD has added to their desktop APU parts as well. So this just brings us back to the fact that Sony are using a cutting-edge AMD APU for the PS4.

 

So the reason the PC doesn't (yet) use GDDR5 is CLEARLY down to cost and chip density. Up until now there just hasn't been a good reason for it, as it's only really useful for GPUs, and until recently none of the chips combining a CPU and GPU were capable enough to need it.

 

People seem to forget that when the early DDR3 boards came out, they too were slower than the same CPU on a DDR2 board: the bandwidth went up, but the latency did too. So until CPUs were fast enough to start being hampered by DDR2's bandwidth, there wasn't a good reason to upgrade.

 

Likewise, today only certain use cases would benefit enough from better-than-DDR3 memory to rationalise the switch.


Edited by Alex Atkin UK, 10 March 2013 - 04:49 PM.

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#90 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 10 March 2013 - 04:52 PM

If Sony didn't want problems, they wouldn't have chosen Cell in the first place... lol. The PS4 doesn't have an APU, but a CPU and GPU on the same die.

 

And you're wrong about the PC... the PC will soon get DDR4. If everything were fine with GDDR5, they would be jumping to it instead of DDR4...


Edited by Plutonas, 10 March 2013 - 04:55 PM.


#91 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 10 March 2013 - 04:55 PM

LOL, that is a good point, but it's clear that this time around they realised they cannot keep doing things the way they did before.

 

Both the PS2 and PS3 were a royal PITA to get the best out of, and with costs increasing for developers, Sony couldn't rely on them putting in the effort to get the best from stupidly complex hardware anymore.

 

Just look at all the BS from certain developers about the Wii U. We know it's capable, but in some ways it's like the PS3: it needs developers to really learn the intricacies of the hardware just to get the CPU to work even as well as the Xbox 360's.

 

I'm not saying it's anywhere near as hard as the PS3 (it clearly isn't), but in the current financial climate, with doubt over whether any new console will sell in high enough quantities to warrant support, they needed an advantage.


Edited by Alex Atkin UK, 10 March 2013 - 05:00 PM.

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#92 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 10 March 2013 - 04:57 PM

I agree... that's why they're trying something different... graphics RAM experiments, haha.

 

kidding...

 

PS: As I said, I believe most of a game's CPU tasks will be handled by the GPU instead... they advertised that. But don't expect 1.8 TFLOPS of real performance...


Edited by Plutonas, 10 March 2013 - 05:00 PM.


#93 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 10 March 2013 - 05:03 PM

Like I have said before, I think as games come to the Xbox Next, which based on the rumours is VERY similar in design to the Wii U (relying on 32 MB of GPU RAM and DDR3 system RAM), the Wii U will gain more support, as it should be easier to port from next-gen than from current-gen.

 

However, I am still concerned that in a year or two games will be taking more advantage of the much larger RAM and GPU power of the PS4/Xbox, and the Wii U may suffer then.

 

On the upside, Nintendo should be pumping out games by then, and hopefully some third parties will be writing more for the Wii U's own strengths rather than relying on ports. So it's not all bad.


Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#94 cannonshane

cannonshane

    Piranha Plant

  • Members
  • 925 posts
  • Fandom:
    Luigi

Posted 10 March 2013 - 06:23 PM

I do not care for specs on paper. Give me the console running a game in front of me, and then I will make a judgement. It's easy for people to sit there and sh1t-bag how good or bad a console is by looking at a spec list, but tell me this: how do you measure fun in specs???

Edited by cannonshane, 10 March 2013 - 06:24 PM.

Staff Writer at http://www.allagegaming.com/

 

Strayaaaaaaaaaa Mate


#95 Robotic Sunshine Commander

Robotic Sunshine Commander

    Pokey

  • Members
  • 1,350 posts

Posted 10 March 2013 - 07:34 PM

Unlike on consoles, most game engines on PC are GPU-centered.

 

Wii U does have a co-processor and audio DSP, that much we can see on the die.

 

So that would explain why it's so easy to port the PC versions of games over to the Wii U, like NFS Most Wanted.



That's why they gave the PS4's GPU so much compute... they say 1.8 TFLOPS in theory, but if the GPU spends half its power on CPU tasks, it's going to be more like 1 TFLOP of raw power, maybe less.

Maybe that's why Sony gave it 8 GB of the stuff: 2-4 GB for the GPU and the rest for easy background tasks... I can't explain it otherwise. Remember the demo with the falling objects on PS4? They said the PS4 uses 0% CPU power there...

Maybe because it has none.

Interesting concept, you have a point there.




#96 routerbad

routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 10 March 2013 - 09:55 PM

I doubt Sony would throw GDDR5 in there if the latency was going to pose a problem.

As for why PCs don't use GDDR5 for system memory: I don't think latency is the big issue, it's more the fact that a PC has no use for system RAM with that kind of bandwidth. It's generally the GPU that needs high bandwidth, and as all the good GPUs hook up over PCIe, they wouldn't have access to system RAM at full speed anyway, which is why they have their own pool of GDDR5 memory on the card itself.

As the PS4 is a fixed hardware design that doesn't need to cater to wildly varied PC workloads, it can come at the problem from a completely different angle than PCs do. Both the CPU and GPU are able to access that single pool of memory directly in a useful fashion, and as you don't have to worry about a bloated desktop OS running in the background, any latency issues can be mitigated.

A big bottleneck on PC has always been having to copy data from system RAM into GPU RAM before the GPU can do anything useful with it. The PS4 would seem to eradicate that problem entirely.

There is every possibility that PCs actually WOULD be better designed like the PS4. But in doing so you limit the user to the graphics chip and RAM that came with their PC, when the whole point of buying a PC is that you can pick and choose.

Yes, the RAM could be modular, but the standards for DIMMs are set by what is logical for the market as a whole, not just the small segment that is gaming. Can you imagine how much it would cost if they had to make gaming-specific chipsets that support GDDR5 DIMMs? And then the DIMMs themselves, not being sold in the same quantities as DDR3, would cost many times more.

You're right that no PC has a need for system RAM with great bandwidth but crap timings. PCs would perform terribly if they came with GDDR RAM on the main bus. Think about the different types of operations a CPU and a GPU perform, and why one needs RAM with the least latency possible, while the other needs the least latency possible given its need for very high frequency and voltage. I can turn my system RAM into GDDR5 by turning the frequency way up and setting the timings way loose in the BIOS, but that would be very bad for performance; anyone would rather take the MHz hit for tighter timings.

 

The PS4 (and the Xbox 720) are going to be running a LOT of non-game code at all times, and those tasks are not suited to this RAM setup. No one is saying it isn't possible to run the system that way; it can and will be done, it just isn't optimal. DDR3 would perform MUCH better as system RAM. I'm actually more impressed by the rumoured 720 memory setup, if that SRAM is indeed embedded on-die: DDR3 would cost less and perform better for general system tasks.

 

The PS4 OS will in fact be running a lot of overhead in the background, even during gaming. Latency will be an issue, though not one consumers will ever notice. It's great to have mega bandwidth until you lose half of it to wasted cycles, because the CPU constantly sending fetch and park requests to RAM makes the GPU wait that much longer for its turn. It isn't that Sony put something in there that would be a problem; they went lazy on it, and probably lowered the frequency considerably so they could tighten the timings a little, since loose timings can hurt game performance.
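The "MHz hit for tighter timings" trade is easy to put numbers on. Two hypothetical DDR3 BIOS profiles, chosen only for illustration:

```python
# Absolute access latency in ns: a slower clock with tight timings can
# beat a faster clock with loose ones. Both profiles are made up.

profiles = {
    "DDR3-1333 CL7  (tight)": (7, 666.5),    # (CAS cycles, I/O clock MHz)
    "DDR3-2133 CL13 (loose)": (13, 1066.5),
}
for name, (cl, mhz) in profiles.items():
    print(f"{name}: {cl / mhz * 1e3:.1f} ns")   # 10.5 ns vs 12.2 ns
```

Here the 60% clock bump still loses on absolute latency, which is the reason given above for not wanting GDDR5-style timings on a desktop's main bus.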



So that would explain why it's so easy to port the PC versions of games over to the Wii U, like NFS Most Wanted.



Interesting concept, you have a point there.

Right, and it's why the 360 ports on Wii U don't run very well and freeze on occasion. They are CPU-centered, and while the Wii U CPU holds up quite well given that the system isn't designed for what they're doing, they would have seen much better performance if they had taken the time to do what Criterion did. As it was, though, Nintendo wasn't there for a lot of the third-party devs when they were doing the ports, the tools weren't mature enough, and the hardware wasn't finalized until shortly before launch.



#97 Plutonas

Plutonas

    Pokey

  • Members
  • 1,319 posts

Posted 11 March 2013 - 01:18 AM

The 360 CPU is in-order execution... the Wii U is out-of-order.

Xbox 360 = (1-2-3-4-5)... it reads the code exactly as written: (1-2-3-4-5-1-2-3-4-5-1-2-3-4-5). And 360 code has STOP signs all over it, so the Wii U cannot read it out of order the way it should: (1-2-STOP-3-4-STOP-5-6-STOP). A STOP means the CPU must finish the first part before moving on; it can't go 1-5-2-5-1-4. It tries to peek at the code between the signs and stalls.

Wii U = it gets the code (1-2-3-4-5) but can execute it in whatever order is fastest, without a fixed order: (2-5-1-4-2-3-4-5-4-3-2-1-3-4-5-3).

So the Wii U running 360 code = (2-1-STOP-5-STOP-2-STOP-3-4-STOP), which slows the CPU dramatically... I hope you get the point.

The difference between in-order and out-of-order is: if the game needs result 5, the Xbox must work through 1-2-3-4-5 first, while the Wii U can go straight to 5 (provided 5 doesn't depend on the results before it).

PS: at least, that's how I understand it.
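The STOP-sign analogy maps onto a toy scheduling model. The sketch below is an invented illustration of the general idea, not how either console's CPU actually works; the latency and the instruction stream are assumptions:

```python
# Toy model: a slow load, one instruction that needs its result, and
# some independent work. An in-order core stalls behind the load; an
# idealised out-of-order core does the independent work meanwhile.

LOAD_LATENCY = 10  # cycles until a load's result is ready (assumed)

STREAM = [            # (name, is_load, depends_on)
    ("load_A", True,  None),
    ("use_A",  False, "load_A"),   # must wait for load_A's result
    ("add_B",  False, None),
    ("add_C",  False, None),
    ("add_D",  False, None),
]

def run(stream, out_of_order):
    cycle, done_at, pending = 0, {}, list(stream)
    while pending:
        # In-order may only look at the next instruction; out-of-order
        # may issue any instruction whose dependency is satisfied.
        window = pending if out_of_order else pending[:1]
        for instr in window:
            name, is_load, dep = instr
            if dep is None or done_at.get(dep, float("inf")) <= cycle:
                cycle += 1
                done_at[name] = cycle + (LOAD_LATENCY if is_load else 0)
                pending.remove(instr)
                break
        else:
            cycle += 1                 # nothing can issue: stall a cycle
    return max(done_at.values())

print("in-order    :", run(STREAM, False), "cycles")  # stalls behind the load
print("out-of-order:", run(STREAM, True), "cycles")   # hides most of it
```

Note that use_A still waits the full load latency either way; out-of-order only hides it behind independent work, which is why code full of back-to-back dependencies (the STOP signs above) gains little.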


Edited by Plutonas, 11 March 2013 - 01:31 AM.


#98 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 11 March 2013 - 02:29 AM

I want to see a Quadrate with a 256 bit bus on ps4... that would do some work. Not looking likely....

What size is each chip? It can't be standard GDDR5 at 256 MB a chip, or even 512 MB; that would require a ridiculously messy setup for 8 GB.
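The "messy setup" concern is just division. A quick check at then-common GDDR5 densities, assuming one 32-bit channel per chip and ignoring clamshell mode (which pairs two chips per channel):

```python
# Chips needed for 8 GB at various densities, and the bus width that
# one 32-bit channel per chip would imply.

TOTAL_MB = 8 * 1024
for chip_mb in (256, 512, 1024):
    chips = TOTAL_MB // chip_mb
    print(f"{chip_mb:>4} MB chips: {chips:2d} chips -> {chips * 32}-bit bus")
```

So without clamshell mode or very high-density parts, 8 GB means either a huge chip count or an implausibly wide bus.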

Edited by 3Dude, 11 March 2013 - 03:24 AM.


 


#99 Pjsprojects

Pjsprojects

    Chain Chomp

  • Members
  • 681 posts
  • Fandom:
    BF4-pc,GTA-360,Splinter cell-pc

Posted 11 March 2013 - 07:29 AM

The 360 CPU is in-order execution... the Wii U is out-of-order.

Xbox 360 = (1-2-3-4-5)... it reads the code exactly as written: (1-2-3-4-5-1-2-3-4-5-1-2-3-4-5). And 360 code has STOP signs all over it, so the Wii U cannot read it out of order the way it should: (1-2-STOP-3-4-STOP-5-6-STOP). A STOP means the CPU must finish the first part before moving on; it can't go 1-5-2-5-1-4. It tries to peek at the code between the signs and stalls.

Wii U = it gets the code (1-2-3-4-5) but can execute it in whatever order is fastest, without a fixed order: (2-5-1-4-2-3-4-5-4-3-2-1-3-4-5-3).

So the Wii U running 360 code = (2-1-STOP-5-STOP-2-STOP-3-4-STOP), which slows the CPU dramatically... I hope you get the point.

The difference between in-order and out-of-order is: if the game needs result 5, the Xbox must work through 1-2-3-4-5 first, while the Wii U can go straight to 5 (provided 5 doesn't depend on the results before it).

PS: at least, that's how I understand it.

Learnt something with that, ta!



Add me on Miiverse !! I'm from England but the world is a lot smaller online!

#100 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 12 March 2013 - 12:22 AM

I think the easiest way to work this out is again using the memory bandwidth of the Wii U, which is a known exact figure: 12.8 GB/s, compared to 176 GB/s for the PS4. The 360 and PS3 are in 20-25 GB/s territory. This figure represents the total amount of traffic that can pass between the GPU, CPU, and memory. If you want to make ridiculous claims that the Wii U CPU is powerful, then you do so at the expense of the GPU. We really don't know the exact performance of the Wii U's GPU or CPU, but we know how much memory bandwidth it has available to share between them, and that information lets us gauge real-world performance. Yes, there is cache in the CPU and the GPU has 32 MB of fast video memory, but neither of these will magically get around that 12.8 GB/s figure.

Given those memory bandwidth constraints, we can say the slightly more modern GPU architecture, which relies slightly less on CPU resources, will take the maximum chunk of memory bandwidth, especially as it has 32 MB of high-speed video memory. This leaves a minimal amount of bandwidth left over for the CPU, which probably performs at 60-80% of the 360/PS3. Overall the Wii U is current-gen performance. Someone else may believe the CPU actually performs at a higher level at the expense of GPU performance, which is possible as long as you live within the restrictions of 12.8 GB/s. What is not reasonable, logical, or based on any realistic viewpoint is to keep pretending that a console with 12.8 GB/s of bandwidth has comparable CPU and GPU performance to a console making use of 176 GB/s. The PS4 has something like 15 times the memory bandwidth, and it is highly unlikely that bandwidth will go unused, especially with an 8-core CPU and a many-core GPU.

Any argument that the Wii U is comparable to the PS4 in performance first has to deal with the 12.8 GB/s issue, otherwise the argument is utterly pointless; you need to explain how that is possible. For reference, the memory bandwidth difference between the original PlayStation and the PlayStation 2 was 24x, and that was a huge generational jump.

I'm totally and utterly convinced the Wii U is a current-gen console in technology. The specs support it, game performance supports it, the developer leaks support it. All avenues of information support it.
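Both headline figures in the quote fall out of the usual peak-bandwidth formula. A quick check, assuming the commonly cited configurations (DDR3-1600 on a 64-bit bus for the Wii U, 5500 MT/s GDDR5 on a 256-bit bus for the PS4):

```python
# Peak theoretical bandwidth = transfers/s * bytes per transfer.
# Says nothing about sustained throughput or latency.

def peak_gb_s(mt_per_s, bus_bits):
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

wiiu = peak_gb_s(1600, 64)    # assumed DDR3-1600, 64-bit bus
ps4  = peak_gb_s(5500, 256)   # assumed 5500 MT/s GDDR5, 256-bit bus
print(f"Wii U: {wiiu:.1f} GB/s  PS4: {ps4:.1f} GB/s  "
      f"ratio: {ps4 / wiiu:.2f}x")   # 12.8, 176.0, 13.75x
```

The exact ratio is 13.75x, a little under the quote's "something like 15 times".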


Where are you getting those memory numbers? I must have missed something.


Does this mean we will see more shared games between the Wii U and PS4, since the PS4 isn't that much more powerful than the Wii U?


All it means is that developers can give us the games. They like making excuses not to, though. So we'll see.
Whovian12 -- Nintendo Network ID.



