PotatoHog's daily lesson on the Wii U CPU/GPU situation

64 replies to this topic

#21 esrever

esrever

    Paragoomba

  • Members
  • 20 posts

Posted 22 November 2012 - 11:34 PM

Interesting food for thought. I'd like to revisit the topic of ports when the reverse happens. Let's see what happens when a game is made from the ground up on the Wii U and then ported over to the PS3/X360. If the Wii U is no more powerful than the X360/PS3, then the differences should be pretty insignificant.

The difference would probably be in the resolution and textures, but generally the gameplay would be the same and the differences wouldn't be extreme. It would be like comparing an early PS3 and 360 port. If the CPU is up to the task, the GPU load can easily be lowered to compensate.

#22 BrosBeforeGardenTools

BrosBeforeGardenTools

    Hot dog vendor who spilled condiments on himself

  • Members
  • 1,864 posts

Posted 23 November 2012 - 04:03 AM

Goose - The PS4 has a dedicated GPU.

#23 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 23 November 2012 - 04:40 AM

I thought it was leaked they had a GPU in addition to the A10?
Just thought of something else. Everyone is dumping all over the CPU of the Wii U; watch how they're going to switch when it's announced the PS4 isn't even going to have a separate CPU.
Fanboys, man...


No, not confirmed, just concerned fans saying it HAD to have one after seeing the devkit's less-than-supernova-powered A10. But if it had a second GPU, it probably would have been mentioned with the A10.



#24 Goose

Goose

    Bullet Bill

  • Members
  • 380 posts

Posted 23 November 2012 - 06:33 AM

Goose - The PS4 has a dedicated GPU.

I'm saying it's CPU-centric rather than GPU-centric, not that it doesn't have one.

Although the PS4 is rumored to have the A10 APU instead of a separate CPU and GPU.


No, not confirmed, just concerned fans saying it HAD to have one after seeing the devkit's less-than-supernova-powered A10. But if it had a second GPU, it probably would have been mentioned with the A10.

Hm, that's interesting. I know it's powerful to some extent; I've seen it playing Battlefield 3 and Sleeping Dogs, so it should show a noticeable improvement at least. So in terms of raw power it's shaping up to be Wii U < PS4 < Xbox 3?

#25 scotty79

scotty79

    Bob-omb

  • Members
  • 295 posts
  • Fandom:
    bf3,cod and timesplitters 2!

Posted 23 November 2012 - 09:07 AM

I think lots of people on here are underestimating the A10's (likely customized) graphics. For instance, I had an A8-3500M APU laptop, and although its computing power wasn't great (low-end i3), the graphics were excellent, able to run Civilization V at high settings. That's only one example, I know, but that was a mobile, last-gen APU; no doubt Sony will use a newer desktop part. This feels like a smart choice for Sony, probably a lot cheaper than the equivalent Nvidia GPU. It'll be interesting to see where Microsoft goes as well.

#26 Goose

Goose

    Bullet Bill

  • Members
  • 380 posts

Posted 23 November 2012 - 11:51 AM

I think lots of people on here are underestimating the A10's (likely customized) graphics. For instance, I had an A8-3500M APU laptop, and although its computing power wasn't great (low-end i3), the graphics were excellent, able to run Civilization V at high settings. That's only one example, I know, but that was a mobile, last-gen APU; no doubt Sony will use a newer desktop part. This feels like a smart choice for Sony, probably a lot cheaper than the equivalent Nvidia GPU. It'll be interesting to see where Microsoft goes as well.

A lot smaller, less power consumption, plus everything you just said. APUs might become the new standard in computing. Of course hardcore PC gamers will continue to buy dedicated GPUs and CPUs, but still.

#27 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 23 November 2012 - 12:09 PM

I think lots of people on here are underestimating the A10's (likely customized) graphics. For instance, I had an A8-3500M APU laptop, and although its computing power wasn't great (low-end i3), the graphics were excellent, able to run Civilization V at high settings. That's only one example, I know, but that was a mobile, last-gen APU; no doubt Sony will use a newer desktop part. This feels like a smart choice for Sony, probably a lot cheaper than the equivalent Nvidia GPU. It'll be interesting to see where Microsoft goes as well.


No, it's not underestimating the A10. They know what it's capable of, and it's not enough for them, because they have psychotic, unmeetable expectations for these systems.



#28 esrever

esrever

    Paragoomba

  • Members
  • 20 posts

Posted 23 November 2012 - 12:22 PM

The A10 is about as fast as the Wii U's GPU, provided they re-engineer the memory interface, which they would have to do for shared memory on a console.

#29 Goose

Goose

    Bullet Bill

  • Members
  • 380 posts

Posted 23 November 2012 - 12:36 PM

No, it's not underestimating the A10. They know what it's capable of, and it's not enough for them, because they have psychotic, unmeetable expectations for these systems.

This guy.

Sony drones are expecting some master-race system with games running at 4K resolution. There will not be a jump like with previous generations.

#30 esrever

esrever

    Paragoomba

  • Members
  • 20 posts

Posted 23 November 2012 - 03:13 PM

This guy.

Sony drones are expecting some master-race system with games running at 4K resolution. There will not be a jump like with previous generations.

Sony themselves said that the PS4 aims at 1080p 60fps for all games.

#31 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 23 November 2012 - 03:38 PM

Sony themselves said that the PS4 aims at 1080p 60fps for all games.


"1080p 3d 60fps no problem" is the exact quote"

We already know this, which is why we are talking about the TONS of Sony fans we've seen whose expectations are WAY too high, as they expect 4K resolutions as, and I quote, 'bare minimum'.
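
For scale, a quick bit of arithmetic on what those targets mean in raw pixel throughput; a hedged sketch, assuming stereoscopic 3D means rendering two full-resolution views:

    # Raw pixel throughput for a few render targets.
    # Assumes stereo 3D renders two full-resolution views; illustrative only.
    def pixels_per_sec(w, h, fps, views=1):
        return w * h * fps * views

    p1080p_3d_60 = pixels_per_sec(1920, 1080, 60, views=2)  # ~248.8M pixels/s
    p4k_60 = pixels_per_sec(3840, 2160, 60)                 # ~497.7M pixels/s
    print(p4k_60 / p1080p_3d_60)  # 2.0 -- 4K at 60fps alone doubles Sony's own stated target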

I hate these kinds of people. It doesn't matter what system; they are all the same. They have ridiculous expectations that completely defy what all signs are pointing to. You tell them their expectations are unmeetable, you show them the direction and what to expect, and they ignore it, quoting whatever baseless rumour reinforces their preconceived notion. Then comes the crash, when their inflated expectations AREN'T met, and then you have an annoying, whiny, disillusioned troll to deal with.



#32 Keviin

Keviin

    Lakitu

  • Members
  • 2,270 posts
  • Fandom:
    Zelda, Mario, Metroid, Resident Evil

Posted 23 November 2012 - 03:50 PM

I just want to know if it could run E3's famous Zelda Experience in full HD at 60fps...

#33 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 23 November 2012 - 04:12 PM

I just want to know if it could run E3's famous Zelda Experience in full HD at 60fps...


Well, the V1 devkit was running the E3 Zelda demo, at 720p IIRC. I don't recall ever hearing about the framerate.

And we are on V5 now. Most devs who stuck it out through V1 and the compatibility breaks began changing their negative tune with the release of V4, which was much more powerful than the V1/V2 devkits.

So yeah, I'm pretty sure the Zelda demo, which took place in an enclosed area with only Link, Navi, and one spider, can be done in 1080p. But personally, I'd rather take the extra 125% in fill rate needed to go from 720p to 1080p and spend it on some pretty post-processing effects, atmosphere, and attention to detail.
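
A quick check of that fill-rate figure, as a minimal Python sketch; the 125% falls straight out of the resolution ratio:

    # Fill-rate cost of moving from 720p to 1080p, per frame.
    px_720p = 1280 * 720    # 921,600 pixels
    px_1080p = 1920 * 1080  # 2,073,600 pixels

    ratio = px_1080p / px_720p  # 2.25x the pixels per frame
    extra = (ratio - 1) * 100   # 125% extra fill rate
    print(f"{ratio}x total, {extra:.0f}% extra")  # 2.25x total, 125% extra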

Regardless, the real Zeldas have always trashed the tech demos, so whatever we do end up seeing is going to smoke the E3 demo.



#34 Alianjaro

Alianjaro

    Pokey

  • Members
  • 1,317 posts
  • Fandom:
    Monster Hunter, Legend of Zelda

Posted 23 November 2012 - 05:15 PM

I'm so happy I just learned something!

#35 Structures

Structures

    LIL B's SiDEKiCK

  • Members
  • 462 posts
  • NNID:Structures
  • Fandom:
    Gundam, Nintendo, Warcraft, Guild Wars

Posted 23 November 2012 - 07:24 PM

Most games are more about the GPU than the CPU. I'm sure the CPU is enough.



#36 Keviin

Keviin

    Lakitu

  • Members
  • 2,270 posts
  • Fandom:
    Zelda, Mario, Metroid, Resident Evil

Posted 24 November 2012 - 04:28 AM

Well, the V1 devkit was running the E3 Zelda demo, at 720p IIRC. I don't recall ever hearing about the framerate.

And we are on V5 now. Most devs who stuck it out through V1 and the compatibility breaks began changing their negative tune with the release of V4, which was much more powerful than the V1/V2 devkits.

So yeah, I'm pretty sure the Zelda demo, which took place in an enclosed area with only Link, Navi, and one spider, can be done in 1080p. But personally, I'd rather take the extra 125% in fill rate needed to go from 720p to 1080p and spend it on some pretty post-processing effects, atmosphere, and attention to detail.

Regardless, the real Zeldas have always trashed the tech demos, so whatever we do end up seeing is going to smoke the E3 demo.


Yeah, looking back at the first Skyward Sword artwork and the Spaceworld demo, the next Zelda is more likely to be cel-shaded than what we saw. But I'd be happy if the console could run something as beautiful as that demo.

#37 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 24 November 2012 - 07:50 AM

Well, the V1 devkit was running the E3 Zelda demo, at 720p IIRC. I don't recall ever hearing about the framerate.

And we are on V5 now. Most devs who stuck it out through V1 and the compatibility breaks began changing their negative tune with the release of V4, which was much more powerful than the V1/V2 devkits.

So yeah, I'm pretty sure the Zelda demo, which took place in an enclosed area with only Link, Navi, and one spider, can be done in 1080p. But personally, I'd rather take the extra 125% in fill rate needed to go from 720p to 1080p and spend it on some pretty post-processing effects, atmosphere, and attention to detail.

Regardless, the real Zeldas have always trashed the tech demos, so whatever we do end up seeing is going to smoke the E3 demo.

It would be really awesome to know where all of you guys are sourcing those kinds of statements. We already know that the Wii U devkits are V5 final from "IdeaMan" on NeoGAF, but where the hell are you getting this kind of info, such as:
  • "Zelda Demo running on V1" - Seriously, five revisions in less than a year? V5 shipped in February 2012, and the Zelda demo was probably made from March to May 2011. Something's wrong here.
  • "V1/V2 and V4/V5 no compatibility" - This one is the most amusing. Great to know you're developing a game that's not going to work on final hardware.
  • "Wii U has 32MB eDRAM" - Not directed at you, but where the hell are these guys getting these numbers? The most eDRAM a POWER7 CPU could possibly have is 32MB, and only if it had 8 cores (4MB per core); I don't see the Wii U having 8 cores given the CPU's size.
  • "Wii U has three crappy Broadway-fied PowerPC cores" - I'm not even going to comment on this one.
  • Wii U "has a horrible, slow CPU" - Slow? Yes (clocked at less than 3.2GHz). Horrible? Just no (a lower clock speed doesn't mean less power).
Please be careful when analysing Wii U specs posted by random forum posters who just don't know what they're talking about. The only guy we can trust right now about real specs is IdeaMan; the rest is just made-up crap.
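
On that last bullet, a worked example of why clock speed alone says little; the numbers below are purely illustrative, not leaked specs:

    # Per-core throughput ~ IPC x clock; illustrative numbers, not real specs.
    def throughput_mips(ipc, clock_mhz):
        # Millions of instructions retired per second, per core.
        return ipc * clock_mhz

    wide_slow = throughput_mips(ipc=3.0, clock_mhz=1600)    # 4800 MIPS
    narrow_fast = throughput_mips(ipc=1.5, clock_mhz=3200)  # 4800 MIPS
    assert wide_slow == narrow_fast  # half the clock, same work per second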

Edited by Arkhandar, 24 November 2012 - 07:52 AM.


#38 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 24 November 2012 - 08:43 AM

It would be really awesome to know where all of you guys are sourcing those kinds of statements. We already know that the Wii U devkits are V5 final from "IdeaMan" on NeoGAF, but where the hell are you getting this kind of info, such as:

  • "Zelda Demo running on V1" - Seriously, five revisions in less than a year? V5 shipped in February 2012, and the Zelda demo was probably made from March to May 2011. Something's wrong here.
  • "V1/V2 and V4/V5 no compatibility" - This one is the most amusing. Great to know you're developing a game that's not going to work on final hardware.
  • "Wii U has 32MB eDRAM" - Not directed at you, but where the hell are these guys getting these numbers? The most eDRAM a POWER7 CPU could possibly have is 32MB, and only if it had 8 cores (4MB per core); I don't see the Wii U having 8 cores given the CPU's size.
  • "Wii U has three crappy Broadway-fied PowerPC cores" - I'm not even going to comment on this one.
  • Wii U "has a horrible, slow CPU" - Slow? Yes (clocked at less than 3.2GHz). Horrible? Just no (a lower clock speed doesn't mean less power).
Please be careful when analysing Wii U specs posted by random forum posters who just don't know what they're talking about. The only guy we can trust right now about real specs is IdeaMan; the rest is just made-up crap.



Since you put so much faith in IdeaMan, go back and look through his posts; he also talks about devs dealing with compatibility breaks between versions and how they handled it (some started over, some waited for compatibility to be restored via future version releases).

There is nothing strange about so many versions within a year. Not all of them are hardware changes. The GameCube's fourth devkit revision came a scant few months before launch, increasing CPU speed nearly 2x.

Hilariously, that previous devkit revision is still compatible with almost all release games and retail discs (previous Cube SDKs are not), so you can put retail discs in the unit and play them... at half speed. There's a YouTube video you can check out. Slow-mo Sunshine, pretty funny.

I get my version release timeline from RAD Game Tools,

http://www.radgametools.com/

specifically the Miles Sound System development history, which notes every time they get a new SDK. For example, they switched to SDK v2.04 on 5-4-2012.

Changes from 9.1d to 9.1e (5-4-2012)

"Fixed a subtle bug on iOS where the background threads were sleeping too little - big performance increase!

Switched to Wii-U SDK 2.04.

Fixed a few bugs in the Miles Sound Studio"



32MB of eDRAM has been confirmed on the GPU. The CPU confusion comes from the IBM press release, the 'a lot of eDRAM' rumour, and allusions to POWER7-based technology (which has an on-die 32MB L3 eDRAM cache).

All confusion can be cleared up by the original IBM press release, which confirms IBM's proprietary eDRAM technology is used in POWER7's L1 and L2 caches, within the processor core. Less than 1MB combined, but a very important technology nonetheless.

I get my information from official IBM documentation, which I have attached to this forum for any and all to download, and which currently has... two downloads. Good job, guys.


If you want to know where I get my info, ASK.

Edited by 3Dude, 24 November 2012 - 09:03 AM.



#39 Arkhandar

Arkhandar

    Dry Bones

  • Members
  • 479 posts
  • Fandom:
    Zelda, Metroid, Mario, Kirby, DK

Posted 24 November 2012 - 09:43 AM

Since you put so much faith in IdeaMan, go back and look through his posts; he also talks about devs dealing with compatibility breaks between versions and how they handled it (some started over, some waited for compatibility to be restored via future version releases).

There is nothing strange about so many versions within a year. Not all of them are hardware changes. The GameCube's fourth devkit revision came a scant few months before launch, increasing CPU speed nearly 2x.

Hilariously, that previous devkit revision is still compatible with almost all release games and retail discs (previous Cube SDKs are not), so you can put retail discs in the unit and play them... at half speed. There's a YouTube video you can check out. Slow-mo Sunshine, pretty funny.

I get my version release timeline from RAD Game Tools,

http://www.radgametools.com/

specifically the Miles Sound System development history, which notes every time they get a new SDK. For example, they switched to SDK v2.04 on 5-4-2012.

Changes from 9.1d to 9.1e (5-4-2012)

"Fixed a subtle bug on iOS where the background threads were sleeping too little - big performance increase!

Switched to Wii-U SDK 2.04.

Fixed a few bugs in the Miles Sound Studio"



32MB of eDRAM has been confirmed on the GPU. The CPU confusion comes from the IBM press release, the 'a lot of eDRAM' rumour, and allusions to POWER7-based technology (which has an on-die 32MB L3 eDRAM cache).

All confusion can be cleared up by the original IBM press release, which confirms IBM's proprietary eDRAM technology is used in POWER7's L1 and L2 caches, within the processor core. Less than 1MB combined, but a very important technology nonetheless.

I get my information from official IBM documentation, which I have attached to this forum for any and all to download, and which currently has... two downloads. Good job, guys.


If you want to know where I get my info, ASK.

I did, and where exactly has it been confirmed that the GPU has a dedicated 32MB eDRAM pool? And would you mind posting your "Wii U SDK version release timeline"?

Thanks.

Edited by Arkhandar, 24 November 2012 - 09:44 AM.


#40 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 24 November 2012 - 09:59 AM

I did, and where exactly has it been confirmed that the GPU has a dedicated 32MB eDRAM pool? And would you mind posting your "Wii U SDK version release timeline"?
Thanks.


I did; it's in the Miles section of the RAD Game Tools site I linked to.

Here, easier:

http://www.radgametools.com/msshist.htm

This one only goes up to the beginning of September, though. And yeah, it's obvious Nintendo prioritizes some devs over others. Shin'en and Ubisoft, and probably others like Capcom, obviously have the latest V5 devkits. Other devs, whom Nintendo seems to deem less... worthy, are likely still several versions behind, hence the whining and negative articles from some, and praise from others, like Shin'en and Ubisoft.

32MB embedded on the GPU has been cross-confirmed by multiple posters recognized and confirmed by NeoGAF staff as developers with Wii U devkits.

It's also absolutely necessary for Wii backwards compatibility, as the 12.8 GB/s bandwidth from main RAM is insufficient for Wii games (namely, for matching the bandwidth and latency of the Wii's 24MB of 1T-SRAM).
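
For what it's worth, 12.8 GB/s is exactly what a single 64-bit DDR3-1600 interface yields; the width and speed here are my assumption, not a confirmed spec, but the arithmetic is simple:

    # Peak theoretical bandwidth = bus width (bytes) x transfer rate.
    # Assumes one 64-bit DDR3-1600 channel -- an assumption, not a confirmed spec.
    bus_bytes = 64 // 8        # 64-bit bus = 8 bytes per transfer
    transfers = 1600 * 10**6   # DDR3-1600 = 1.6 billion transfers per second
    print(bus_bytes * transfers / 10**9, "GB/s")  # 12.8 GB/s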

Edited by 3Dude, 24 November 2012 - 10:07 AM.
