[Photo] Wii U GPU Die
#121
Posted 06 February 2013 - 10:36 AM
#122
Posted 06 February 2013 - 12:07 PM
#123
Posted 06 February 2013 - 03:05 PM
eDRAM - 40nm
Other logic - 55nm
That's not even possible!
#124
Posted 06 February 2013 - 03:07 PM
OK, this is getting funny. Guys on the B3D forum are talking about a cross-technology layout.
eDRAM - 40nm
Other logic - 55nm
That's not even possible!
Chipworks' Jim Morrison flat out stated it was 40nm.
Beyond3D ain't what it used to be, mate.
#125
Posted 06 February 2013 - 03:26 PM
#126
Posted 06 February 2013 - 04:04 PM
Chipworks' Jim Morrison flat out stated it was 40nm.
Beyond3D ain't what it used to be, mate.
It's been that way for a LONG time. Since the beginning of their Wii U discussion they have gone out of their way to put down the console and anyone who supports it.
#128
Posted 06 February 2013 - 04:49 PM
#129
Posted 06 February 2013 - 04:50 PM
*Comes out of his hole (again)*
I tend to forget some people take this stuff a lot more seriously than I do.
First, a thanks to Chipworks for going above and beyond for the picture, and to blu, Durante, Fourth Storm, Thraktor, and wsippel for the work they did. Shinjohn let me know that the picture had been obtained and sent me a link, but I also checked out the thread. I wanted to come back and help clear up the confusion and whatnot.
As some of you know, getting info about the hardware was a pain because what Nintendo released essentially boiled down to a features list. And by that I mean general features of a modern GPU that could easily be looked up. Info that dealt with performance apparently was not given out, leaving devs to figure it out on their own. I had two working ideas of the GPU, based on a more traditional design (which I was hoping for) and a non-traditional design. I see that some of you actually remembered the non-traditional idea. Wsippel and I would compare notes on whatever info we could come up with. Some of those notes led us to sketch out how it might look if Nintendo took the non-traditional route.
http://www.neogaf.com/forum/showpost...ostcount=12053
In this post you'll see both wsippel's take and my take. I'm going to address some things in that post because I know some of you will try to take them out of context. First, you'll see wsippel's baseline ended up being more accurate than mine. When I talked about the potential performance of 1TF or more, that was in comparison to the R700 series: newer GPUs are more efficient than that line, so the baseline is higher, and my idea focused on the dedicated silicon handling other performance tasks.
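For a rough sense of what "1TF or more" means in R700 terms, here's a back-of-the-envelope peak-FLOPS calculation. The numbers below are for a retail Radeon HD 4870 (RV770), used purely as a reference point, not as Wii U specs.

# Peak single-precision throughput for an R700-class GPU:
# each stream processor can do one multiply-add (counted as 2 FLOPs) per clock.
def peak_gflops(stream_processors, clock_ghz, flops_per_cycle=2):
    return stream_processors * flops_per_cycle * clock_ghz

print(peak_gflops(800, 0.75))  # Radeon HD 4870: 800 SPs at 750 MHz -> 1200.0 GFLOPS, the "1 TF class"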
So what was the basis for the non-traditional view? I shared two of those bits of info before.
http://www.neogaf.com/forum/showpost...postcount=6136
Quote:
Well, I can't reveal too much. The performance target is still more or less the same as the last review from around E3. Now it's more balanced and "2012" now that it's nearer to complete and now AMD is providing proper stuff. As far as specs, I don't see any big change for better or worse, other than said cost/performance balance tweaks... It won't make a significant difference to the end user. As far as the kit goes, it's almost like what MS went through. Except more Japanese-ish... If you know what I mean.
http://www.neogaf.com/forum/showpost...postcount=6305
Quote:
Anyway, things are shaping up now with the new year. There was some anxiety with some less close third parties about what they were doing with GPU side, whether things were going to be left in the past... but it looks more modern now. You know, there simply wasn't actual U GPU data in third party hands this time last year, just the target range and R700 reference GPU for porting 360 titles to the new cafe control. Maybe now they finally can get to start debugging of the specifics and start showing a difference...

Here is one more specific piece that I didn't fully share.
Quote:
I can't confirm or deny, sorry. The cat is very confidential and I repeat non-final. The target, last checked, is triple core with XX eDram and exclusive Nintendo instructions. 1080/30 capable Radeon HD w/tess. and exclusive Nintendo patented features. On a nice, tight bus that MS wishes they had on 360.
I appreciate the individual for sharing as much as he did. He was a little paranoid though (I can understand) and at one point thought I was leaking info on a messageboard under a different name, but wouldn’t tell me the board or the username, lol.
I'm sure some of you remember me talking about games being 720p. It's because, knowing this, I expected devs to use those resources for 720p development. I'm sure some of you also remember me mentioning the bus. The key thing here is the "Nintendo patented features". In the context of what we talked about, it seemed to me these were going to be hardwired features.

What is certain for now is that the die shot shows a non-traditional design, fewer ALUs than where things supposedly started with the first kit, and GPU logic that is unaccounted for. I've seen some saying fixed functions, but that's too specific to be accurate right now; dedicated silicon would be a better term, though I offer that only as a suggestion. In my opinion, lighting is part of this. The Zelda and Bird demos emphasized it. Also, it was discussed in the past how Nintendo likes predictability of performance. It would also suggest Nintendo wasn't ready to embrace a "fully" programmable GPU and kept the water wings on when jumping in the pool.
I did what I could to get as much info on the hardware as possible since Nintendo was giving out so little. From there I gave the best speculation I could based on that info. As of today, I still stand by the evaluations I made about Wii U's potential performance from all the info I could gather. And until Nintendo's games show otherwise I'll continue to stand by them, because in the end it's on Nintendo to show what Wii U is capable of.
And if you think I deserve flak for what I've said in the past, then I'm here, but you're wasting your time because my view hasn't changed yet.
I made the farewell post to hold myself accountable to avoid posting, but I haven’t done well sticking to that, haha. I wasn’t going to make this post, but since I was one of the primary ones gathering info it’s unfair to you guys to leave things as they were.
#130
Posted 06 February 2013 - 05:04 PM
So the next-gen Microsoft and Sony consoles could be around 3x to 5x the Wii U? Much closer than this gen.
2x to 5x in some things, but in other stuff less than 1x... like memory bandwidth (Wii U at 146 GB/s vs. Orbis, the fastest, at 162 GB/s), etc. Also, Wii U and Sony have dedicated sound chips, while the Xbox is using one CPU core for it! Wii U also has a smaller extra chip that 3Dude said is a helper mini-GPU, which existed on the Wii as well...
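Since the units get muddled easily (memory bandwidth is measured in GB/s, not GFLOPS), here's how peak bandwidth figures are usually derived. The bus widths and data rates below are only illustrative configurations, not confirmed Wii U or Orbis specs.

# Peak theoretical bandwidth = (bus width in bytes) x (data rate in GT/s)
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return (bus_width_bits / 8) * data_rate_gtps

print(peak_bandwidth_gbs(64, 1.6))   # e.g. 64-bit DDR3-1600 -> 12.8 GB/s
print(peak_bandwidth_gbs(256, 5.5))  # e.g. 256-bit GDDR5 at 5.5 GT/s -> 176.0 GB/s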
Edited by Orion, 06 February 2013 - 05:07 PM.
#131
Posted 06 February 2013 - 05:26 PM
This guy was right about everything before; I think he is right now too.
It's kind of disappointing to see that Nintendo did reduce the number of programmable shaders from 400-500 to around 350 (1-2 SIMD cores), but maybe they put something more powerful in place of those cores!
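To put those shader counts in rough GFLOPS terms, the usual multiply-add arithmetic is below. The 550 MHz clock is the commonly reported Wii U GPU clock, not an official figure, and the ALU counts are just the round numbers nearest the speculation in this thread.

# Peak GFLOPS = ALUs x 2 FLOPs per cycle (multiply-add) x clock in GHz
clock_ghz = 0.55  # commonly reported Wii U GPU clock (unconfirmed)
for alus in (320, 400, 480):  # whole R700-style SIMDs are 80 ALUs each
    print(alus, "ALUs ->", alus * 2 * clock_ghz, "GFLOPS")
# 320 -> 352.0, 400 -> 440.0, 480 -> 528.0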
- Dragon likes this
#132
Posted 06 February 2013 - 07:01 PM
Like he said, Nintendo has always been about predictable quality, so certain things they wanted devs to take advantage of probably have dedicated logic pulling those functions off the shaders themselves.
@routerbad: Thanks!
This guy was right about everything before; I think he is right now too.
It's kind of disappointing to see that Nintendo did reduce the number of programmable shaders from 400-500 to around 350 (1-2 SIMD cores), but maybe they put something more powerful in place of those cores!
#133
Posted 07 February 2013 - 06:36 AM
I too am thinking specialized lighting and tessellation logic (Nintendo loves tessellation and deformation maps; they love it so much they even used it in real time on the freaking Wii of all things).
And of course, I suspect the documentation Nintendo sent out to devs is missing a lot of information... seems like it always is...
Ooh, I was right on the money with Starbuck. It's plastered right next to that high-speed I/O.
Wow. This is no ordinary eDRAM on the GPU. It's 32MB of PSRAM (single transistor) + 2MB of even faster PSRAM + super-fast SRAM.
And there are two 60x buses to Espresso. Nice.
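As a quick sanity check on why 32MB of on-die memory matters for 720p work: one 1280x720 buffer at 4 bytes per pixel comes out around 3.5MB, so roughly nine such buffers fit in 32MB. The buffer formats here are typical assumptions (RGBA8 colour or 32-bit depth), not something read off the die shot.

# Size of one 1280x720 render target at 4 bytes per pixel
buffer_mb = 1280 * 720 * 4 / (1024 * 1024)
print(round(buffer_mb, 2))       # ~3.52 MB per buffer
print(round(32 / buffer_mb, 1))  # ~9.1 such buffers fit in 32 MB of eDRAM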
The DSP is also right next to the high-speed I/Os and Starbuck. Freaking Eurogamer silly pony 'tech analyst'.
What was his name again? Jeff Butter? Whatever his name is, he sure made a fool of himself.
Edited by 3Dude, 07 February 2013 - 07:02 AM.
- routerbad likes this
#134
Posted 07 February 2013 - 06:37 AM
I only ask because this is using Unreal 3, and Unreal 4 is apparently better. Still, I'm more than happy with graphics like that for next-gen games on the Wii U. Hell, I'm happy with current gen; however, I would like ports of next-gen games running on this, even with a lower resolution and frame rate. I don't want any excuses this time around.
PS: I am disappoint at the Wii U CPU though; c'mon Nintendo, the GameCube CPU... AGAIN?! Still, it is able to cope with current-gen console games at under half the clock speed, so props to 'em *shrugs*
Edited by Penguin101, 07 February 2013 - 06:39 AM.
#135
Posted 07 February 2013 - 06:40 AM
Does anyone have any visual examples from PC games of the kind of maximum graphics this can pull off, as current-gen PCs will be more powerful than even the PS4? For example, if it ran at 30-35 FPS at 720p, would it be able to run the additions to the Unreal 3 engine shown in this video?
I only ask because this is using Unreal 3, and Unreal 4 is apparently better. Still, I'm more than happy with graphics like that for next-gen games on the Wii U. Hell, I'm happy with current gen; however, I would like ports of next-gen games running on this, even with a lower resolution and frame rate. I don't want any excuses this time around.
PS: I am disappoint at the Wii U CPU though; c'mon Nintendo, the GameCube CPU... AGAIN?!
Saying a 750CX derivative and a 750FX/GX derivative are the same CPU is like calling an i3 the same CPU as an i7.
Edited by 3Dude, 07 February 2013 - 06:41 AM.
#136
Posted 07 February 2013 - 06:41 AM
Does anyone have any visual examples from PC games of the kind of maximum graphics this can pull off, as current-gen PCs will be more powerful than even the PS4? For example, if it ran at 30-35 FPS at 720p, would it be able to run the additions to the Unreal 3 engine shown in this video?
I only ask because this is using Unreal 3, and Unreal 4 is apparently better. Still, I'm more than happy with graphics like that for next-gen games on the Wii U. Hell, I'm happy with current gen; however, I would like ports of next-gen games running on this, even with a lower resolution and frame rate. I don't want any excuses this time around.
PS: I am disappoint at the Wii U CPU though; c'mon Nintendo, the GameCube CPU... AGAIN?!
The GPU is still a mystery. But it's in the ballpark of 1.5x-4x the power of the GPU in the 360.
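To turn that multiplier into numbers: the 360's Xenos GPU is usually quoted at roughly 240 GFLOPS peak, so the range works out as below. The 1.5x-4x factor itself is just this thread's speculation, not a measured figure.

xenos_gflops = 240  # commonly quoted Xbox 360 (Xenos) peak throughput
for factor in (1.5, 4):
    print(factor, "x ->", factor * xenos_gflops, "GFLOPS")
# 1.5x -> 360.0 GFLOPS, 4x -> 960.0 GFLOPS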
As for the CPU, isn't it just similar to the GC CPU?
#137
Posted 07 February 2013 - 06:48 AM
If the answer is "yes, so long as publishers and developers don't get all pissy about doing that" then I'm fine. I just don't want it to be another Wii situation where publishers can't be bothered because it means paying another developer to use a separate engine to make another version of the game for Nintendo...
Edited by Penguin101, 07 February 2013 - 06:49 AM.
#138
Posted 07 February 2013 - 06:52 AM
I just want to know if I made the right gamble getting one of these. Can developers potentially port next-gen games with only a few minor compromises, such as resolution and frame rate (I don't care much for 1080p or frame rate)?
If the answer is "yes, so long as publishers and developers don't get all pissy about doing that" then I'm fine. I just don't want it to be another Wii situation where publishers can't be bothered because it means paying another developer to use a separate engine to make another version of the game for Nintendo...
Most all of those next-gen engines are going to be able to run on tablets and phones, so the Wii U being able to run them isn't going to be an issue.
Third parties being silly ponies on Nintendo platforms is going to be the issue.
#139
Posted 07 February 2013 - 06:57 AM
We don't know yet whether it will support UE4, though I suspect it will. I believe Frostbite 2 and CryEngine 3 are already said to be supported on Wii U. Buying a console isn't a gamble, it's an investment. Spending $350-some-odd dollars is putting faith in Nintendo to deliver a worthwhile entertainment experience. And it's not like you can't own multiple consoles. It should handle "next gen" ports just fine, but I'd rather see multiplats that are built and optimized for Wii U.
I just want to know if I made the right gamble getting one of these. Can developers potentially port next-gen games with only a few minor compromises, such as resolution and frame rate (I don't care much for 1080p or frame rate)?
If the answer is "yes, so long as publishers and developers don't get all pissy about doing that" then I'm fine. I just don't want it to be another Wii situation where publishers can't be bothered because it means paying another developer to use a separate engine to make another version of the game for Nintendo...
- Scumbag likes this
#140
Posted 07 February 2013 - 06:58 AM
Most all of those next-gen engines are going to be able to run on tablets and phones, so the Wii U being able to run them isn't going to be an issue.
Third parties being silly ponies on Nintendo platforms is going to be the issue.
I can cope with the silly ponies. Hopefully whatever Retro is working on will be visually stunning and prove to third parties that they're just whining *******
Edited by Penguin101, 07 February 2013 - 06:58 AM.
- Dragon likes this