routerbad's Content

There have been 1000 items by routerbad (Search limited from 20-June 20)


#167264 [Photo] Wii U GPU Die

Posted by routerbad on 05 February 2013 - 12:27 PM in Wii U Hardware

The current count, I think, leaves 16 of the blocks unexplained on the chip; Thraktor believes asymmetric SPUs are a strong possibility considering the number of repeating groups of logic units that can't be readily explained. Looks like ~30-50% of the chip is unaccounted for so far. This thing could be a closet beast for all we know, and with the memory structure they are using, they should be able to get much closer to whatever theoretical peak it has than anything we've ever seen before.



#171784 [Photo] Wii U GPU Die

Posted by routerbad on 15 February 2013 - 10:44 AM in Wii U Hardware

3Dude,

What are the chances this is based on the PPC476FP? I think that may have already been speculated before we had the die, but it fits quite well and looks like it would pack quite a punch.



#171153 [Photo] Wii U GPU Die

Posted by routerbad on 13 February 2013 - 01:01 PM in Wii U Hardware

That's also easily accomplished.


I assume they'll have sensitivity settings available to tailor the gameplay to your needs.



#171808 [Photo] Wii U GPU Die

Posted by routerbad on 15 February 2013 - 12:20 PM in Wii U Hardware

I'm sure they could have done something like that. But it really looks like they went with a 750 solution.


Okay, it's fun to speculate :)



#171939 [Photo] Wii U GPU Die

Posted by routerbad on 15 February 2013 - 09:54 PM in Wii U Hardware

... not very knowledgeable on each of the cores =/. I'm guessing it's not an FX or the cancelled VX/WX, whatever it was called. This one is probably a newly designed chip that doesn't link greatly to the other chips beyond the standard 750 series.

Can anyone confirm if chip 0 or 2 is a Broadway/upgraded Broadway chip? Or relatively similar? Or can run as a gimped version of one?

How much do developers know of the separate caches and differences between the cores? Asking because it could be a cause of weakness in Xbox 360 ports (assuming all its cores are identical); the worst-optimized core can bring the whole system down if true.

At this point I think each has its own specific purpose. Core 1 (middle) dedicated to masses of simple operations, such as controlling the Pikmin in that upcoming game. Core 0 and/or 2 dedicated to very complex but relatively small amounts of code, so handling AI, physics, and similar. Physics I'd have run by core 1 identifying when something happened, then core 0 or 2 actually doing the math. Is there any weight to this theory?


Only core 0 is in use in Wii mode, the clock is lowered and much of the cache is locked.



#181975 [Photo] Wii U GPU Die

Posted by routerbad on 12 March 2013 - 11:59 AM in Wii U Hardware

Don't get too optimistic. Not bashing the Wii U, but it's pretty clear that the PS4 is at least 3x more powerful. Still, more than enough for me considering the 20x power gap between the Wii and the PS3 and 360.

Exactly.  How that translates into gameplay is anyone's guess at this point. Though we know the difference won't be very drastic at all, it still doesn't take away the PS4's best-performer crown.




#173298 [Photo] Wii U GPU Die

Posted by routerbad on 19 February 2013 - 09:50 PM in Wii U Hardware

If it proves to be an E6760, nobody accepts it easily; they don't want to!!.. lol, look at this:

http://www.nintendo....ch-specs/ It says: AMD Radeon™-based High Definition GPU

And look at this:

http://en.wikipedia....i/Radeon_R700 and this http://en.wikipedia....en_(GPU_family) Both RV700 and R800 (the 4xxx and 5xxx GPUs) are NOT AMD branded; they are ATI branded (AMD bought ATI, and the AMD-branded GPUs start from the 6xxx series!!) http://en.wikipedia....ds_(GPU_family)

I pointed that out to GAF, because they all argue for RV7xx... They trolled me on the IGN forums; they don't want to listen about the E6760, they want to believe that the Wii U does not have DX11 hardware features!!! lol, funny but true.

That alone proves that the Wii U is using AMD Radeon-based GPU technology (as Nintendo states), which means the 6xxx series and up. So it may not be 320 SPUs, but 480 instead!

Nobody can say that the Wii U uses an RV700 chip, because that conflicts with the branding... it should say ATI Radeon RV7 instead. It's not even R800 (5xxx GPU), because that is also ATI branding.

Even if AMD bought ATI and they are one company... AMD keeps the ATI branding for the 4xxx and 5xxx GPUs http://www.amd.com/u...on-hd-5000.aspx

But it's not accepted by the reviewers.. hmmmm


It is true they didn't start branding the GPUs with AMD until the 6000 series, but I would assume any GPU produced from that time forward would carry AMD branding regardless of the base family from which it's derived. I believe it is an e6760 as well, though if it were, they have removed two of the SIMD engines: the e6760 has 6 SIMD engines, Latte has 4 (2 SIMD cores = 1 SIMD engine @ 80 ALUs/engine). That's just the programmable shader piece. You also have a crapton of logic all over that GPU that could be fixed shaders, asymmetric shaders, who knows!

Also, it would still be 320 ALUs even if they went with e6760 SIMD engines, because there are two missing.



#171064 [Photo] Wii U GPU Die

Posted by routerbad on 13 February 2013 - 09:57 AM in Wii U Hardware

I've been following this thread here and over at NeoGAF, B3D, and similar discussion at gamerConnect. I am a newbie here and came from the 3DS forums. Great discussion here, and I am glad to see everyone (the majority) keeping an open mind and a level head compared to the other forums I mentioned.

I came across this link on B3D, and this interview with the developers of Need for Speed: Most Wanted for the Wii U really sheds light on the potential power of the system and the uniqueness of the GamePad. I think many naysayers will begin to quiet down if they watch and listen. This port is coming from the PC, not the PS3/Xbox 360, and the developer mentions how they were able to keep better textures, drawing distance, etc. because of the Wii U, unlike the other ported titles.


Definitely interested in this, and I'm impressed with these developers. I've been following the other threads as well, and there are quite a few more Nintendo haters posting in them but there is still some great discussion going on over there. I think this thread here allows us to both react to what's happening on those threads and postulate our own ideas in deciphering this hardware. And we have 3Dude, they don't, lol.



#171802 [Photo] Wii U GPU Die

Posted by routerbad on 15 February 2013 - 11:58 AM in Wii U Hardware

I understand a limited amount of this stuff, but three times the power of a 360 overall IF the devs know how to code the hardware correctly. But seeing as Nintendo are keeping silent about exact hardware specs, it's not easy for the devs, and so third-party games will struggle unless the devs know their stuff.

Anyone, please feel free to correct me if my overview thoughts are way off lol.


It still sports the same Power core instruction set, so games will run on the CPU, but should be optimized for an architecture that uses out of order instruction processing and has different FPU and SIMD characteristics than the PowerXCell PPE and Xenon cores. Those cores were designed specifically for floating point operations and were for all intents and purposes a modified PPC 970.

0%, it's definitely a 750 series. Now more than ever, since we can see it. I do see how the 476fp would fit many of the requirements... though I think it would be even smaller than Espresso if it were.

Espresso needed to be compatible with Broadway for BC, something that can very easily be done to any cx or later vanilla member of the 750 family... but would be considerably more difficult with the 476.


Could they not have added the 750CL instructions to core 0? It was pointed out that if it were a Broadway core shrink, it would be much bigger than it is, especially considering the increased cache. I would think it would be easier to add compatibility to a modern core rather than go through the process of shrinking a defunct core and adding instructions to it.



#167615 [Photo] Wii U GPU Die

Posted by routerbad on 06 February 2013 - 08:06 AM in Wii U Hardware

He may be right about all R6xx SPUs on the chip? 160 ALUs in total?

What do you think about the e6760 SIMDs theory?

He's said he doesn't know, or care to know, a whole lot about the specific graphics tech. The R600 registers would only make sense if they were needed for emulation; there could be some R600-based tech in there, but it isn't this GPU. R6xx was never produced at 40nm and never came close to the TDP necessary for the Wii U. Hollywood, whether based on it or not, performed like an R6xx, and we aren't seeing that level of performance here. It's much more likely that Latte is based on an 80-ALU Evergreen SIMD core, considering what we've seen done on it.



#167461 [Photo] Wii U GPU Die

Posted by routerbad on 05 February 2013 - 08:28 PM in Wii U Hardware

Yeah, I am beginning to think that the Wii U hack is all smoke and mirrors from his recent comments. Especially when he said he could have provided GAF with the details before Chipworks released the photo. So if he had the info all this time, then why all the secrecy until today? He knew people wanted the information, yet he decided to stay quiet. Has anyone else tried to authenticate his claims about the CPU, by chance?

Someone responded on Twitter that GAF members were ignored when they did ask him.

If the Wii U is using Cayman-based SIMD cores, it would be 8*64*2*.55 = 563GFLOPS. Everyone assumes that the SPUs in Latte are based on R7xx VLIW5 cores rather than SIMD. Assuming Nintendo used SIMD cores, we are looking at somewhere between 563-704 GFLOPS. If they went with the older architecture at 40 ALUs per SPU (which I'm reading was unorthodox for VLIW cores), we're looking at 352GFLOPS. Either way, there is still nearly 50% of the die that can't currently be identified, which could be additional asymmetric SIMD or SPU cores, or fixed-function logic.



#169331 [Photo] Wii U GPU Die

Posted by routerbad on 09 February 2013 - 03:25 PM in Wii U Hardware

The weird thing about reading this thread is that it reads like no one has actually seen Wii U games in real life. The Eurogamer article seems right on the money. The Wii U is clearly current-gen performance overall, but with a higher-performance GPU and a lower-performance CPU.

If you are reading this thread and considering a purchase of a Wii U, I would check out real-world performance first. I'm loving the Wii U, but this thread actually reads like people are expecting some massive improvement in Wii U performance and that current Wii U games aren't the real benchmark of what the Wii U can achieve.

Some of that silicon will be the original Wii GPU and 3 megabytes of memory. Maybe the 32 megabytes of fast video memory doubles as the 24 megabytes of main Wii memory.

Dare I mention the real world, where some Wii U games have reduced resolution, weaker frame rates, missing AI, missing physics, reduced sound channels, missing 3D support, missing graphical effects, and I'm sure other things I've forgotten.

Even Nintendo's own titles, Nintendo Land and Mario, are technically weak.

The Wii U is Nintendo's entry into the current gen with regard to performance; in some ways it's stronger and in some ways it's weaker.

I'm sure a big part of the failure of the Wii U so far is its weak specification. We all wish it had a stronger spec, but it doesn't. The fact is, the Wii U has absolutely no chance of competing with the new Xbox and PlayStation on a technical level. It's only just competing with the current-gen consoles.

I strongly suggest people who really want to understand the potential performance of the Wii U look elsewhere on the internet for more balanced viewpoints and always factor in real-world performance of games.


Your argument applies to basically every console launch in the history of gaming save the first gen. That's why everyone is curious about what the system is capable of: because it will still take years for developers to realize and take advantage of its full potential. So no, current titles, built for older hardware or ported from older hardware with completely different performance characteristics, are not a benchmark for what the system can and will do. You talk of potential performance like the conclusion was written before the start.

If you were reading this thread and think we are making a case for or against buying a Wii U, please reread, like, all of it. Can't some people just geek out a little without others taking the entire conversation too seriously from a sales perspective and making sweeping conclusions about the intentions of Nintendo or the fate of the console itself? Whether good or bad, learning about the deep-level hardware characteristics and discussing them is fun and exciting for some people.



#167962 [Photo] Wii U GPU Die

Posted by routerbad on 06 February 2013 - 07:01 PM in Wii U Hardware

@routerbad: Thanks!

This guy was right about everything before, I think he is right now.

It's kind of disappointing to see that Nintendo did reduce the amount of programmable shaders from 400-500 to 350 (1-2 SIMD cores), but maybe they put something more powerful in instead of those cores!

Like he said, Nintendo has always been about predictable quality, so certain things they wanted taken advantage of probably have dedicated logic to pull those functions off the shaders themselves.



#167445 [Photo] Wii U GPU Die

Posted by routerbad on 05 February 2013 - 07:28 PM in Wii U Hardware

Trinity uses a Cayman GPU core: 24 SIMD cores with 64 PEs per SIMD. Clocked at 880MHz, that looks like ~2.7TFLOPS.

If the Wii U uses the same SIMD cores, with 64 PEs per core, it's 8*64*2*.55 = 563GFLOPS.
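The back-of-the-envelope formula above can be written out. This is only a sketch of the thread's arithmetic: the ALU counts and the 550MHz clock are speculation from the discussion, not confirmed Latte specs.

```python
# Peak shader throughput as estimated in the thread: each ALU retires
# 2 FLOPs per cycle (a fused multiply-add), so peak = ALUs x 2 x clock.
def peak_gflops(alus: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
    """Peak single-precision GFLOPS for a given ALU count and clock."""
    return alus * ops_per_cycle * clock_ghz

# 8 SIMD cores x 64 PEs at 550MHz, the Cayman-style guess:
print(round(peak_gflops(8 * 64, 0.55), 1))   # 563.2
# 320 ALUs at 550MHz, the common R7xx-style guess:
print(round(peak_gflops(320, 0.55), 1))      # 352.0
```

The same function reproduces the ~2.7TFLOPS Trinity figure above with `peak_gflops(24 * 64, 0.88)`.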



#168893 [Photo] Wii U GPU Die

Posted by routerbad on 08 February 2013 - 02:00 PM in Wii U Hardware

So are we looking at a highly customized e6760 with fixed function after all?

If so, that would easily equate to 3-6x Xenos.


With everything we do know, and everything we don't, I'm leaning toward e6760-based SIMD engines, though 4, not 6 of them. Obviously my opinion on the matter is next to worthless. I think they may have derived the design from R7xx, but given the level of customization and the costs associated, I think they would have gone with a more modern shader engine than R7xx, along with some baked-in lighting and tessellation hardware.



#168118 [Photo] Wii U GPU Die

Posted by routerbad on 07 February 2013 - 06:59 AM in Wii U Hardware

The DSP is also right next to the high-speed I/Os and Starbuck. Freaking Eurogamer silly pony 'tech analyst'.

What was his name again? Jeff Butter? Whatever his name is, he sure made a fool of himself.

Agreed.



#168114 [Photo] Wii U GPU Die

Posted by routerbad on 07 February 2013 - 06:57 AM in Wii U Hardware

I just want to know if I made the right gamble getting one of these. Can developers potentially port next-gen games with only a few minor compromises, such as resolution and frame rate (I don't care much for 1080p or frame rate)?

If the answer is "yes, so long as publishers and developers don't get all pissy about doing that," then I'm fine :) I just don't want it to be another Wii situation where publishers can't be bothered because it means paying another developer to use a separate engine to make another version of the game for Nintendo....

We don't know yet whether it will support UE4, though I suspect it will. I believe Frostbite 2 and CryEngine 3 are already said to be supported on Wii U. Buying a console isn't a gamble, it's an investment. Spending $350-odd is putting faith in Nintendo to deliver a worthwhile entertainment experience. And it's not like you can't own multiple consoles. It should handle "next gen" ports just fine, but I'd rather see multiplats that are built and optimized for Wii U.



#190394 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 05 April 2013 - 06:58 AM in Wii U Hardware

It just struck me as strange that a company like Nintendo would throw all their RAM around the place and have it on a single bus.

I'm not good with maths though.

But it is on a single bus.  That's what we were trying to disprove from the Anandtech article, because that guy seems to believe that locking down to 1GB of RAM means that you are segregating the chips onto separate buses, each running at half speed.  Since we can show that all of the lanes from MEM2 are going directly into the GPU, we know that there is only one memory controller and everything is on one faster bus, which would allow it to benefit from bank and component interleave, as well as dual channel.

 

Then we went and showed that Anandtech's numbers were just... off.




#193518 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 12 April 2013 - 06:57 AM in Wii U Hardware

I think the ARM core is more about networking-related services, although it remains to be seen whether the ARM or the AMD chip does the video encoding, seeing as both can come with the necessary hardware encoding support.  I suspect the AMD, though, as it has hardware assist in there as standard, so why add the complexity of finding a way to give the ARM access to the frame buffer when the AMD can already do that.

 

A good point about that 15-minute buffer, but seeing as there are PVRs using 2.5" HDDs, I don't see why we should discount the HDD being used for it.  That said, I see no reason why they couldn't fit the entire background OS and 15 minutes of encoded footage into 1GB of RAM.  It's one of the reasons the Wii U OS frustrates me so much, as it should be lightning fast with all that RAM reserved for it.

Using the hard drive to store encoded media is fine, but it wouldn't be as instantly accessible as they touted; there would be a slight wait for it.  The problem is that when you are using a RAM scratch disk, which they would almost have to be, it wouldn't normally transfer that information out to disk until the entire encoding operation is done.  So it would keep 15 minutes of encoded media, along with the RAM that is used to actually do the encoding, and it would constantly be updating it and removing stale (past 15 minutes) A/V data.
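A rough feasibility check of the "fits in 1GB" claim: at an assumed encode bitrate (the 5 Mbps figure below is purely illustrative; nothing in the thread confirms it), 15 minutes of footage fits comfortably in the reserved RAM.

```python
# Size of a rolling 15-minute encoded A/V buffer at an assumed bitrate.
bitrate_mbps = 5                              # assumed encoder output, not a known spec
minutes = 15
buffer_mb = bitrate_mbps * 60 * minutes / 8   # Mbit -> MB
print(buffer_mb)                              # 562.5
```

So even before counting the encoder's working memory, the buffer itself would fit in 1GB with room to spare.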



Yeah, the memory controller has been a main point of interest for me for a while now. It appears to be integrated into the MCM or even the GPU, which would grant a considerable efficiency boost.

Integrated onto the GPU would be ridiculously efficient.  I think it may sit right outside, but on the MCM; there's a block of 8 transistors on the edge where the RAM comes into the MCM, with what looks to be an I/O block or termination point.




#190389 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 05 April 2013 - 06:41 AM in Wii U Hardware

MT41K256M16HA-125

is the Micron RAM found in another teardown.

It's wired 32 megs (Mb, of course) a pop by x16, 8 banks inside each 4Gb chip.

So, good.



Also affixed on a 96-FBGA, 10 x 14mm in size, that Samsung on its 84-FBGA is looking completely like the odd man out.

Huh.

Found a Samsung DRAM brochure

with that RAM's nomenclature, with different specifications more in line with the others than that Samsung RAM overview page:

K4W4G1646B-HC(12/11/1A). 96-FBGA.

Could be a misprinted label.
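The organization above works out to the expected capacity. A sketch, taking the 256M x16, 8-bank figures from the Micron part number at face value:

```python
# MT41K256M16: 256M addresses x 16 bits, organized as 8 banks of 32M x16.
MEG = 2 ** 20
banks = 8
addresses_per_bank = 32 * MEG    # 32M
width_bits = 16

chip_bits = banks * addresses_per_bank * width_bits
chip_mbytes = chip_bits // 8 // MEG
print(chip_mbytes)               # 512  (i.e. a 4Gb chip)
print(4 * chip_mbytes / 1024)    # 2.0  GB across four such chips
```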




#191685 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 08 April 2013 - 02:45 PM in Wii U Hardware

You're treating the eDRAM like it's a cache only; it's not.  The fact that the GPU can keep the frame buffer and z-buffer, and handle AA, all in the eDRAM makes a huge impact on performance.  Take for example a PC game: turn the textures from low to high and see how it affects performance.  Now take that same game and bump from 720p to 1080p.  I can almost guarantee you that the resolution increase impacted performance quite a bit more than the texture quality increase.  Now start adding AA and watch your performance start to drop rapidly.  High-quality assets aren't your biggest bandwidth hogs, and I get the idea that you think they are.  You could use high-end PC assets in a game, and if you're rendering it in 480p with no AA, your memory bandwidth requirements won't be very high.

 

You have to remember that file size and bandwidth are totally separate.  For example, a 1MB file stored in RAM can eat up more bandwidth than a 10MB file.  If that 1MB file is fetched 15 times per frame, and the 10MB file fetched only once, then the 1MB file is the bigger bandwidth hog.

 

It's why I wanted to find a modern graphics card that has just 12.8GB/s of video memory bandwidth.  Guys are using the AMD HD5450 and getting games like Need for Speed: Most Wanted to run at 720p around 25fps, and 600p at 35+ fps.  This is on PC, not optimized, and the 12.8GB/s is all the graphics card has, unlike the Wii U, where the eDRAM spares the main memory so much of the bandwidth.  Running Wii U quality settings might drop it to something like 10fps, but that's the point: the eDRAM is the difference maker.
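The fetch-frequency point can be made concrete. The sizes and per-frame fetch counts below are invented for illustration; only the principle (bandwidth = size x fetches x frame rate) comes from the post.

```python
# Bandwidth consumed by an asset depends on how often it is fetched,
# not just on how large it is.
def bandwidth_gb_s(size_mb: float, fetches_per_frame: int, fps: int = 60) -> float:
    return size_mb * fetches_per_frame * fps / 1024

small_hot  = bandwidth_gb_s(1.0, 15)   # 1MB file touched 15x per frame
large_cold = bandwidth_gb_s(10.0, 1)   # 10MB file touched once per frame
print(round(small_hot, 2), round(large_cold, 2))   # 0.88 0.59
assert small_hot > large_cold          # the small file is the bigger hog
```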

I understand all of this; I was simply wondering what the interface between the eDRAM and main RAM allowed for.

 

For 3Dude: the latency on these chips is straight 9s, pretty much standard latency figures for 1600 (latency times around 10ns).




#189701 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 03 April 2013 - 12:12 PM in Wii U Hardware

But looking at charts for DDR3-1600, the memory clock is actually 200MHz, with the bus clock being 800MHz, and then double data rate gives the 1600.  So they are already giving you the 4x multiplier when you read the 800MHz.

The memory clock hasn't really changed from generation to generation of DRAM: smaller processes (DDR3 started at 90nm) but largely the same memory arrays.  The bus is the main determining factor in throughput in an "all things being equal" situation.  All things being equal, DDR3 has exactly twice the throughput of DDR2, and by extension GDDR3.  GDDR3 gets a little better throughput by bumping the bus clock significantly, but has to wait for memory refresh much more.
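The clock relationships above, made explicit. The 200MHz array clock and 4x I/O multiplier are standard DDR3-1600 figures; the two bus widths are just the cases argued about in this thread, not confirmed Wii U numbers.

```python
# DDR3: array clock -> 4x I/O bus clock -> 2 transfers per bus cycle.
def ddr3_bandwidth_gb_s(array_clock_mhz: int, bus_width_bits: int) -> float:
    transfers_per_s_m = array_clock_mhz * 4 * 2   # MT/s: 200 -> 800 -> 1600
    return transfers_per_s_m * bus_width_bits / 8 / 1000

print(ddr3_bandwidth_gb_s(200, 64))    # 12.8  GB/s on a 64-bit bus
print(ddr3_bandwidth_gb_s(200, 128))   # 25.6  GB/s on a 128-bit bus
```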




#185574 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 23 March 2013 - 09:36 AM in Wii U Hardware


Plutonas, on 22 Mar 2013 - 13:24, said: Nintendo's problem may be not the specs, but the lifespan they've got planned for their consoles.  They kill their consoles very fast!
An example: the PS3 and 360 are still selling and getting games, correct?  Going into their 8th-9th year!!... Look at the games back in the 5th year and compare them with the 9th...  That means Nintendo doesn't allow developers to look into the console, to progress their knowledge and advance their games.
But I remember what Miyamoto said: games are not art, they are just products.. that means they have a different ideology about things... they complete their cycle with games like 1-2 Marios, 1-2 Zeldas, and 1 Pikmin, and we move to the next... just like that.. lol. I don't like that about Nintendo; I'd prefer a longer lifespan... NOT 8-9 years, but at least 7!!  (The Wii stayed alive for 6 years; they made progress.)
Well.. the GC was almost the same as the Wii... That was kind of familiar for developers, but very bad for Nintendo's customers... They had to go through 2 consoles to be able to drain all the resources from the GC tech. Remember how Xenoblade looked? Amazing...
Here is a nice suggestion for a mass campaign Nintendo fans must mount and send over to Nintendo: NOT to kill the Wii U in its 5th year, but extend its life; it's good for our pockets too.
PS: I delayed buying my Wii U because of its price; they sell it here extremely expensive... Now I wonder if I have to wait for the Wii U 2... lol


Every console generation in gaming history has on average been 5 years.

This generation is the only one in history to go on this long, because the business model is broken, innovation is completely stagnant, game publishers own the media, and we keep getting the same games over and over and over and over and over again.

There is NOTHING good about this generation. It is the most disgusting generation of gaming I have ever seen, and it's been dragged on YEARS past when it should have ended.

TOO.  MANY.  SHOOTERS.

 

This console generation should have ended three years ago.  It has gone on for way too long, and even with the next generation we can see mostly uninspired hardware design from Sony and Microsoft.  They are trying to make their consoles as PC-like as possible while adding kitschy social features that do nothing but detract from gameplay.

 

I am glad this generation is finally ending, and if all we get from Sony, Microsoft, EA, Activision, and the other big-name publishers is shooters, I'll just stick with Nintendo.




#188605 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 31 March 2013 - 11:58 AM in Wii U Hardware


routerbad, on 31 Mar 2013 - 05:54, said: I don't think they would separate the bus on the silicon. RAM access through the memory controller would be 64-bit per chip, but the software doesn't know that; the memory controller is handling 2GB of DDR3 on a 256-bit bus and is able to use all of that speed to load/unload RAM, unless they have two chips on one controller and two on another.  It doesn't have to partition RAM based on the chips themselves, and it doesn't have to place data sequentially within the chips either; it's more efficient if it doesn't.
The controller and CPU/GPU would access RAM indiscriminate of the hardware separation.  It recognizes 2GB and addresses it all the same, and unless they lock the games to using only specific address tables, it's up to the memory controller to make the big decisions each cycle based on shortest wait.

You haven't seen the motherboard?

It's not certain, but it's heavily looking like the RAM is separated into groups of 2 chips, each with a 128-bit bus, leading separately into different sides of the MCM.

Certainly not set in stone, but a very solid foundation. Though I wouldn't be one to complain if Nintendo starts allocating some of that untouched gig of RAM to the game side.

Regardless... the 12.8 misinformation, the topic at hand, has hereby been officially debunked.

It's 25.6.

Easily 25.6.  I saw the motherboard, but I may need to take a closer look; on the back of the mobo, the RAM leads all seemed to head into the same side of the MCM, but I may be wrong.  It would make little sense to use two separate memory controllers.




#189835 Wii U's RAM is slower than PS3/Xbox 360.

Posted by routerbad on 03 April 2013 - 03:32 PM in Wii U Hardware

If anything, wouldn't the GPU having so many pins/lanes debunk the idea of 16-bit right away?  The GPU is custom, so there would be no point in having more memory lanes at the GPU than required.

The fact that it's DDR3 should have debunked it right away.  They applied a 16-bit width per chip, which isn't the case in a dual-die package.  They should have assumed 32-bit per chip, or counted the lanes, which is normal for graphics applications.
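The 16-bit-versus-32-bit dispute in numbers. The chip count and the 1600 MT/s data rate come from the teardown discussion; which per-chip width is correct is exactly what the thread is arguing about.

```python
# Total bus width (and so bandwidth) under each per-chip width assumption.
def total_bandwidth_gb_s(chips: int, bits_per_chip: int, mt_s: int = 1600) -> float:
    return mt_s * chips * bits_per_chip / 8 / 1000

print(total_bandwidth_gb_s(4, 16))   # 12.8  (the Anandtech assumption)
print(total_bandwidth_gb_s(4, 32))   # 25.6  (what this thread argues for)
```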




Goodtwin, on 03 Apr 2013 - 09:11, said: If anything, wouldn't the GPU having so many pins/lanes debunk the idea of 16-bit right away?  The GPU is custom, so there would be no point in having more memory lanes at the GPU than required.



------------------------
Exactly.


routerbad, on 03 Apr 2013 - 09:07, said: Just did another count: 158 pins, 79 per side.  Unless I'm counting some of them wrong; a few of them are hard to make out.


That GDDR3 64-bit card has 63 pins.

I think we've had a breakthrough. I thought those lanes were abnormally large.

158 pins... a 158-bit bus????

(fixed, lmao)

800 x 2 x 158 = 252,800 Mb/s; /8 = 31,600 MB/s

31.6GB/s?

I'm good with that.  What's going into the GPU is going to tell the whole story with regard to the bus width.


Interesting; maybe we're missing 1 pin per channel here.
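The pin-count arithmetic from the exchange above, written out. The 158 figure is the raw count from the die shot; 160 is the "one missing pin per channel" speculation, not a measurement.

```python
# Bandwidth from a traced bit width: bus clock x 2 (DDR) x width, Mb/s -> GB/s.
def gb_s_from_pins(bus_mhz: int, traced_bits: int) -> float:
    return bus_mhz * 2 * traced_bits / 8 / 1000

print(gb_s_from_pins(800, 158))   # 31.6  as computed in the post
print(gb_s_from_pins(800, 160))   # 32.0  if one pin per channel was missed
```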




