
Wii U will be N's first completely new hardware since Gamecube [TECH-DISCUSSION]


23 replies to this topic

#1 parallaxscroll


    Spear Guy

  • Members
  • 95 posts

Posted 21 September 2012 - 11:56 PM

The last time Nintendo had a truly new hardware architecture was the Dolphin (Gamecube), which was fully revealed in August 2000 at Spaceworld. I think all of us understand that the Wii was just an overclocked Gamecube with more RAM. The Gamecube architecture was designed in the late 1990s by ArtX (later acquired by ATI) and IBM. The Wii U will be Nintendo's first truly new console architecture since the Gamecube.

So in this thread we will look back with videos & articles that I will post. Feel free to post what you find, too.
(my apologies for the VERY long post, but I wanted to get so much in)

To start here are some videos from Spaceworld 2000.

Some of the tech demos are real-time, while others were pre-rendered CG/FMV.

montage


128 Marios



Here's an outstanding IGN interview with ATI's Greg Buchner on Gamecube's Flipper GPU:

ATI Discusses GameCube Graphics

Covering white paper design to final silicon, we interview ATI's Greg Buchner about GameCube.



by IGN Staff
OCTOBER 29, 2001







At Nintendo® Space World 2001, IGNcube had a chance to sit down with Greg Buchner from ATI to talk about the design and evolution of GameCube's graphics chip, code-named Flipper. In the lengthy, technical interview Greg reveals much about the decisions that were made and what we can expect from the console in terms of visuals in the future. Get a closer look at the formation of Flipper in this in-depth interview.
IGNcube: Can you discuss your position at ATI and how you became involved with Nintendo® and the design of the Flipper graphics chip?
Greg Buchner:
So, going back in history, in 1997 a lot of people left SGI (Silicon Graphics Inc.), which wasn't doing well, so a bunch of us started ArtX and we aimed at doing graphics in the PC space. In early '98 we started talking to Nintendo® about being their provider for the graphics and system logic for what has become GameCube. At ArtX I was vice president of engineering and part of the founding team of ArtX.
In April of last year we joined ATI through an acquisition, which ATI made as a way to get into the [home console] space and as a way to get another graphics development team working in the integrated graphics PC space. So, through the acquisition I've maintained a similar role for the team at Santa Clara; ATI already had a team at Santa Clara, plus the addition of the ArtX team. More recently I'm operating in a more technical role, giving advice on how we build chips.
IGNcube: What is your official title now?
Greg Buchner:
I'm vice president of engineering, ATI.
IGNcube: You say you began talking to Nintendo® in 1998. So from white paper designs and initial design to final mass production silicon how long was the development process?
Greg Buchner:
Well, there was a period of time where we were in the brainstorming phase, figuring out what to build, what's the right thing to create. We spent a reasonable amount of time on that, a really big chunk of 1998 was spent doing that, figuring out just what [Flipper] was going to be. In 1999 we pretty much cranked out the gates, cranked out the silicon and produced the first part. In 2000 we got it ready for production, so what you saw at Space World last year was basically what became final silicon.
We've probably tweaked it a bunch since then and even [after the September 14 Japan launch] other versions are being tweaked. It will forever be in a cost-reduction mode, so to say there is final silicon is something that doesn't really happen because these products live for so long. All the tweaking is for costs; everything for the last six months or even more than that has been related to getting the cost down. So over time, you know it's debuting at $199 and obviously that's not the end game. We want to keep pushing the price lower and lower. So we'll continue to help NEC and their cost-reduction efforts.



IGNcube: Can you describe how the brainstorming process worked with Nintendo®? Did you approach them with ideas, did they come to you? How did the relationship work?
Greg Buchner:
It was kind of back and forth. A lot of us worked on the N64 and at that point they certainly didn't have any 3D graphics knowledge in the company. For this round, some of the team that was at SGI that worked on the N64 is now at Nintendo®. So the group up in Redmond at NTD, Howard Cheng and Rob Moore's group -- both of which worked at SGI on the N64 -- had some expertise through hiring people on the 3D side of things. They were really the link into the developers, because the whole theme of this product early on was targeting the developers, they are really the customer for us not those that played the games.
So Howard's team was really that bridge into the developers' mind. Everything was really a collaboration between our team and Howard's team -- some of our discussions were friendly and some were "passionate" about what the right thing to do was. At the end of the day there was always the theme of the developers as well as cost. Anything we discussed typically had trade-off with cost. You can go do almost anything, but everything comes at a price. It's about figuring out what's the right thing to do at this point in time, so that was a big part of the collaboration.
IGNcube: You talked about having a vision for the chip and we've heard a lot about it being developer friendly. Was there a specific mantra that the team had? In a few sentences if you could describe what the main goal of the chip was, what would that be?
Greg Buchner:
There are so many pieces that factor into the decision. There's looking at the developer, looking at the development process, and making them as efficient as possible so they can make their money. The more money they make the more successful the overall products are going to be and the more successful from a selfish-intent point of view we're going to be from the royalty stream. The lower we can make the cost of the system, the more it will open up for a broader base of consumers that can buy it. So that was a very important thing.
Predicting what the technology is going to allow us to do. So you have to look into a crystal ball and figure out what's going to be fashionable and important, what's going to allow the Miyamoto-sans of the world to develop the best games. So, again, it's kind of taking your best guess at it. These are the kind of things you have to get, put them into a jar and shake them up and they all become very important to deciding what the product is going to be. You could add more and we'd be sitting at the same price point as the PlayStation 2, but I think we're already better than that at the price point we aimed at. I feel like it was the right combination of things.

There is only one Miyamoto-san in the world.

IGNcube: When did the decision for the sound chip come in? Was that there from the beginning that it was going to be integrated on the graphics chip?
Greg Buchner:
Certainly not from day one, but within the first six months we had already picked that direction. So by the middle of 1998 it was known, and a partner was chosen for that. The performance level of it was probably tweaked a little bit over time. The interface to memory was the thing we changed over time. The idea of the A-RAM was something that evolved probably in 1999. Originally there was something else that was there.
There was another memory out there for something else, and we figured out a better way to go from a cost point of view: instead of having two memories that were partially used, we would have one that was more fully used. So there were some structural changes to the system [later on], but from a basic 10,000-ft (Editor's note: Greg is referring to a very general, non-specific view) block diagram you could say by the middle of 1998 that decision was made.
IGNcube: With the embedded RAM, was that a decision from the very beginning or was that added at a later date?
Greg Buchner:
That was actually one of the "passionate" arguments, because making that step there's a huge benefit to system performance but there's also the addition of risk and cost. Nothing in life comes for free. It's one of those things [when decided] it changes what we want to do from a technology partnership with NEC, what kind of process we need, from tools, and it brings a new partner, MoSys, into the mix. It limits the choices of silicon providers because there's not many people who can do something like that. In fact very few people can do what NEC has done with this. They've done a phenomenal job.
So that was a decision where we said from a practical point of view, "Do we want to do this?" and just had a very rational discussion on pros and cons. In the end clearly we get a huge benefit. Not only from the embedded DRAM, but from how we structured it. One of the other products out there has embedded DRAM, but arguably they're not getting all the bang for the buck. They've got the cost in the silicon from a process point of view, but as for the performance in memory I don't think they have what we have -- or anything close to it.
IGNcube: Seeing what developers are doing with the chip right now, would you say that the team made the right decision?
Greg Buchner:
Yes. I think it's worked out well. It took a lot of very hard work to get where we are, but I think it was definitely worth it.


IGNcube: Over half of the chip is embedded RAM, right?
Greg Buchner:
On the version that shipped at launch it's on the order of a third. From a transistor point of view it's about half, but because it's a very regular structure it is very, very dense. So that half of the transistors results in a much smaller area; from an area point of view it's actually a little less than a third.
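The transistor-share versus area-share gap Greg describes falls out of simple arithmetic: eDRAM cells are far denser than logic, so half the transistors can occupy under a third of the die. The density ratio below is hypothetical, chosen only to illustrate the effect.

```python
# Sketch of the eDRAM die-area arithmetic Greg describes. The density
# ratio (how many times denser eDRAM is than logic, per transistor) is
# a hypothetical figure for illustration, not Flipper's actual number.
def area_share(edram_frac_transistors, density_ratio):
    """Die-area fraction of eDRAM, given its transistor fraction and
    its density advantage over logic."""
    logic_area = 1.0 - edram_frac_transistors
    edram_area = edram_frac_transistors / density_ratio
    return edram_area / (edram_area + logic_area)

# Half the transistors at 2.5x density:
print(round(area_share(0.5, 2.5), 3))  # 0.286 -- a bit under a third of the die
```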
IGNcube: Is transform performance one of the big fights you had with using eDRAM, which takes up space?
Greg Buchner:
That actually wasn't an issue. Those two are very separate discussions. You look at the embedded DRAM and it's going to be for performance on the fill rate side, and that's a cost trade-off. To get that kind of bandwidth with an external device, forget it, you're not going to even come close. So there's a huge benefit we get by having it.
Transform is a separate topic almost: how much do you shoot for, what's important, what are the typical cases with what developers are doing. Not many people send down triangles of the same color and never change anything else. It's these kinds of fake benchmarks that are irrelevant. They're not representative of data that ever shows up in a game, so what's the point in measuring them? So what we went after is what's really happening in a game, what's really happening from a content creation point of view. We optimized around what the data patterns looked like and made a machine that screams for those kinds of patterns.
IGNcube: If you had to pick one main feature on the chip that you thought as most important or most impressive what would it be?
Greg Buchner:
Hmm, there's a lot of them. [Laughs] From an overall, machine architecture point of view it is a very, very clean architecture. So, again, back to the 10,000-ft level it's a sweet machine, it's just so clean and there aren't a lot of quirky behaviors. There are very few things that I would put in the quirky behavior category. So it's allowed the developers to go focus on making the games. From a raw feature point of view, in the texture area, the texture combining, what we can do with textures, how one texture can manipulate another texture...in the area of textures I think we've greatly extended what people can do and the effects that they can create. So that's something over time I think you're going to see continuing improvements as developers say, "Hey, I've got this incredible toolbox now for assembling things" I don't think that's an area that's been tapped yet. So, I think over time you're going to see better feature effects coming out.
IGNcube: So you think textures will be especially impressive?
Greg Buchner:
I think that's one area where we've made a big leap in algorithms, or the potential algorithms. That's not diminishing any other area; everything about the chip is wonderful. But if you ask me to pick one area, I'll choose [textures].
Tune in tomorrow for Part 2 of the interview where Greg expands on the texture functionality of Flipper, the alteration of the clock speed, Zelda's cartoon-style cel-shading, and more!



http://www.ign.com/a...mecube-graphics


ATI Discusses GameCube Graphics (Part 2)

Wrapping up our discussion we cover Flipper's texturing abilities, the new-look Zelda, and more.




by IGN Staff
OCTOBER 31, 2001







In the final part of our ATI interview, Greg Buchner discusses Flipper's texturing abilities and the alteration of the clock speed, and expands on just how hard it is to turn The Legend of Zelda into a cartoon. Mr. Buchner also comments on the future of ATI and Nintendo®, hinting towards a future partnership. Read all about it in this in-depth interview.
IGNcube: Can you talk a little bit about the texture processing power? We know you have your secrets, but for instance on the GameCube spec sheet you list eight simultaneous textures as a feature. Can you explain how that works?
Greg Buchner:
The magic, I would say, is not in the fact that there are eight of them, but how the eight go together. How the eight can interact with each other, how one texture can modify another texture, or how you blend them together -- so that's the real magic. The fact that you can do one texture and then do a second, well, whoop-de-doo. And a third one, or even twenty of them -- that's really just silicon. How you put them together, the intelligence of putting them together, and how you expose that in an API so developers can properly control it, that's really the magic.
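The combining Greg describes can be sketched as a fold over blend stages, where each stage modifies the running result rather than just adding another independent texture. This is an illustrative toy, not Flipper's actual TEV hardware or API; the operations and values are made up for the example.

```python
# Illustrative sketch (NOT Flipper's real texture-environment API) of
# why the magic is in how stages combine, not in the stage count: each
# stage blends its texture sample with the running result, so one
# texture can scale or modulate another.
def combine_stages(stages, base=1.0):
    """Fold a list of (op, sample) texture stages into a final value."""
    result = base
    for op, sample in stages:
        if op == "modulate":   # one texture scaling another
            result *= sample
        elif op == "add":      # e.g. an emissive detail pass
            result += sample
    return result

# A lightmap modulating the base color, then an additive detail pass:
print(combine_stages([("modulate", 0.5), ("add", 0.25)]))  # 0.75
```

The point of the sketch is that reordering or re-wiring the same eight samples produces different results, which is where the expressive power comes from.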
IGNcube: These eight simultaneous textures, how much does it hurt performance to utilize them as you move up in numbers?
Greg Buchner:
Look at the applications.
IGNcube: Such as Rogue Squadron II? Yea, we can see that's doing a lot. Any specifics you can talk about?
Greg Buchner:
I won't give out any numbers. I'll let Nintendo® do that.


IGNcube: When the clock speed changed around last year's E3 you said it was to balance the system out. We were surprised to see you took back the speed of the graphics chip because it is so powerful. What would you say is the difference in power between the former 200 MHz chip and now the 162 MHz chip balanced with this newer, more powerful CPU?
Greg Buchner:
Going back to an application point of view, having more of a balance between the two is going to make the game better at the end of the day. Having one piece of the system have much, much more performance than another, well maybe from a benchmark or contrived case point of view you might get something for it, but if you have one weak link in a chain, you still have a weak link. I'm not saying that the CPU was a weak link before, but having more balance there made a lot of sense. And due to some ratio issues there, the ratio leads to having these kinds of fixed multiples. We wanted to get more CPU [power] and to get that number we couldn't find the right multiple there, so we had to actually drop [Flipper's speed] to make them line up.
IGNcube: Why is it set to such certain multiples?
Greg Buchner:
It's integer multiples of each other. You've got a 3:1 ratio.
IGNcube: Is that because of the whole architecture of the system that you can't do any other multiple variations?
Greg Buchner:
It's a very common thing for a CPU interface. So if you look at the PC world it's the same thing, you've got the front-side bus that has, you know, a certain clock rate and there's multiples of that to the CPU. Right now it's kind of in 50 MHz multiples on the Intel side. They had some busses that were 33 MHz or 66 MHz intervals. So you saw these weird steppings, like you saw a processor that was a 733 MHz and then when they went to Pentium III it was suddenly 500, 550, 600, etc. So there are these ratios, and basically what's happening is you're trying to have a bus that's synchronous between two devices and you need to run at a fixed multiple. If you made them asynchronous, then what you introduce is latency. If we had started day one and said we wanted to have a different set of multiples, we probably could have worked with IBM and we might have come up with something different that had finer-grain multiples. But as developers got their hands on the system, seeing what they could do with it, we thought, you know, here's a nice change to make. At that time, with what was in our toolbox, that was a choice we could make.
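The fixed-multiple constraint Greg describes can be sketched numerically: with a synchronous bus, the CPU clock must be an integer multiple of the bus/graphics clock, so only a coarse ladder of CPU speeds is reachable. The figures below are the commonly quoted GameCube clocks (162 MHz Flipper, roughly 3x that for Gekko).

```python
# Sketch of the synchronous-bus constraint: valid CPU clocks are
# integer multiples of the bus/GPU clock. 162 MHz is the commonly
# quoted Flipper clock after the adjustment Greg discusses.
def candidate_cpu_clocks(bus_mhz, multiples=range(1, 5)):
    """CPU clocks reachable as integer multiples of the bus clock."""
    return [bus_mhz * m for m in multiples]

flipper_mhz = 162
print(candidate_cpu_clocks(flipper_mhz))  # [162, 324, 486, 648]
# GameCube ships on the 3:1 rung: a ~486 MHz Gekko against a 162 MHz bus.
# With a 200 MHz Flipper, the nearest rungs would have been 400 or 600 MHz,
# which is why lowering Flipper was needed to "line up" the ratio.
```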
IGNcube: In the future will any of Flipper's technology be integrated into ATI chips?
Greg Buchner:
The team that worked on Flipper has been working on products in the PC space for quite some time now. What you saw at [last year's Space World] was basically the "final silicon," so that team's been off for over a year now doing something in the PC space. So, yes you'll start seeing technologies, ideas, but you're not going to see an exact piece of that chip coming out and being used in the PC space. PC APIs are different, the requirements are different. A lot of the concepts, a lot of the things we created are ATI property, ATI owns the ideas and we're going to exploit those in other areas. So, stay tuned there.
IGNcube: The CPU interface we've heard a lot about. At the recent Embedded Processor Forum they had a shot of Luigi's lit face from Luigi's Mansion when they were talking about the CPU. Saying that the CPU was used for the really up-close lighting. Could you describe the relationship between the CPU and graphics chip and how important it is they work together?
Greg Buchner:
What we've done in Flipper is optimize the things that are happening a lot. Things that are common to do in a game, common to do in an application, and that really need high performance -- you move those into dedicated silicon. You're basically running a program in the dedicated silicon. Things that you only occasionally need to do, you may not want to dedicate a lot of gates to, which adds to the cost, so you want a very good general-purpose CPU that can run an algorithm. You might have some really cool special effect in lighting, for example, that you don't have dedicated gates for, but you've now got this powerful CPU. So for those sets of triangles you might run some lighting algorithm on the CPU, and that might have been what was referred to [at the Embedded Processor Forum].
IGNcube: How directly are the Gekko CPU and Flipper graphics chip linked?
Greg Buchner:
Looking at a block diagram, Flipper is the center of the universe. Everything else connects to Flipper. So the CPU plugs directly into Flipper, main memory plugs directly into Flipper, controllers plug into Flipper, flash cards plug into Flipper, the expansion slots for the modem or broadband adapter plug right into Flipper, the digital video out plugs right into Flipper. The only thing that doesn't plug into Flipper is a set of DACs (digital analog converters) for the TV out and the audio. So that communication between Flipper and Gekko is very important.

That's one of the areas where we went in and made some changes to the standard PowerPC bus to improve the performance. It's still very much in the spirit of the standard PowerPC bus, but we added some enhancements. You get in some cases much higher bandwidth than you could normally get. Usually you can take the clock rate times the number of bytes, and that's your not-to-exceed number. Rarely does anything allow you to get that number. What we've done is made it so that in some cases we can get very, very close to peak. What you typically get is within 50-75% of peak for nice data patterns.
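The "not-to-exceed" arithmetic Greg gives is just clock rate times bytes per transfer. As a worked example, assuming a 64-bit (8-byte) bus at 162 MHz -- the commonly cited figures for the Gekko-Flipper link:

```python
# Peak bus bandwidth the way Greg computes it: clock rate times bytes
# per transfer. A 64-bit bus at 162 MHz is assumed here; these are the
# commonly cited Gekko-Flipper figures, not official numbers.
def peak_bandwidth_mb_s(clock_mhz, bus_bytes):
    """Theoretical peak in MB/s (MHz * bytes = 10^6 bytes/s)."""
    return clock_mhz * bus_bytes

peak = peak_bandwidth_mb_s(162, 8)
print(peak)                      # 1296 MB/s, i.e. ~1.3 GB/s peak
print(0.50 * peak, 0.75 * peak)  # 648.0 972.0 -- the 50-75% "typical" range
```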
IGNcube: So when you say "peak" on the spec sheet, you're getting pretty close to it?
Greg Buchner:
In certain cases. There's a pattern you can get which gets you very close to peak. There's another set of things that you're not going to get to peak just because of the way that the protocol is set up. But it's still better than the other two guys that are out there. It's still a very good bus.
IGNcube: How close did you work with IBM and when did that relationship start happening?
Greg Buchner:
In the summer of '98 we had what we call the Beauty Contest to look at the processors that are out there that we'd want to use as the basis for GameCube. We looked at a lot of different choices and went through a lot of different things and made many comparisons. And by combinations of raw performance we chose what would be a good basis for building the system. We knew we weren't going to find something that was perfect for what we wanted, so we figured we needed a team that we could trust to go do some changes, could deliver on a schedule, and also deliver on a cost point.
We narrowed it down to a couple of choices fairly quickly, did a lot of detailed analysis and in the end this was the right choice. Then the business relationship had to work out between IBM and Nintendo® at that point. Once that settled then we said, okay, it's a good baseline and part of the business discussion was that IBM was going to have to make some changes. Even though they thought it was a good general purpose CPU, and it is, there's some things that it's missing to do good gaming. So we had them add those things. That was a very close relationship between all three parties, because Howard Cheng was specifying some of the changes, we were specifying some of the changes, working through the changes, and certainly all the bus changes had to be very tightly coupled. So there were daily phone calls for quite a while, frequent meetings, and things like that. It was a good relationship, and we ended up hiring one of the guys from IBM's teams.
IGNcube: The new cartoon look of Zelda is amazing. We saw the cel-shading demonstration at last year's Space World, but it was a relatively quick demonstration. It was great to see that it could be used so easily, but Zelda is so far beyond that. Maybe you could describe what's going on there? We think a lot of people underestimate the technology. From a technological standpoint what would you say it's doing?
Greg Buchner:
[Jokingly] Well if it wasn't such a big deal, people would have done it a long time ago. It's not easy. It took a lot of work. There's some extra stuff we did in order to enable that. There's some things we put in the camp of inventions and there's some patents that have been applied for in that area. It's not an easy thing to do, and even though in the end, oh gee, you're creating sort of a 2D surface instead of 3D that should be easier, but it's not. Finding the contours, finding where the edges go is a very difficult problem.
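For a sense of what cel-shading involves at its simplest, a common generic approach (not the patented GameCube technique Greg alludes to) is to quantize the diffuse lighting term through a small ramp so surfaces fall into flat cartoon bands. The hard problem he highlights, finding contours and edges, is separate and not shown here.

```python
# Generic cel-shading sketch (NOT Nintendo's patented method): quantize
# the diffuse term N.L through a small ramp so shading collapses into
# flat bands, giving the cartoon look. Ramp values are arbitrary.
def toon_shade(n_dot_l, ramp=(0.2, 0.6, 1.0)):
    """Map a diffuse term in [0, 1] onto discrete shading bands."""
    clamped = max(n_dot_l, 0.0)
    band = min(int(clamped * len(ramp)), len(ramp) - 1)
    return ramp[band]

# Dark, mid, and bright facings snap to three flat tones:
print([toon_shade(x) for x in (0.1, 0.5, 0.95)])  # [0.2, 0.6, 1.0]
```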
IGNcube: Would you say that the other two players on the marketplace could do that kind of thing?
Greg Buchner:
If they didn't focus on it, I... Well, it's not something that falls out for free. I don't know what either one of them has done, but I haven't seen any evidence from either one of them, so I'd say we're probably unique in that regard.

IGNcube: Do you have any favorite games, speaking from a technical standpoint? You know what Flipper can do, do you see any that are pushing it a lot faster than you thought it could be pushed?
Greg Buchner:
Tough question because I don't want to single out anyone. Um, I guess overall there's been a couple things that I've seen where you kind of get that chill down your spine and say, this is some really great stuff, impressive stuff. Even knowing the horsepower of what we've provided the developers, just seeing what they've done. The impressive stuff to me, you know I'm going back a year, is where a developer rode away with a development kit strapped to the back of a motorcycle and five days later you're looking at one of these demos that were shown at Space World last year.
To me that was impressive, because that really showed off how easy this machine is to use. They got really high performance, like the Star Wars demo from last Space World. That was basically done in like five days. That stuff was just amazing. But I think the machine hasn't been fully tapped, it's going to be a while before people do tap into it. There's a lot of focus to getting the first games out, you know the initial launch titles. I think in the second year they'll really get a chance to dig into the machine and start figuring out some things. You know, dig a little deeper in the toolbox and see what's there. I don't think we've seen all that's going to come out of this by any way, shape, or form.
IGNcube: Do you think that because it's proving to be such an easy thing to program for that we'll see the difference in generations? With the N64 you had specific first, second, and third generation titles. You could pick them apart. You could look at a game and see that. Do you think we'll see that kind of separation on GameCube as well?
Greg Buchner:
I think you'll see a different kind of separation. I think you'll still see one. On N64 it was difficult to program and so therefore the second year people figured out how to get a little more performance out of it. But here I think even though they are starting higher up the performance curve there's still a rich feature set that I don't know has been fully tapped because people haven't had all these tools in their toolbox before.

So they're still developing things now with just high performance, taking some advantage of the multi-texture capabilities, but I think a year from now you'll see newer ideas and newer things you can do. So I think you'll start seeing much more on the features side and less and less from the performance side. You'll still see higher polygon counts and fill rate, but in terms of what you do with that and what effects you get, I think you're going to see a lot more. So it's a different level of changes over the years.
IGNcube: Where from here does it go for ATI and Nintendo®? We saw your logo on the GameCube, so you must have a great relationship with them. Do you think you'll be working together in the future?
Greg Buchner:
I see no reason we won't be. I think from a company-to-company relationship it's a very good relationship and from a person-to-person level it's an excellent relationship. There are a lot of folks on both sides who I think consider themselves friends, not only working together, producing something, but people you can call at home and talk to. Also, we're there if they need anything, they know they can call on us to go do something. You know, we'd move heaven and earth right now to help them with something. So I think it's been a good partnership. A lot of the people have had a good relationship for two generations. A year ago we started talking to factory people, well the same person working with the factory people for the N64 [was helping for GameCube].
You know the same two people show up in a room together, and they already know each other so they've already been through one war. It's easy to start off and hit the ground running. I think we've got the best shot of anybody out there to continue working with them. I see no signs of them backing away from us or vice versa. These are fun products to do, so from a pure engineering point of view this is cool sh**. There's not another thing like this in the world and even if you look at the other consoles, the way the companies are and the way they build consoles is very, very different from the way Nintendo® approaches it. Nintendo® doesn't view themselves as technologists, so they bring in the best technology partners they can to go produce something and they give you the ability to really create almost anything from a clean sheet of paper. To go build the best possible thing you can for where they're focused. And there is a lot of fun to that. So I'm going to do whatever I can to make sure we're in there.
IGNcube: We would definitely like to see that happen.
IGN would like to thank Greg Buchner of ATI for taking time out of his busy schedule to chat with us.
Interview conducted by Fran Mirabella III

http://www.ign.com/a...graphics-part-2


Now let's travel back to May 1999, when IBM and Nintendo announced their $1 billion deal for the Gekko processor.

(IBM PR)

IBM, Nintendo Announce $1 Billion Technology Agreement

IBM 400 MHz Copper Processor To Power Next Nintendo Game Machine
LOS ANGELES, CA - 12 May 1999: IBM and Nintendo today announced a multi-year, $1 billion technology agreement to support Nintendo's next home video game console, code-named "Dolphin."


As part of the agreement, IBM will design and manufacture a unique 400 MHz central processor featuring IBM's industry-leading 0.18 micron copper technology. The chip, dubbed the "Gekko" processor, is an extension of the IBM PowerPC architecture. It's designed to be more powerful than those found in any current or planned home video game entertainment system, providing players with dramatically better graphics and more realistic action.
The processor is in the advanced stages of development, supporting Nintendo's plans for a worldwide launch for the 2000 holiday season.
While the relationship initially involves the development and production of the copper-based processor, the companies will explore the potential use of IBM technology in other Nintendo products as well. The current arrangement calls for IBM to design, manufacture and ship copper processors to Nintendo, with the potential value of the deal exceeding $1 billion.

"Dating from our very first home system in 1983, Nintendo's ongoing commitment is to provide game developers with industry-leading technology to create new game experiences for our players," explains Howard Lincoln, chairman, Nintendo of America. "IBM's new copper-based chip delivers on that commitment like never before, and we've jointly committed to a long-term relationship to assure revolutionary results."
In order to provide more power than Nintendo's current game system chip, the IBM processor leverages IBM's experience with complex system designs to incorporate enhancements specifically required by Nintendo. These include extra on-chip memory and more efficient data management between the processor and the game system's primary graphics chip.

"As customers such as Nintendo develop increasingly sophisticated systems, the complexity of the chips that power them grows dramatically," says Dr. John Kelly, general manager, IBM Microelectronics Division. "Not many companies are able to meet this need. We have the technology, design expertise and manufacturing experience necessary to develop and deliver customized solutions for our customers."
With IBM's advanced copper processor powering the next Nintendo system, developers can create game designs featuring the degree of realism, emotional connection, fantasy or interaction they've always imagined.
"Designing games is an ever-changing process, and this chip, with its speed and seamless data flow, will allow us to make even more amazing games," explains Chris Stamper, chairman and technical director of Rare, Ltd., producer of mega-hit games GoldenEye and Banjo-Kazooie for the N64. "Consumers will love the end result with the upcoming system."

"In my mind, I'd always envisioned what a game like Zelda could look like, and with the N64, I was able to create it," describes Shigeru Miyamoto, Nintendo developer and world-renowned game designer. "Now, with the Gekko processor, I can see an opportunity to take game designs to a new level."
The IBM copper processor will be paired with a revolutionary graphics chip designed by ArtX Inc., one of the world's leading 3D graphics technologists located in Palo Alto, California. The ArtX team, led by chairman, Dr. Wei Yen, includes a number of well known 3D graphics designers.
"The lineup of companies working on Nintendo's next system is hugely exciting," notes Dr. Wei Yen. "The match between Nintendo's know-how in the video game field, and the enormity of what IBM brings to the table can't be matched."

The Nintendo game system processor chips will be manufactured at IBM's high-volume manufacturing facility in Burlington, VT, where copper-based processors have been manufactured and shipped to customers since 1998.



http://www-03.ibm.co...elease/2181.wss



Well, that's enough for now. I'm very excited about the Wii U's architecture, which seems to be very much a state secret; rumors about the GPU (and the CPU) are flying all over the internet right now. I hope my post here about the Gamecube was at least somewhat interesting and whets your appetite for what's to come from Nintendo with the Wii U -- in the form of games, not specs. I don't expect Nintendo to reveal *anything* more about the Wii U hardware/architecture, but hopefully some of those secrets will get leaked.

Discuss.

Edited by parallaxscroll, 24 September 2012 - 10:30 PM.


#2 Tricky Sonic

Tricky Sonic

    Hammer Bro.

  • Members
  • 1,853 posts
  • NNID:Tricky
  • Fandom:
    Sega, Sonic, Warcraft, Final Fantasy

Posted 22 September 2012 - 04:33 AM

I'm pretty happy with how it's going as well. I've said before that I think Nintendo is keeping hush on specs to keep Sony and Microsoft's marketing at bay and to force them to scale their next systems blindly. This way they can't rip on the Wii U's power if they don't know it, and they can't gauge how powerful it is.
Check out my video game collection blog at http://genesaturn.blogspot.com/
Feel free to add me as a friend on your 3DS and Wii U as well - Friend Code = 1289-9502-7134 / Nintendo ID - Tricky

#3 parallaxscroll

parallaxscroll

    Spear Guy

  • Members
  • 95 posts

Posted 22 September 2012 - 05:55 AM

I'm pretty happy with how it's going as well. I've said before that I think Nintendo is keeping hush on specs to keep Sony and Microsoft's marketing at bay and to force them to scale their next systems blindly. This way they can't rip on the Wii U's power if they don't know it, and they can't gauge how powerful it is.


I tend to agree; I think Nintendo is being quiet on the specs for the good reasons you mentioned. It'll be interesting to see the design choices Nintendo has made with the Wii U architecture. I think Sony and Microsoft have no idea what they're (about to be) dealing with.


Oh, also, here is another demo for the Gamecube, Mix Core's 'Rebirth', which was mostly pre-rendered CG/FMV.
In the second video (part 2), a small portion of it was rendered in real-time on Gamecube hardware.
See if you guys can spot the real-time portion.






I would bet that Wii U is capable of reproducing all of this in real-time, going by the Japanese Garden demo from E3 2011. Now imagine how good LoZ is going to look in a few years.

:)

Edited by parallaxscroll, 22 September 2012 - 06:55 AM.


#4 NidoTower

NidoTower

    Spiked Goomba

  • Members
  • 13 posts

Posted 23 September 2012 - 06:22 PM

I had a GameCube and I loved it, awesome!

#5 That64Kid

That64Kid

    Red Koopa Troopa

  • Members
  • 63 posts
  • Fandom:
    I love you

Posted 23 September 2012 - 07:15 PM

I loved my GameCube to death. I wish I still had it ;-;

#6 FreakAlchemist

FreakAlchemist

    Spear Guy

  • Members
  • 93 posts
  • Fandom:
    Link Samus MegaMan KidBuu Vegeta

Posted 23 September 2012 - 07:22 PM

I loved my GameCube to death. I wish I still had it ;-;

I have a Gamecube and I still love it. I don't know why I just don't use the Wii to play Gamecube games, though.

3DS Friend code: 0602-6262-8935

just pm me if you want to friend me


#7 -JJ-

-JJ-

    Cheep-Cheep

  • Members
  • 116 posts
  • Fandom:
    Mrs. Doubtfire, Madea, Corneria

Posted 23 September 2012 - 07:24 PM

that pic of Miyamoto-san is so lol

#8 NidoTower

NidoTower

    Spiked Goomba

  • Members
  • 13 posts

Posted 23 September 2012 - 08:19 PM

I still use my GameCube for its games, what an amazing system.

This was a great read BTW.

#9 Noonabites

Noonabites

    Piranha Plant

  • Members
  • 813 posts

Posted 23 September 2012 - 11:37 PM

that pic of Miyamoto-san is so lol


I came in here just to say that. I'm so used to seeing him all happy and smiling.. he looks like a housewife trying to tell their daughter how to pose correctly... .-. it's quite endearing..

Edited by Sir Noonabites, 23 September 2012 - 11:37 PM.


#10 parallaxscroll

parallaxscroll

    Spear Guy

  • Members
  • 95 posts

Posted 24 September 2012 - 12:03 AM

For anyone that wants to know "what was real-time and what was pre-rendered CG"
regarding the tech demos shown at Spaceworld 2000, I think I've found the answers.


Credit: Marco 'nAo' Salvi

Zelda Gamecube
Status: Real-time

Star Wars: Rogue Squadron Gamecube
Status: Real-time

Mario 128 Gamecube Demo
Status: Real-time


Luigi's Mansion
Status: Real-time


Metroid Gamecube
Status: Pre-rendered FMV


Wave Race
Status: Pre-rendered FMV


Rebirth
Status: Pre-rendered FMV

Too Human
Status: Pre-rendered FMV


Banjo-Kazooie Gamecube
Status: Real-time


Perfect Dark Gamecube
Status: Real-time


Meowth's Party
Status: Real-time


Cars Demo

Status: Real-time


Edited by parallaxscroll, 24 September 2012 - 12:11 AM.


#11 Desert Punk

Desert Punk

    Chain Chomp

  • Members
  • 656 posts

Posted 24 September 2012 - 04:41 AM

It's likely Sony and Microsoft know exactly what is in the Wii U. They will have corporate spies who would find out such information. All three companies have their products made by the same company, Foxconn, and it's possible information has been leaked from there or from developers. Nintendo will likely be one of the first companies to hear the full specifications of the next Xbox and PlayStation, too.

#12 Keviin

Keviin

    Lakitu

  • Members
  • 2,270 posts
  • Fandom:
    Zelda, Mario, Metroid, Resident Evil

Posted 24 September 2012 - 05:13 AM

I'm pretty happy with how it's going as well. I've said before that I think Nintendo is keeping hush on specs to keep Sony and Microsoft's marketing at bay and to force them to scale their next systems blindly. This way they can't rip on the Wii U's power if they don't know it, and they can't gauge how powerful it is.


Most likely true, since they also said something about others stealing their controllers.
No sig.

#13 parallaxscroll

parallaxscroll

    Spear Guy

  • Members
  • 95 posts

Posted 24 September 2012 - 05:20 AM

It's likely Sony and Microsoft know exactly what is in the Wii U. They will have corporate spies who would find out such information. All three companies have their products made by the same company, Foxconn, and it's possible information has been leaked from there or from developers. Nintendo will likely be one of the first companies to hear the full specifications of the next Xbox and PlayStation, too.



Good point.
In all my life, I've never seen a console so shrouded in mystery regarding its specifications. I'm sure Microsoft and Sony know everything about the Wii U, and I'm sure Nintendo knows the current status of the next Xbox and PlayStation.

#14 Nollog

Nollog

    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 24 September 2012 - 06:59 AM

I hear it's got a Broadway chip in it.
Your hopes are dashed.



#15 parallaxscroll

parallaxscroll

    Spear Guy

  • Members
  • 95 posts

Posted 24 September 2012 - 07:01 AM

Also, here's some very rare footage of the real-time Meowth's Party Pokemon tech demo:



#16 NidoTower

NidoTower

    Spiked Goomba

  • Members
  • 13 posts

Posted 24 September 2012 - 03:50 PM

how did gc compare to ps2 and xbox in graphics?

#17 Auzzie Wingman

Auzzie Wingman

    Mournblade

  • Members
  • 4,346 posts
  • NNID:AuzzieWingman
  • Fandom:
    Not enough space here

Posted 24 September 2012 - 05:26 PM

Meowth's Party vid crashed my browser.

Not happy.

Trophy Cards are classy too! LOLZIGZAGOON

 



#18 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 24 September 2012 - 05:57 PM

how did gc compare to ps2 and xbox in graphics?


Cube:
Destroyed the PS2 in poly count, texture resolution capability (thanks to S3 texture compression, which the PS2 lacked), CPU power (simultaneous objects, AIs, physics; it ran Havok just as well as the Xbox), texture layers, shader support, and lighting, even while hindered by its tiny-capacity optical discs.

The PS2 supported 5.1 surround (usually only for FMV), fantastic procedural rendering that went nearly unused (State of Emergency, or whatever it was called, by Rockstar used this for crowd generation), and HD resolutions for a very few games that now look primitive by Cube/Xbox standards.

The Cube beat the Xbox in real-world poly performance, texture layers, and RAM bandwidth (not capacity), which made the Cube far more suitable for streaming, while the Xbox frequently had hefty load times or suffered very samey textures. It later relied on its HDD to help mitigate the bottleneck (RAM bandwidth and GPU cache) that caused this problem.

The Xbox can't produce a game like Metroid Prime.

The Xbox had programmable shaders. This meant superior effects that kept improving as time went on and new techniques were invented. The Cube's fixed-function shader hardware could not compete with this.

The last big event in the Xbox's lifetime was unified lighting and shading engines. This put the Xbox nearly a generation above the Cube in atmospheric games with heavy lighting/shadowing effects, something the Cube just couldn't match.

The Cube could not have games like Doom 3 or The Chronicles of Riddick.
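The texture compression point can be made concrete with a little arithmetic. Below is a rough sketch using the standard S3TC/DXT1 format sizes (8 bytes per 4x4 texel block); these are the textbook numbers for the format, not measurements from real Gamecube hardware:

```python
# Rough memory footprint of a 512x512 texture, uncompressed vs.
# S3TC/DXT1 block compression (the "S3 texture compression" the
# Gamecube supported and the PS2 lacked).

def texture_bytes_uncompressed(w, h, bytes_per_texel=4):
    """RGBA8: 4 bytes per texel."""
    return w * h * bytes_per_texel

def texture_bytes_dxt1(w, h):
    """DXT1 stores each 4x4 texel block in 8 bytes (0.5 bytes/texel)."""
    blocks = (w // 4) * (h // 4)
    return blocks * 8

w = h = 512
raw = texture_bytes_uncompressed(w, h)  # 1,048,576 bytes (1 MiB)
dxt = texture_bytes_dxt1(w, h)          # 131,072 bytes (128 KiB)
print(raw // dxt)                       # 8:1 vs RGBA8 (6:1 vs RGB8)
```

That 6:1 to 8:1 saving is why compressed textures could be both higher resolution and more numerous on the Cube's small discs and limited RAM.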

Edited by 3Dude, 24 September 2012 - 06:04 PM.

banner1_zpsb47e46d2.png

 


#19 parallaxscroll

parallaxscroll

    Spear Guy

  • Members
  • 95 posts

Posted 24 September 2012 - 07:32 PM

Pretty much agree with all of the above, thanks 3Dude.

All of the last-gen consoles (leaving aside the Dreamcast) had certain strengths and weaknesses relative to each other, at least on paper, spec-wise, and to some extent in real-world, in-game results.

I think the Gamecube was the best-engineered of all the last-gen consoles. ArtX and IBM did a fantastic job.

I trust AMD (and IBM) will blow our minds with the Wii U.

Meowth's Party vid crashed my browser.

Not happy.


Sorry, I don't know how that happened; it works fine for me. I'm using Google Chrome.

I hear it's got a Broadway chip in it.
Your hopes are dashed.


This is totally untrue.

Even going by the supposedly leaked specs, the Wii U CPU is a triple-core 'enhanced' Broadway running at a higher frequency than the Wii's Broadway, which was clocked at 729 MHz.

I also wanted to point out that with the Gamecube, the originally announced clockspeeds were:
405 (or 400) MHz for the IBM Gekko CPU and 202.5 (or 200) MHz for the ArtX Flipper GPU.
This was changed sometime between 2000 and 2001 to:
486 MHz CPU, 162 MHz GPU.
So the engineers lowered the clockspeed of the GPU while raising the clockspeed of the CPU.
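One thing worth noticing about those numbers (my own observation from the figures quoted above, not anything Nintendo stated): both the announced and the final clockspeeds keep the CPU at an exact integer multiple of the GPU clock, which is convenient for running a synchronous CPU-GPU bus.

```python
# Quick arithmetic on the Gamecube clockspeeds quoted above. The
# 2:1 -> 3:1 ratio observation is an inference from the numbers,
# not an official Nintendo claim.

announced = (405.0, 202.5)  # (CPU MHz, GPU MHz) as announced in 2000
final     = (486.0, 162.0)  # clockspeeds quoted closer to launch

def cpu_gpu_ratio(cpu_mhz, gpu_mhz):
    """Return the CPU clock expressed as a multiple of the GPU clock."""
    return cpu_mhz / gpu_mhz

print(cpu_gpu_ratio(*announced))  # 2.0
print(cpu_gpu_ratio(*final))      # 3.0
```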

#20 Nollog

Nollog

    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 24 September 2012 - 07:37 PM

This is totally untrue.

Even going by the supposedly leaked specs, the Wii U CPU is a triple-core 'enhanced' Broadway running at a higher frequency than the Wii's Broadway, which was clocked at 729 MHz.


Okay then.





