
Wii U specs are surprising, says Sonic developer


29 replies to this topic

#21 3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 25 August 2012 - 06:00 AM

Well, in terms of RAM, I think they should go with DDR3 (which they probably will), as it's the most efficient for gaming.

But in terms of third-party support, I'm saying that Nintendo has made it easier for developers to bring their games over with better coding. And in no way did I mention the Wii, but the reason the Wii got minimal third-party support is its older, weaker architecture. With the Wii U, Nintendo is using architecture from 2010-2011 that is powerful enough for other next-gen games to be ported onto the platform, or even co-developed!


GDDR3 was pretty good... seven years ago. But it's nowhere remotely close to the most efficient for gaming. It was desirable for its performance relative to its cost... but not so much anymore.

The GameCube's 1T-SRAM (and the Wii's) heavily outperforms DDR3 because of its low latency.

The 3DS's FCRAM absolutely destroys DDR3, as it has low latency AND the ability to double pump like DDR RAM.
GDDR4 is considerably better than GDDR3, and GDDR5 is amazing.
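To see why low latency matters so much here, consider a minimal cost model: one read costs a fixed latency plus a transfer time. This is only a sketch, and every latency and bandwidth figure below is invented for illustration, not a real part specification.

```python
# Rough model: one memory read costs a fixed latency plus transfer time.
# All numbers below are made up for illustration, not real specs.

def access_time_ns(latency_ns: float, bandwidth_gb_s: float, bytes_read: int) -> float:
    """Time to service one read: fixed latency plus transfer time (1 GB/s == 1 byte/ns)."""
    return latency_ns + bytes_read / bandwidth_gb_s

# Hypothetical parts: a low-latency SRAM-like pool vs. commodity DDR.
low_latency = dict(latency_ns=10.0, bandwidth_gb_s=3.2)
commodity = dict(latency_ns=50.0, bandwidth_gb_s=12.8)

for size in (64, 4096, 65536):  # a cache line, a page, a texture tile
    t_lo = access_time_ns(bytes_read=size, **low_latency)
    t_hi = access_time_ns(bytes_read=size, **commodity)
    print(f"{size:6d} B  low-latency: {t_lo:8.1f} ns  commodity DDR: {t_hi:8.1f} ns")
```

For the small, scattered reads typical of game code, the fixed latency dominates and the low-latency pool wins; only large streaming transfers let the higher-bandwidth part pull ahead.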



#22 Foot

    The most badass sociopath to ever exist.

  • Members
  • 1,038 posts
  • NNID:DPapcinEVO
  • Fandom:
    Sock Wars, Shoehorn Leghorn

Posted 25 August 2012 - 07:36 AM

Wow, I was thinking GDDR5, but when I looked it up on Google it said DDR3 was better. SO MY INSTINCT WAS RIGHT!!!

I am the foot. I do not like you. You smother me with socks and shoes, then step on me thousands of times a day.

We feet will rebel one day.

#23 3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 25 August 2012 - 02:55 PM

Wow, I was thinking GDDR5, but when I looked it up on Google it said DDR3 was better. SO MY INSTINCT WAS RIGHT!!!


That's a common misconception.

GDDR5's strength comes from its ability to be clocked much higher than GDDR3, and its ability to quad pump data while GDDR3 double pumps.

Unfortunately, the benefits of quad pumping are limited by bus width, something people don't consider. It's not like you can just drop GDDR5 in and expect increased performance.

The downside is that GDDR5 is roughly 20% less efficient than GDDR3, so when clocked similarly, GDDR5 gets the short end of the stick.

However, when clocked to the proper specifications, and on a card with a bus wide enough to take advantage of the quad pumping GDDR5 was designed for, it's a complete and absolute slaughter.
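As a rough illustration of the bus-width point: theoretical peak bandwidth is just memory clock times transfers per clock times bus width. The clocks and bus widths below are example values, not the specs of any particular card or console.

```python
# Peak bandwidth = memory clock x transfers per clock x bus width.
# Clocks and bus widths below are illustrative examples only.

def peak_bandwidth_gb_s(clock_mhz: float, transfers_per_clock: int, bus_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

# Double-pumped GDDR3-style memory on a wide 256-bit bus...
print(peak_bandwidth_gb_s(700, 2, 256))   # 44.8 GB/s
# ...quad-pumped GDDR5-style memory choked by a narrow 64-bit bus...
print(peak_bandwidth_gb_s(700, 4, 64))    # 22.4 GB/s
# ...and GDDR5 clocked up, on the wide bus it was designed for.
print(peak_bandwidth_gb_s(1250, 4, 256))  # 160.0 GB/s
```

Quad pumping on a narrow bus can deliver less than double pumping on a wide one; it's the pairing of clock, pump rate, and bus width that produces the "slaughter."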

GDDR5 is most impressive, but by no means suited to a games console environment.

Edited by 3Dude, 25 August 2012 - 02:59 PM.



#24 Ravyu

    Red Koopa Troopa

  • Members
  • 55 posts
  • Fandom:
    Mario, Batman, Link and NINTENDO

Posted 26 August 2012 - 04:58 AM

Not surprised.

Touché.

A rare 12 Year Old that thinks Nintendo is godly.
PC Specs- http://pcpartpicker.com/p/hUYw

#25 Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 26 August 2012 - 12:46 PM

That depends on your interpretation of "far more powerful."

It's just not in the cards to have an uber leap in power again at this point in time... for so many reasons. The most damning has been the fact that the reliance on, or rather the full-on abuse of, Moore's law has finally reached its consequences, as Dennard scaling hits the inevitable wall of physical limits. As of right now, we can't simply continue advancing in power the same easy way we did from 1970 to 2005/6. Don't mistake what I'm saying: this is very much a NOW problem. Silicon integrated circuitry, as we've been using it for the past 30+ years, has hit its physical limits for the easy power upgrades so many are SO very used to enjoying. It's been going on so long that people just seem to take it for granted.

Once we overcome the silicon problem (graphene? optical?), we will be back on track, with performance growth similar to what we enjoyed with Dennard scaling.
Maybe even in time for the next consoles!

(Personally, I'm expecting a $350 Wii U price point.)


I refer to the hardware, in which case the Wii U is demonstrably far more powerful than the current generation. As it should be.

Will that translate to an enormously obvious leap in terms of what we see on screen? No, but it will still be quite a bit better, and obviously so.

"Moore's 'law'" was more a theory than anything else, though it played out very well for a long time. It should have been "Moore's observed trend" instead. LOL

And it relied on people working on chips to make it happen. Obviously the fallibility of man, combined with varying talent levels, resources, economies, markets, etc., would see to it that it would not be a perfect law.

Still, from the technology at hand now, the Wii U is far more powerful in every way. Will it reveal a leap forward the likes of which was seen with the 360 vs. the original Xbox? Doubtful. But I wouldn't put a GCN-from-N64 type of leap past it, considering the hardware. Sure, the GamePad has a cost to the hardware while it's doing its thing (despite what AMD would say), but there are ways for creative types to go full throttle on the graphics with minimal GamePad parasitic drag.

In other words, I agree with where you are coming from, but at the same time I believe "far more powerful" to be accurate, though by "far" I mean from California to Texas, not from California to Israel, if you get my drift.

Really interested to see what the Crytek folks are doing with the Wii U. If there is an engine looking for hardware to let loose on, that's it.

Edited by Socalmuscle, 26 August 2012 - 12:51 PM.


#26 CUD

    Super Saiyan Dingus

  • Members
  • 1,337 posts
  • NNID:CUDesu
  • Fandom:
    Gaben

Posted 26 August 2012 - 04:24 PM

They just had low expectations.

This statement is false. The previous statement is true.

RIP in peace Nintendo.



#27 3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 27 August 2012 - 08:29 AM

I refer to the hardware, in which case the Wii U is demonstrably far more powerful than the current generation. As it should be.

Will that translate to an enormously obvious leap in terms of what we see on screen? No, but it will still be quite a bit better, and obviously so.

"Moore's 'law'" was more a theory than anything else, though it played out very well for a long time. It should have been "Moore's observed trend" instead. LOL

And it relied on people working on chips to make it happen. Obviously the fallibility of man, combined with varying talent levels, resources, economies, markets, etc., would see to it that it would not be a perfect law.

Still, from the technology at hand now, the Wii U is far more powerful in every way. Will it reveal a leap forward the likes of which was seen with the 360 vs. the original Xbox? Doubtful. But I wouldn't put a GCN-from-N64 type of leap past it, considering the hardware. Sure, the GamePad has a cost to the hardware while it's doing its thing (despite what AMD would say), but there are ways for creative types to go full throttle on the graphics with minimal GamePad parasitic drag.

In other words, I agree with where you are coming from, but at the same time I believe "far more powerful" to be accurate, though by "far" I mean from California to Texas, not from California to Israel, if you get my drift.

Really interested to see what the Crytek folks are doing with the Wii U. If there is an engine looking for hardware to let loose on, that's it.


Moore's law relied on Dennard scaling, which is real and tangible.

In 1974, Dennard postulated that MOSFETs continue to function as voltage-controlled switches, while all key figures of merit such as layout density, operating speed, and energy efficiency improve, provided geometric dimensions, voltages, and doping concentrations are consistently scaled to maintain the same electric field.
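Those scaling rules can be written down directly. Here is a minimal sketch of the idealized, pre-leakage arithmetic, assuming classic constant-field scaling by a factor k:

```python
# Idealized classic Dennard (constant-field) scaling by a factor k.
# Pre-leakage arithmetic only; real silicon eventually stopped obeying it.

def dennard_scale(k: float) -> dict:
    return {
        "dimensions": 1 / k,             # gate length/width, oxide thickness
        "voltage": 1 / k,                # supply voltage shrinks with geometry
        "transistor_density": k ** 2,    # k^2 more devices per unit area
        "switching_speed": k,            # gates switch ~k times faster
        "power_per_device": 1 / k ** 2,  # C*V^2*f = (1/k)(1/k^2)(k)
        "power_density": 1.0,            # k^2 devices x 1/k^2 power each
    }

# One process generation was roughly k = sqrt(2): transistor density
# doubles, matching the two-year cadence Moore observed.
print(dennard_scale(2 ** 0.5))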

This is where Moore's law gets transistor density doubling every two years: it was his observation of Dennard scaling combined with the advancement of manufacturing capabilities.

This is where the large majority of progress from 1974 to 2003/4 came from: simply increasing transistor density and decreasing size, with increased clock speeds as a nice boon too.

Moore estimated that his observation would only be sustainable at the rate he saw for 10 years. It went on for 30.

Practically everything in power grew exponentially with time... it was like a free lunch buffet for power increases.

Until we got into the early 2000s.

We started hitting the limits of physics: the designs were too small and began leaking current into the substrate, and we were no longer able to maintain the same electric field... We couldn't double the transistor count in the same amount of time anymore... and soon we wouldn't be able to at all.

The buffet closed.

That's when other existing but largely ignored technologies began to be brought to the forefront.

First up was a concept called concurrency.

You already know what concurrency is, just not the old unmarketed name for it. Concurrency is splitting software into separate lines of work to be processed in parallel: software multithreading. Unfortunately, not all tasks can be processed in parallel, so the gains received were rather small, around 15% increases in performance. Then came chips DESIGNED for concurrency, with separate threads in the hardware, marketed most famously as hyper-threading. Depending on how well optimized the software was, gains were generally logged from 15-40%. Very nice, but a far cry from the 2x gains of the free lunch era.

Then came processor concurrency, multicore, which gave much better gains. For a while, we were seeing progress close to what it was before...

Unfortunately, that was destined to be short-lived... for each core added, the gains become less and less, due to overhead like core and thread coherency and Amdahl's law of parallel processing.

A four-core processor at 2 GHz per core is nowhere near as powerful as a single-core processor at 8 GHz (with the transistor count of all four cores).
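Amdahl's law makes that concrete: only the parallelizable fraction of a workload speeds up with extra cores. A rough sketch, where the parallel fraction p is a made-up example value:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction
# of the work that parallelizes and n is the core count. The p values
# below are invented for illustration.

def amdahl_speedup(p: float, n_cores: int) -> float:
    """Speedup over one core when fraction p spreads across n cores."""
    return 1.0 / ((1.0 - p) + p / n_cores)

print(amdahl_speedup(0.60, 4))   # ~1.82x from four 2 GHz cores
print(8.0 / 2.0)                 #  4.00x from one ideal 8 GHz core
print(amdahl_speedup(0.95, 4))   # even at p = 0.95, four cores give ~3.48x
print(amdahl_speedup(0.95, 8))   # ...and eight cores only ~5.93x
```

Each added core buys less than the last, which is exactly the diminishing-returns wall described above.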

Engineers are constantly, desperately trying to find new wells to dip into; now they're trying 'dark silicon'. The problem is, each well is shallower than the last, and inherently more expensive... and all are little more than a drop compared to the heyday of Dennard scaling.

Engineers are working harder and harder to scrape up smaller and smaller gains in power.

Silicon-on-insulator technology has pretty much hit its physical limit. Advancement will be considerably slower until a new material technology is found that allows Dennard scaling, or something similar, to kick advancement into high gear again.

That reality applies to game consoles too.

It's just not in the cards to advance games consoles this time like we've been doing for the past 30 years.



#28 Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 27 August 2012 - 12:55 PM

http://news.cnet.com..._3-1014887.html

A decent article on Moore's law, which by definition is not a scientific or natural law, by virtue of its currently non-applicable status.

It was a nice theory that held true for a long time.

Any "law" that depends on people getting things right to make it happen is treading on thin ice as it is.

Edited by Socalmuscle, 27 August 2012 - 12:56 PM.


#29 3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 27 August 2012 - 02:58 PM

http://news.cnet.com..._3-1014887.html

A decent article on Moore's law, which by definition is not a scientific or natural law, by virtue of its currently non-applicable status.

It was a nice theory that held true for a long time.

Any "law" that depends on people getting things right to make it happen is treading on thin ice as it is.


The term "Moore's law" was coined by Caltech professor Carver Mead as a short, snappy way to refer to Moore's 1965 publication, 'Cramming More Components onto Integrated Circuits'.

It was never anything more than a catchphrase, and Moore himself only figured it would hold true for about 10 years. Sorry for any confusion that may have caused.



#30 Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 27 August 2012 - 03:47 PM

The term "Moore's law" was coined by Caltech professor Carver Mead as a short, snappy way to refer to Moore's 1965 publication, 'Cramming More Components onto Integrated Circuits'.

It was never anything more than a catchphrase, and Moore himself only figured it would hold true for about 10 years. Sorry for any confusion that may have caused.


No worries. I knew where you were coming from. The term "Law" in "Moore's Law" was obviously tongue-in-cheek at its inception. But along the way, people forget, and all they hear is "law," as if it were somehow an immutable law of the universe. LOL.

I think we ended up here because of the relative term I used earlier, "far more powerful." You were thinking of a scale I had not intended, therefore bringing "Moore's Law" into play to communicate why a gigantic leap in visible power (the likes of Xbox to 360) wouldn't be available, which we agree on (a challenge also facing Microsoft and Sony). At least that's how I understand our arrival at this point. Feel free to correct me if I am misreading.

I always enjoy the posts you put up. Very thoughtful, factual, and intelligent.



