Reggie interview comments on the power of Wii U


39 replies to this topic

#21 Ixchel

Ixchel

    Piranha Plant

  • Members
  • 874 posts

Posted 06 June 2012 - 03:49 PM

More fps makes it easier on the eye; also, the claim that you can't see more detail than 30 fps is a lie. I can tell the difference between 30, 40, 50, 60, and 70; after that I can't tell much until I get to 120 fps and compare it to 70.

A quote, from Thomas Jefferson I believe, was that anything lower than 45 fps will cause stress to the eye.

Note: some games are easier to tell the difference in than others.

Jet pilots can tell the difference between fps even when it gets to around, I think, 900.

I sort of believe this. I saw a comparison between 30 and 60, but it was so minuscule I thought I may have been imagining it. I just don't see how it can be better than higher res, but I suppose I need to see an in-game example and not footage.

#22 Nollog

Nollog

    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 06 June 2012 - 03:51 PM

You sure it's not power 7 based? I thought it was confirmed to be

Nintendo haven't confirmed it.

inb4 the guy from wherever he is who will quote the tweet from last year.



#23 Smurfman256

Smurfman256

    Goomba

  • Members
  • 1 posts

Posted 06 June 2012 - 04:18 PM

Nintendo haven't confirmed it.

inb4 the guy from wherever he is who will quote the tweet from last year.


...didn't...a guy from IBM confirm it?
http://www.engadget....nintendo-wii-u/

#24 Guest_TRON_*

Guest_TRON_*
  • Guests

Posted 06 June 2012 - 04:28 PM

Very vague

#25 Usman Mohammad

Usman Mohammad

    Bullet Bill

  • Members
  • 350 posts

Posted 06 June 2012 - 11:15 PM

The thing is, I'm seeing it reversed. Ignoring the fact that a higher base framerate lets it drop and still be playable, isn't that because 60fps basically can't be detected? Sure, TVs can run at that FPS, but can people tell the difference between 30 and 60 when humans biologically see 30 FPS?
I'm seeing it as dropping something that can actually be seen, the resolution, for a trade-off which you can't see.

I also don't see how the FPS "adds more detail". If anything, more pixels do that?

I am saying either add more detail OR increase frame rate.

The resolution is a canvas: if you have a smaller canvas, then the GPU needs to do a lot less work and can push out a lot more of these canvases per second.

Going from 720p to 1080p you increase the canvas size, so the GPU needs to work harder to fill the canvas. The point is that you're using up resources to get the GPU to generate a 1080p picture. In most PC games that means you add extra scenery; in other games it provides a crisper image.

The point I am trying to make is that console games will probably not add extra scenery and will just provide a crisper image at 1080p than at 720p. The problem is that you could have used that extra rendering power of the GPU to add more texture detail to the graphics at 720p, or to get a more playable frame rate.
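A quick sketch of the canvas arithmetic above, in Python. The 30 fps baseline is just an illustrative assumption, and real scaling is not perfectly linear (vertex work, bandwidth, and post-processing don't all scale with pixel count), so treat this as back-of-envelope only.

```python
# Pixels per frame at 720p vs 1080p, and what the ratio implies for fill work.
def pixels(width: int, height: int) -> int:
    return width * height

p720 = pixels(1280, 720)    # 921,600 pixels per frame
p1080 = pixels(1920, 1080)  # 2,073,600 pixels per frame

ratio = p1080 / p720
print(f"1080p draws {ratio:.2f}x the pixels of 720p")  # 2.25x

# To a first approximation, a GPU that fills 1080p at 30 fps could instead
# fill 720p at roughly 30 * 2.25 = 67.5 fps, or spend the surplus on richer
# shading and textures at 720p, which is the trade-off described above.
print(f"720p budget at the same fill rate: ~{30 * ratio:.0f} fps")
```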

I can tell the difference between 30, 60, and 120 Hz. When I compare BF3 to CoD, CoD is just more fluid. When I was at Eurogamer Expo, the TVs on display had clear motion (it's there to trick you into thinking it's higher than 120 Hz) and the images were smoother than anything I had seen.

Frame rates do matter. Watch a video at 24fps and then a TV show or YouTube video (often at 60fps) and you can tell the difference. What my teacher told me ages ago was that your eyes aren't fixed; it depends on the surroundings. At the cinema the lights are dimmed so that your eyes only focus on the film, and your eyes adjust to that frame rate.

If frame rate didn't matter, why was there a split between consumers when they showed The Hobbit in 48fps?
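The frame-time arithmetic behind that 24 vs 48/60 fps comparison, as a small sketch:

```python
# Each frame rate fixes how long a single frame stays on screen:
# frame time (ms) = 1000 / fps. Doubling the fps halves the frame time.
for fps in (24, 30, 48, 60):
    frame_ms = 1000 / fps
    print(f"{fps:>2} fps -> {frame_ms:5.1f} ms per frame")

# 24 fps -> 41.7 ms, 30 -> 33.3 ms, 48 -> 20.8 ms, 60 -> 16.7 ms.
# The Hobbit's jump from 24 to 48 fps halved the frame time, a large enough
# change in motion presentation that it visibly split audiences.
```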

#26 Joshua

Joshua

    The Dovahkiin

  • Members
  • 2,334 posts
  • NNID:JoshuaCM
  • Fandom:
    Anything Nintendo, Skyrim and more!

Posted 07 June 2012 - 12:46 AM

Who knows. If it goes the way of the Wii, we still get great Nintendo games
If it can keep up, we get some good third party games too.


That's a good optimistic summary, actually.



#27 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 07 June 2012 - 05:00 AM

Nintendo has confirmed it's Power based, which means it's not PowerPC.
Whovian12 -- Nintendo Network ID.

#28 Keviin

Keviin

    Lakitu

  • Members
  • 2,270 posts
  • Fandom:
    Zelda, Mario, Metroid, Resident Evil

Posted 07 June 2012 - 06:37 AM

Those games were in development and the third parties are still getting used to the tech.


But I expect Nintendo devs could use it better, and none of their games look anything near next-gen. Look at Pikmin: not bad, but it could have been way, way better if the console is indeed 'powerful'.
No sig.

#29 Socalmuscle

Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 08 June 2012 - 02:23 PM

Wii U uses a POWER CPU. That is BETTER than PowerPC.

The POWER CPU includes the entire PowerPC instruction set. So it's the best of both worlds.

#30 Keviin

Keviin

    Lakitu

  • Members
  • 2,270 posts
  • Fandom:
    Zelda, Mario, Metroid, Resident Evil

Posted 09 June 2012 - 10:20 AM

Guys, remember Reggie is a marketing major. He can talk out of his ass to hype people up. He said the Zelda tech demo was 1080p (it was 720p), he said E3 was about the games and then gave us a casual fest, and he said at E3 '08 that Animal Crossing was good enough for the hardcore, lol. Reggie doesn't know specs.


I also think Nintendo's vision of a 'powerful' console is different from most of our visions. Because the Wii was underpowered, Nintendo might still think of PS360 graphics as powerful, so if it is just a slight step up from those... Reggie/Nintendo may find it 'tremendously powerful'.
No sig.

#31 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 09 June 2012 - 12:55 PM

Until we know, shouldn't we just wait?
Whovian12 -- Nintendo Network ID.

#32 Stormage09

Stormage09

    Chain Chomp

  • Members
  • 621 posts
  • Fandom:
    Dexter, Walking Dead

Posted 09 June 2012 - 01:12 PM

I don't care about the specs. At the end of the day, if you can have all your Nintendo first parties plus the best third parties on the Wii U with decent graphics, I'm sold. I don't even want the best.

#33 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 09 June 2012 - 01:37 PM

Nintendo has confirmed it's Power based, which means it's not PowerPC.


This is true.

Which automatically confirms a Power7 derivative.

Power5 was from 2004, and the size of a small child; absolutely not usable in any game system, ever.

Power6 is from 2007, but is basically a dual-core grenade if it were to be put in a console.

Power7 is the one in the Watson supercomputer, which is where IBM claims the Wii U CPU technology came from. Worthy of note: Power7 returns the POWER line to out-of-order processing.


 


#34 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 09 June 2012 - 05:41 PM

3Dude, yes you are right. It's based off Power7. Which is why this hocus-pocus about the Wii U being underpowered is crap. There is no way they would pay for a great CPU yet skimp on the GPU. I believe we will see great graphics eventually.
Whovian12 -- Nintendo Network ID.

#35 silverismoney

silverismoney

    Cheep-Cheep

  • Members
  • 100 posts

Posted 09 June 2012 - 05:55 PM

POWER does not automatically mean better than PowerPC. The POWER range has a lot of in-order cores in it, while the more basic PowerPC designs are out-of-order; the POWER range has both in-order and out-of-order cores, and MHz for MHz, out-of-order is better.

The very in-order CPUs are, or were, known as PowerPE; the PE cores are extremely stripped out, like Intel Atoms. Those are the cores used by the X360 and PS3.

Nintendo is either using a Broadway 2, a 32-bit out-of-order CPU with 5 execution units as a system-on-chip, or an all-new CPU core above both Broadway and Xenon, either in-order or out-of-order; it's got to be a custom Power6 or a custom Power7 base design. We have graphics burst pipes, buffers, and totally Wii/GameCube-like GX coding. I bet it's a NINTENDOIZED Power6.

Xenon is 2 instructions per clock cycle; I'll take a gamble that the Wii U CPU is 5 instructions per clock. The X360 has 1 execution unit per core, Broadway has 5 execution units in a core, and Power6 has 2 execution units per core; I bet the Wii U has a minimum of 2, if not more.
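The instructions-per-clock guesses above can be turned into rough peak-throughput numbers. This is a sketch only: the Wii U clock speed and core count below are invented for illustration, since neither was public at the time.

```python
# Theoretical peak MIPS = clock (MHz) x instructions per cycle x cores,
# assuming every core issues its full width every cycle (never true in practice).
def peak_mips(clock_mhz: int, ipc: int, cores: int) -> int:
    return clock_mhz * ipc * cores

# Xenon (Xbox 360): 3200 MHz, 2-issue, 3 cores (known figures).
xenon = peak_mips(3200, 2, 3)        # 19,200 MIPS peak

# Hypothetical Wii U CPU at the poster's guessed 5 instructions per clock,
# with an ASSUMED 1600 MHz clock and 3 cores purely for comparison.
wiiu_guess = peak_mips(1600, 5, 3)   # 24,000 MIPS peak

print(f"Xenon peak: {xenon:,} MIPS, guessed Wii U peak: {wiiu_guess:,} MIPS")

# Peak issue width says little on its own: in-order cores like Xenon stall
# far below peak, which is why out-of-order designs win MHz for MHz.
```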

Edited by silverismoney, 09 June 2012 - 05:58 PM.


#36 Joshua

Joshua

    The Dovahkiin

  • Members
  • 2,334 posts
  • NNID:JoshuaCM
  • Fandom:
    Anything Nintendo, Skyrim and more!

Posted 09 June 2012 - 06:01 PM

If only he could be less vague...



#37 silverismoney

silverismoney

    Cheep-Cheep

  • Members
  • 100 posts

Posted 09 June 2012 - 06:15 PM

Oh, just a thought:

Will the Wii U have fast RAM? I think it may, as a direct replacement for 1T-SRAM: maybe FCRAM, or even IBM eDRAM used as main memory in the same way 1T-SRAM was. I keep thinking there are 3 memory pools: 1) eDRAM, 2) fast RAM, 3) GDDR3; maybe 32 MB / 1 GB / 512 MB.

#38 Ixchel

Ixchel

    Piranha Plant

  • Members
  • 874 posts

Posted 09 June 2012 - 06:34 PM

I am saying either add more detail OR increase frame rate.

The resolution is a canvas: if you have a smaller canvas, then the GPU needs to do a lot less work and can push out a lot more of these canvases per second.

Going from 720p to 1080p you increase the canvas size, so the GPU needs to work harder to fill the canvas. The point is that you're using up resources to get the GPU to generate a 1080p picture. In most PC games that means you add extra scenery; in other games it provides a crisper image.

The point I am trying to make is that console games will probably not add extra scenery and will just provide a crisper image at 1080p than at 720p. The problem is that you could have used that extra rendering power of the GPU to add more texture detail to the graphics at 720p, or to get a more playable frame rate.

I can tell the difference between 30, 60, and 120 Hz. When I compare BF3 to CoD, CoD is just more fluid. When I was at Eurogamer Expo, the TVs on display had clear motion (it's there to trick you into thinking it's higher than 120 Hz) and the images were smoother than anything I had seen.

Frame rates do matter. Watch a video at 24fps and then a TV show or YouTube video (often at 60fps) and you can tell the difference. What my teacher told me ages ago was that your eyes aren't fixed; it depends on the surroundings. At the cinema the lights are dimmed so that your eyes only focus on the film, and your eyes adjust to that frame rate.

If frame rate didn't matter, why was there a split between consumers when they showed The Hobbit in 48fps?

I sort of get where you're going here. The problem is that there is a disconnect between us, because I have never seen a noticeable difference in FPS unless I'm on the PC and it drops below 30, which is when I see it as "lag". So although I understand your reasoning, I've never personally seen 30 vs 60 FPS affecting gameplay or a video. I've also never heard any criticism about The Hobbit or about it being shown at a different fps.
I've heard that movies are shown at a different framerate than shows, but I guess because of the lighting you brought up, it's never looked different to me. It's like arguing to a blind person that company A provides a better paint than company B. I've just never been into the whole FPS scene, so I've never looked into it, lol. I tried looking up comparison videos between 30 and 60 but didn't find any difference. Unless you have examples, I'll just have to trust what you're saying. :P

#39 silverismoney

silverismoney

    Cheep-Cheep

  • Members
  • 100 posts

Posted 09 June 2012 - 07:02 PM

The old this-framerate-that-framerate issue, a bit like the whole colour-bit war. At the end of the day the human eye cannot really detect any frames over 30 a second, so 30 is good enough and 60 far more than enough. "I get 100 frames a second on my PC": sorry, that's medical fact, optical fact, frigging meaningless.

The same goes for colour. "I get 128-bit colour", "oh, I get 256-bit colour", "Xbox 1 supports 32-bit colour". 24-bit colour = every single TV set on earth, and 24-bit = true colour as seen by the human eye.

Fact 1: TVs are 24-bit colour. Fact 2: 24-bit colour is how the human eye sees colour. So what reason would you have to use 64-bit colour? Fanboys do talk a lot of poo.
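The 24-bit figure itself is easy to check with arithmetic: 8 bits per red, green, and blue channel gives about 16.8 million distinct colours, which is the usual justification for calling it "true colour".

```python
# 24-bit "true colour": 8 bits each for the R, G, and B channels.
bits_per_channel = 8
channels = 3

total_bits = bits_per_channel * channels  # 24
distinct_colours = 2 ** total_bits        # 16,777,216

print(f"{total_bits}-bit colour = {distinct_colours:,} colours")

# "32-bit colour" on consoles and PCs is normally the same 24 bits of colour
# plus an 8-bit alpha (transparency) channel, not extra distinguishable colours.
```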

More fps makes it easier on the eye; also, "you can't see more detail than 30 fps" is a myth. I can tell the difference between 30, 40, 50, 60, and 70; after that I can't tell much until I get to 120 fps and compare it to 70.

A quote, from Thomas Jefferson I believe, was that anything lower than 45 fps will cause stress to the eye.

Note: some games are easier to tell the difference in than others.

Jet pilots can tell the difference between fps even when it gets to around, I think, 900.



I hope you realize that what you're saying is total horse carp. The human eye cannot tell at 60 frames, never mind 900 frames. And fighter jets? I'll go get my unicorn and fly up to your jet and we can keep this fantasy going. Oh, by the way, I officially have 20/10 vision, as do a lot of fighter pilots. 900 frames, my butt crack.

#40 MorbidGod

MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 09 June 2012 - 08:08 PM

POWER does not automatically mean better than PowerPC. The POWER range has a lot of in-order cores in it, while the more basic PowerPC designs are out-of-order; the POWER range has both in-order and out-of-order cores, and MHz for MHz, out-of-order is better.

The very in-order CPUs are, or were, known as PowerPE; the PE cores are extremely stripped out, like Intel Atoms. Those are the cores used by the X360 and PS3.

Nintendo is either using a Broadway 2, a 32-bit out-of-order CPU with 5 execution units as a system-on-chip, or an all-new CPU core above both Broadway and Xenon, either in-order or out-of-order; it's got to be a custom Power6 or a custom Power7 base design. We have graphics burst pipes, buffers, and totally Wii/GameCube-like GX coding. I bet it's a NINTENDOIZED Power6.

Xenon is 2 instructions per clock cycle; I'll take a gamble that the Wii U CPU is 5 instructions per clock. The X360 has 1 execution unit per core, Broadway has 5 execution units in a core, and Power6 has 2 execution units per core; I bet the Wii U has a minimum of 2, if not more.


http://www-03.ibm.com/press/us/en/pressrelease/34683.wss

IBM's most advanced technology, specifically announcing a feature from the Watson/Power7 CPU line (lots of embedded DRAM). The Power7 is a lot more powerful than what's in the Xbox 360. Is the Wii U using a Power7? Probably not. It's probably using a custom CPU with specific features to help it in gaming, since the Power7 is mostly used in servers.

But I highly doubt, whatever the case, that Nintendo is using a PowerPC G3 processor, which is what Broadway is based on. To even think that Nintendo only added a second core to the Wii is almost illogical, since nothing you have presented proves that theory.
Whovian12 -- Nintendo Network ID.



