
How to develop for Wii U detailed... Shin'en Multimedia (Nano Assault)



#41 routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 30 May 2013 - 08:36 PM

Not really. Intel chips work with AMD GPUs.

And AMD works fine with POWER too.

I'm guessing that NVIDIA was just too expensive. And so was Intel.

BINGO



#42 Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 30 May 2013 - 08:48 PM

BINGO


Lol

It would also explain NVIDIA being so bent out of shape over nobody choosing them.

And then openly stating the AMD solution in the Xbox is inferior to the AMD solution in the PlayStation. Actually kind of funny and immature that they'd be so hurt as to let that loose.

#43 routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 30 May 2013 - 08:50 PM

Lol

It would also explain NVIDIA being so bent out of shape over nobody choosing them.

And then openly stating the AMD solution in the Xbox is inferior to the AMD solution in the PlayStation. Actually kind of funny and immature that they'd be so hurt as to let that loose.

Sony did approach them and wasn't willing to pay; that's why they came out and said what they said.



#44 Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 30 May 2013 - 08:52 PM

Sony did approach them and wasn't willing to pay; that's why they came out and said what they said.


And I'm sure Microsoft did also in the homework stages. And NVIDIA's comment actually hurt MS a bit in public mindshare.

#45 routerbad

    Lakitu

  • Section Mods
  • 2,013 posts
  • NNID:routerbad
  • Fandom:
    Zelda, Mario, Halo, Star Trek

Posted 30 May 2013 - 08:54 PM

And I'm sure Microsoft did also in the homework stages. And NVIDIA's comment actually hurt MS a bit in public mindshare.

Very true.



#46 MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 31 May 2013 - 05:01 AM

Consider if you shoved code for the Wii U architecture into the 360. It probably wouldn't run at all.

We are talking fundamental architectural differences. You'd have to really rework the code for X1 and PS4 as well.

And no Wii U ports of old games look worse, no matter how poorly the port was done. Not that I've seen, anyway.

I don't know if the Wii U will be neck and neck with the PS4, but it will be very competitive. Many games will likely look the same on both systems. The Wii U will likely have longer loading times due to the RAM amount, but the processing is excellent and the RAM layout very efficient.



Jaguars kinda suck, actually. But that's why there are 8 of them. Lol

I'm sure both companies would have loved to go POWER or Intel, but AMD likely gave them a sweet deal to go APU.


I'm sure Sony didn't want POWER because they wanted to make it easy to develop for. But ultimately, that's exactly why Intel and NVIDIA weren't chosen.
Whovian12 -- Nintendo Network ID.

#47 Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 31 May 2013 - 06:41 AM

Not really. Intel chips work with AMD GPUs.

And AMD works fine with POWER too.

I'm guessing that NVIDIA was just too expensive. And so was Intel.

That's not the point; maybe you should rethink what I said. IBM and AMD would be fine, but AMD would most certainly not work with Intel, the company that actively (and illegally) tried (and may still be trying) to force them off the CPU market.



#48 Goodtwin

    Bullet Bill

  • Members
  • 356 posts

Posted 31 May 2013 - 08:11 AM

Very, VERY few GAF members have a grasp on what the Jaguar cores are actually capable of. These are the same people who thought a low-clocked chip with high IPC and efficiency couldn't exist, right up until the PS4 clocks were announced.

 

It is rather comical. The same people who were bashing the Wii U for its tri-core, low-clocked CPU, and who were trying to convince everyone that this would cripple the system in the physics and AI department, truly went silent once the Jaguar CPU was announced. It's amazing how the CPU can go from being vital to a next-gen system to nobody wanting to even talk about it. Now it's all about the RAM and GPU, and suddenly the CPU is an afterthought. Why was it so important when everyone assumed the PS4 and X1 would have high-end CPUs, but now that it has a tablet CPU it's of no consequence? It's pretty comical. Not to mention that the Wii U CPU is pretty darn efficient; it's definitely no i5 or i7, but it's definitely far more capable than any of the GAF haters wanted to admit.
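
Just to spell out why "low clocked" doesn't automatically mean slow: rough throughput is clock times IPC, so a chip at half the clock with twice the instructions per cycle gets the same work done per second. A toy C calculation (every number here is invented for illustration, not a real console spec):

    #include <stdio.h>

    /* Back-of-the-envelope throughput ~ clock (GHz) x IPC (instructions/cycle).
     * All figures below are hypothetical, for illustration only. */
    int main(void) {
        double low_clock_high_ipc = 1.6 * 2.0; /* 1.6 GHz at ~2.0 IPC */
        double high_clock_low_ipc = 3.2 * 1.0; /* 3.2 GHz at ~1.0 IPC */
        printf("low clock, high IPC:  %.1f G instructions/s\n", low_clock_high_ipc);
        printf("high clock, low IPC:  %.1f G instructions/s\n", high_clock_low_ipc);
        /* Both print 3.2: clock speed alone tells you very little. */
        return 0;
    }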

 

The developer not using the huge caches properly is not a hardware problem; it's a coding problem. If the developer isn't making full use of the large caches, then don't blame the hardware. If you're constantly stalling the CPU with cache misses, and you have 3MB of CPU cache at your disposal, then the developer needs to look for the problem, because the CPU should not be completing work faster than your code fills the cache. Same goes for the eDRAM in the GPU: don't complain because your code doesn't make good use of the eDRAM. The GPU is centered around this very tight memory management. I don't think very many developers have really grasped how this works, but those that do will find huge performance gains.
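
To make the cache point concrete, here's a minimal C sketch of what "filling the cache" versus "stalling on misses" looks like in practice. The 1024x1024 array and the 64-byte cache line are generic PC assumptions, not Wii U specifics; both functions do identical arithmetic, and only the access pattern changes:

    #include <stdio.h>
    #include <stddef.h>

    #define N 1024

    static float a[N][N]; /* 4 MB, deliberately larger than a 3MB cache */

    /* Strided traversal: each access jumps N*sizeof(float) = 4KB ahead,
     * so nearly every load lands on a new cache line and can miss. */
    static float sum_column_major(void) {
        float s = 0.0f;
        for (size_t j = 0; j < N; j++)
            for (size_t i = 0; i < N; i++)
                s += a[i][j];
        return s;
    }

    /* Sequential traversal: consecutive loads share a 64-byte line, so one
     * miss serves ~16 floats. Same work, far fewer stalls. */
    static float sum_row_major(void) {
        float s = 0.0f;
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                s += a[i][j];
        return s;
    }

    int main(void) {
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                a[i][j] = 1.0f;
        printf("%f %f\n", sum_column_major(), sum_row_major());
        return 0;
    }

The same idea carries over to the eDRAM: keep the working set resident and stream through it, rather than bouncing out to main RAM on every access.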



#49 Cloud Windfoot Omega

    Cheep-Cheep

  • Members
  • 148 posts

Posted 31 May 2013 - 08:59 AM

It is rather comical. The same people who were bashing the Wii U for its tri-core, low-clocked CPU, and who were trying to convince everyone that this would cripple the system in the physics and AI department, truly went silent once the Jaguar CPU was announced. It's amazing how the CPU can go from being vital to a next-gen system to nobody wanting to even talk about it. Now it's all about the RAM and GPU, and suddenly the CPU is an afterthought. Why was it so important when everyone assumed the PS4 and X1 would have high-end CPUs, but now that it has a tablet CPU it's of no consequence? It's pretty comical. Not to mention that the Wii U CPU is pretty darn efficient; it's definitely no i5 or i7, but it's definitely far more capable than any of the GAF haters wanted to admit.

The developer not using the huge caches properly is not a hardware problem; it's a coding problem. If the developer isn't making full use of the large caches, then don't blame the hardware. If you're constantly stalling the CPU with cache misses, and you have 3MB of CPU cache at your disposal, then the developer needs to look for the problem, because the CPU should not be completing work faster than your code fills the cache. Same goes for the eDRAM in the GPU: don't complain because your code doesn't make good use of the eDRAM. The GPU is centered around this very tight memory management. I don't think very many developers have really grasped how this works, but those that do will find huge performance gains.

We still do not know 100% what the CPU or GPU is. Hell, it's not even been tested for performance. How anyone is actually coming up with the numbers they are getting from just the scans and throwing them around as fact is a tad silly. So saying it's not as strong as an i5 or i7 is still a little premature.



#50 Socalmuscle

    Hammer Bro.

  • Members
  • 1,677 posts

Posted 31 May 2013 - 09:23 AM

That's not the point; maybe you should rethink what I said. IBM and AMD would be fine, but AMD would most certainly not work with Intel, the company that actively (and illegally) tried (and may still be trying) to force them off the CPU market.

 

Dude. Chill.

This is what you said:

 

 

"... well  no one wanted to go Nvidia, so that would rule out intel anyway.
 
plus in my  experience,  intel plays rough."

 

 

 

I basically pointed out that the idea that no NVIDIA = no Intel is a fallacy. That's all.

 

And YES, Intel CPUs play nicely with AMD GPUs. But each company obviously wants to see only their own hardware in a system. However, a vendor choosing one does not force them to exclude the other. AMD was cheaper. Though their CPUs are not as good, they are cheaper. That's it.

 

Ruling out a GPU company does not in any way rule out a CPU company. And that really was the point. Maybe you should rethink what you said.

 

 

Last December, AMD sent GPUs to be benchmarked by an independent company for review. Guess which processor they sent with them... an Intel Core i7. Why? Because their own CPUs would make their GPU perform worse.

 

http://techreport.co...radeon-hd-8790m

 

Even AMD knows that Intel has the better CPU. 

 

APU was chosen because of cost.



#51 MorbidGod

    Hammer Bro.

  • Members
  • 1,717 posts

Posted 31 May 2013 - 03:38 PM

That's not the point; maybe you should rethink what I said. IBM and AMD would be fine, but AMD would most certainly not work with Intel, the company that actively (and illegally) tried (and may still be trying) to force them off the CPU market.


AMD can always make their own architecture, or license the ARM architecture and become strictly mobile. With Apple rumored to go all-in on ARM and Microsoft allowing ARM for Windows RT, they would be just fine.

Dude. Chill.

This is what you said:


"... well no one wanted to go Nvidia, so that would rule out intel anyway.

plus in my experience, intel plays rough."




I basically pointed out that the idea that no NVIDIA = no Intel is a fallacy. That's all.

And YES, Intel CPUs play nicely with AMD GPUs. But each company obviously wants to see only their own hardware in a system. However, a vendor choosing one does not force them to exclude the other. AMD was cheaper. Though their CPUs are not as good, they are cheaper. That's it.

Ruling out a GPU company does not in any way rule out a CPU company. And that really was the point. Maybe you should rethink what you said.


Last December, AMD sent GPUs to be benchmarked by an independent company for review. Guess which processor they sent with them... an Intel Core i7. Why? Because their own CPUs would make their GPU perform worse.

http://techreport.co...radeon-hd-8790m

Even AMD knows that Intel has the better CPU.

APU was chosen because of cost.


Exactly.

Although, their GPUs are amazing. I have always preferred ATi over NVIDIA.
Whovian12 -- Nintendo Network ID.



