
Interesting new article on Wii U's RAM bandwidth



#41 Chaz

Chaz

    Red Koopa Troopa

  • Members
  • 50 posts
  • Fandom:
    Pikmin, Animal Crossing

Posted 27 January 2013 - 11:43 AM

Your reading comprehension is bad; he clearly separates speculation from facts.


I absorbed the article just fine, thank you. Regardless, his "facts" portion of the article starts off citing an anonymous developer and theories on how much eDRAM the Wii U contains and where he thinks it is, and ends with a "possible" memory diagram. I don't see any factual information there, which is the crux of my problem with this article: lack of facts and completely unsubstantiated. Perhaps you ought to read it again yourself.

Having named developers straight up say 'no, it's not bandwidth starved' is neither vague nor media soundbites.


That is extremely vague and could be easily taken out of context. Developers are constantly making statements that contradict what others have said, so the context of the statement is important.

No, you aren't.


Ah right, so I suppose I should jump into homebrew like you so that I can pull technical stats out of my rear end to make myself sound important on game forums? That's a great way to win arguments with kids, isn't it? ;)

Edited by Chaz, 27 January 2013 - 09:39 PM.


#42 cannonshane

cannonshane

    Piranha Plant

  • Members
  • 925 posts
  • Fandom:
    Luigi

Posted 27 January 2013 - 08:57 PM

When are people going to get a life and get over this sh1t

Staff Writer at http://www.allagegaming.com/

 

Strayaaaaaaaaaa Mate


#43 Dusean17

Dusean17

    Blooper

  • Members
  • 184 posts
  • Fandom:
    PATAPON | LBP | DIGIMON | ZELDA | KIRBY

Posted 27 January 2013 - 09:27 PM

I heard the next MS and Sony consoles will have double the RAM bandwidth of the U.

Probably will, just like last generation....>_>
I don't understand what more they can do in terms of graphics... it would be awesome if we had in-game graphics comparable to movies.

Edited by Dusean17, 27 January 2013 - 09:27 PM.



#44 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 28 January 2013 - 04:50 AM

I absorbed the article just fine, thank you.

We will see.

Regardless, his "facts" portion of the article starts off citing an anonymous developer and theories on how much eDRAM the Wii U contains and where he thinks it is, and ends with a "possible" memory diagram.

Well, that didn't go very far. The 'facts' you erroneously thought were describing the Wii U are simply facts about RAM in general, which is why the citations come from various sources like Xbox 360 documentation and interviews. He simply applied that reasoning logically to the Wii U. These things don't need to be 'proven'; they are common knowledge.

I don't see any factual information there, which is the crux of my problem with this article: lack of facts and completely unsubstantiated.

And that's why your reading comprehension is poor... and I've got some REALLY bad news for you, but that will come later.

Perhaps you ought to read it again yourself.

You are so witty.


That is extremely vague and could be easily taken out of context.

Only if you have no knowledge base on the material whatsoever.

Developers are constantly making statements that contradict what others have said, so the context of the statement is important.

No, they aren't. The only devs who have spoken about the memory all state there is no bandwidth issue. Michel Ancel, Manfred Linzner, and Joel Kinnunen all say the same thing. Although, Frozenbyte's Joel did add a piece of additional context: he compared it directly to the 360/PS3 and said it outperformed them. Guess that's why the Wii U got the PC version of Trine 2 instead of the watered-down PS3/360 version.



Ah right, so I suppose I should jump into homebrew like you so that I can pull technical stats out of my rear end to make myself sound important on game forums? That's a great way to win arguments with kids, isn't it?

Oh, yay, ad hominem. I'd actually suggest college; you can't exactly just jump into 'homebrew' when you can't actually do anything.


This 'article' is very, very, very, very basic engineering. It's obvious, painfully obvious, and the fact that so many self-professed 'well versed' experts are taking issue with it is, to put it frankly, embarrassing.

But I have some bad news for those individuals: time has already proven this article correct.

As was stated at the beginning of the article:

'This underachievement on paper in front of systems out 7 years earlier worries the “spec-aware” gamers about the Wii U viability.... there are apprehensions concerning the Wii U ability to quickly manage the CPU and GPU accesses to large amount of data such as detailed textures stored in the RAM, essential for technically ambitious games. Some of them consist of free roaming in huge spaces that entail hefty data streaming and transfers. As a result, could this bandwidth turns into an obstacle for the Wii U to obtain the next Elder Scrolls or GTAVI without excessive downscale'

And what Anandtech is worried about is true: a system relegated to 12.8GB/s of bandwidth would be incapable of games like Skyrim, Dragon's Dogma, Red Dead, etc. with the same scale and fidelity as the 360/PS3.

Enter Nintendo Direct.
[image: screenshot from the Nintendo Direct X trailer]
Oh snap.

[image: second screenshot from the Nintendo Direct X trailer]
Whoopsies.

Edited by 3Dude, 28 January 2013 - 05:13 AM.


 


#45 Nollog

Nollog

    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 28 January 2013 - 08:36 AM

You are so witty.

"Before the release of its next-gen rivals, the Wii U is the console with the most memory, so much that developers like Ubisoft’s Michel Ancel praised this volume."
This states that the above people praised the amount of RAM, not the speed, which is the issue.

"... It’s the same flattering portray for latency, the main recipient of Shin’en’s Manfred Linzner compliments on the system inour exclusive interview."
Let's look at exactly what they said:
"When testing our first code on Wii U we were amazed how much we could throw at it without any slowdowns, at that time we even had zero optimizations. The performance problem of hardware nowadays is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U. They put a lot of thought on how CPU, GPU, caches and memory controllers work together to amplify your code speed. For instance, with only some tiny changes we were able to optimize certain heavy load parts of the rendering pipeline to 6x of the original speed, and that was even without using any of the extra cores."
"We didn’t have such problems. The CPU and GPU are a good match. As said before, today’s hardware has bottlenecks with memory throughput when you don’t care about your coding style and data layout. This is true for any hardware and can’t be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls. Also Nintendo took care that other components like the Wii U GamePad screen streaming, or the built-in camera don’t put a burden on the CPU or GPU."

"Q: On the lauded general organization of the system memory, surely you’re referring in part to the important role of the edram in the Wii U? And about the ram, are other parameters than latency such as bandwidth favorable compare to current platforms?
A: I’m not able to be precise on that but for us as a developer we see everything works perfectly together. Many systems in the past forced the programmers to shift around their data and code quite a lot to fight against latency."
This is pretty much saying that the GPGPU design is the saving grace for the system.

"In general all those DRAM numbers are not of much importance. Much more for example are the size and speed of caches for each core in the CPU. Because DRAM is always slow compared to caches speed."
Way to go sherlock. Probably not even a developer.
Then the writer goes on about edram, which isn't new and 32MB is the best!
Then reaching at straws with it's new! a memory controller is better when it's new!
Then the writer cuts the 360's bandwidth in half by having their old lame controller be controlling the flow of data like an old old old.
But the Wii U's got chips all over the place connected by separate buses and memory controllers so it's really super-fast! and new!
Followed by baseless speculation about using the RAM as a scratchpad.
And baseless speculation about DDR3 > GDDR3 when that's just plain silly. Again using the it's new! excuse.
AMD and Nintendo secretly made a secret texture secret that AMD didn't think of as the guys who have been making gpu's for decades and it's only on Nintendo!
I'll ignore the anonymous developer conversation, because of the one they quoted earlier obviously being an idiot.
"we had no issues at all with memory bandwidth on Trine 2: Director’s Cut."
The writer tacks on the implication that what Frozenbyte meant was that the Wii U is not bandwidth starved, but we can't say that because of NDAs :'( Nintendo wants you to think it's weak.
What a silly thing to tack on to someone's simple statement saying they haven't had problems developing for the system, within its limitations. Like every other console ever, except the PS3's Cell.
Followed by more stuff of little value.

I agree with Shin'en though; I have faith that Nintendo didn't just grab the cheapest chips they could find and glue them to a cardboard circuit board. The truth of how the RAM is managed will be revealed some day.
Bandwidth is a completely separate thing from how well the memory is organised though.



#46 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 28 January 2013 - 09:41 AM

"Before the release of its next-gen rivals, the Wii U is the console with the most memory, so much that developers like Ubisoft’s Michel Ancel praised this volume."
This states that the above people praised the amount of RAM, not the speed, which is the issue.

No, he didn't; you just don't understand what this means. All three are directly linked. If either the bandwidth were a problem or the latency too high, he would be unable to move those textures out of that large capacity, into the game, and onto the screen. This directly states there is no bandwidth issue.

"... It’s the same flattering portray for latency, the main recipient of Shin’en’s Manfred Linzner compliments on the system inour exclusive interview."
Let's look at exactly what they said:
"When testing our first code on Wii U we were amazed how much we could throw at it without any slowdowns, at that time we even had zero optimizations. The performance problem of hardware nowadays is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U. They put a lot of thought on how CPU, GPU, caches and memory controllers work together to amplify your code speed. For instance, with only some tiny changes we were able to optimize certain heavy load parts of the rendering pipeline to 6x of the original speed, and that was even without using any of the extra cores."
"We didn’t have such problems. The CPU and GPU are a good match. As said before, today’s hardware has bottlenecks with memory throughput when you don’t care about your coding style and data layout. This is true for any hardware and can’t be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls. Also Nintendo took care that other components like the Wii U GamePad screen streaming, or the built-in camera don’t put a burden on the CPU or GPU."
"Q: On the lauded general organization of the system memory, surely you’re referring in part to the important role of the edram in the Wii U? And about the ram, are other parameters than latency such as bandwidth favorable compare to current platforms?
A: I’m not able to be precise on that but for us as a developer we see everything works perfectly together. Many systems in the past forced the programmers to shift around their data and code quite a lot to fight against latency."

This is pretty much saying that the GPGPU design is the saving grace for the system.

What? This last sentence is nonsense; GPGPU isn't a 'design' for a GPU, it's a way of using a GPU.

But that's not even the point. Once again, this directly states bandwidth is not a problem; you just don't have the knowledge base to recognize it. Latency and bandwidth, as stated earlier, are directly linked. Bandwidth is how much can be moved at once; latency is how fast you have access to it, or rather, how long you have to wait until you can (typically measured in ns or even cycles). If the bandwidth didn't move enough data, then no matter how fast you could access it, it wouldn't be enough: you would have a bottleneck via bandwidth, because your LATENCY would be capable of far more, but your bandwidth would be holding you back.
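
To make the distinction concrete, here is a rough back-of-the-envelope sketch in Python (illustrative numbers only, not measured Wii U figures): the time to service a request is the latency to the first byte plus the transfer time dictated by bandwidth, so small random accesses are latency-bound while large streaming reads are bandwidth-bound.

    # Illustrative only: how latency and bandwidth both shape one memory access.
    # total time = latency to first byte + (bytes transferred / bandwidth)
    def fetch_time_ns(block_bytes, latency_ns, bandwidth_gb_s):
        transfer_ns = block_bytes / (bandwidth_gb_s * 1e9) * 1e9
        return latency_ns + transfer_ns

    # A 64-byte cache-line fill is dominated by latency (~105 ns here).
    print(fetch_time_ns(64, latency_ns=100, bandwidth_gb_s=12.8))
    # A 1 MB streaming read is dominated by bandwidth (~78,000 ns here).
    print(fetch_time_ns(1_000_000, latency_ns=100, bandwidth_gb_s=12.8))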


"In general all those DRAM numbers are not of much importance. Much more for example are the size and speed of caches for each core in the CPU. Because DRAM is always slow compared to caches speed."
Way to go sherlock. Probably not even a developer.

What he said is 100% true, Sherlock. CPU caches blow anything else away. They need to be the fastest. Sherlock.

Then the writer goes on about edram, which isn't new and 32MB is the best!

You do realize this does not happen. He reiterates many arguments, many of which are pro Wii U and not his own, and he proceeds to shoot many of them down. This is just horrible comprehension.

Then reaching at straws with it's new! a memory controller is better when it's new!

Advancements in memory controllers are not imaginary.

Then the writer cuts the 360's bandwidth in half by having their old lame controller be controlling the flow of data like an old old old.

He doesn't cut the bandwidth in half; he shows the numbers being bandied around are out of context. It's an aggregated rate for read and write, not actual real-world performance, which is how it gets presented once that specific piece of context is lost. And he does it with official 360 documentation. It's a similar situation to how the PS2 was said to do 100 million polygons on paper while the lowly GameCube could only do 15 million, except the Cube numbers turned out to be real-world, rendered, transformed, multitextured, lit polygons, while the PS2 number... wasn't, and never, ever came remotely close to the Cube numbers, let alone its own peak theoretical rate.
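
For what it's worth, the arithmetic being argued about looks like this (a minimal sketch; 22.4 GB/s is the commonly quoted 360 GDDR3 peak, and the even read/write split is purely an assumption for illustration):

    # Illustrative only: a headline "peak bandwidth" figure read two ways.
    # If the quoted number aggregates read and write traffic, sustained
    # one-direction throughput is lower than the headline suggests.
    peak_aggregate_gb_s = 22.4   # commonly quoted 360 GDDR3 peak (example)
    read_share = 0.5             # assumed even read/write split

    print(peak_aggregate_gb_s * read_share)        # ~11.2 GB/s of reads
    print(peak_aggregate_gb_s * (1 - read_share))  # ~11.2 GB/s of writes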


But the Wii U's got chips all over the place connected by separate buses and memory controllers so it's really super-fast! and new!

This is all made up for effect. I'm honestly disappointed in you.

Followed by baseless speculation about using the RAM as a scratchpad.

That's not baseless speculation. GameCube and Wii had scratchpad memory (and they aren't alone). In fact, it would be rather impossible for the Wii U to have a Wii mode without it.

And baseless speculation about DDR3 > GDDR3 when that's just plain silly. Again using the it's new! excuse.

I'd suggest looking at the specifications.

AMD and Nintendo secretly made a secret texture secret that AMD didn't think of as the guys who have been making gpu's for decades and it's only on Nintendo!

The situation as you describe it is fictional; it was never described that way.

Advancements in texture compression technology aren't the dark magic you make them out to be, and have many times over the years been responsible for large performance and efficiency gains. It also wouldn't be the first time Nintendo capitalized on it: GameCube came equipped with hardware support for a then-new, advanced form of texture compression, S3 texture compression (S3TC). Sony, on the other hand, did not have S3TC until the PSP.
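
As a rough illustration of why that matters for bandwidth as well as capacity (the per-texel sizes below are the standard S3TC/DXT1 and RGBA8 figures; the 2048x2048 texture is an arbitrary example):

    # S3TC/DXT1 stores a 4x4 texel block in 8 bytes (0.5 bytes per texel),
    # versus 4 bytes per texel for uncompressed RGBA8.
    def texture_bytes(width, height, bytes_per_texel):
        return width * height * bytes_per_texel

    uncompressed = texture_bytes(2048, 2048, 4.0)   # RGBA8
    compressed   = texture_bytes(2048, 2048, 0.5)   # DXT1 / S3TC

    print(uncompressed / 2**20)        # 16.0 MiB
    print(compressed / 2**20)          # 2.0 MiB
    print(uncompressed / compressed)   # 8x less data to store and to fetch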


I'll ignore the anonymous developer conversation, because of the one they quoted earlier obviously being an idiot.

And of course, you are a subject matter expert.

"we had no issues at all with memory bandwidth on Trine 2: Director’s Cut."
The writer tacks on the implication that what Frozenbyte meant was that the Wii U is not bandwidth starved, but we can't say that because of NDAs :'( Nintendo wants you to think it's weak.
What a silly thing to tack on to someone's simple statement saying they haven't had problems developing for the system, within its limitations. Like every other console ever, except the PS3's Cell.

First off, he spoke to them directly, they specifically answered his bandwidth question, which kills the crap out of your argument.

Second, the PS3/360 DID have issues, notably with bandwidth, from the hundreds of light sources in the deferred rendering engine and the many reflections and refractions, which is why they got gimped versions while the Wii U got the PC version.

Followed by more stuff of little value.

You've done an excellent job so far discerning what information is valuable and what isn't.

I agree with Shin'en though; I have faith that Nintendo didn't just grab the cheapest chips they could find and glue them to a cardboard circuit board. The truth of how the RAM is managed will be revealed some day.

It already has, in this last Nintendo Direct, with X. This article was on the money.

Bandwidth is a completely separate thing from how well the memory is organised though.

No, it's not; bandwidth is a direct consideration when creating a memory organization.

And, as stated in this article, thanks TO said organization, the 12.8GB/s bandwidth of the Hynix main memory is NOT a bottleneck restricting the ability to stream the large data structures required for large open-world games, as Anandtech said it was, before X pimpslapped the rainbow out of them.
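
For reference, a figure in that ballpark falls straight out of the DDR3 arithmetic, assuming the commonly reported configuration of four DDR3-1600 chips each on a 16-bit interface (an assumption, not an official Nintendo spec):

    # Peak DDR3 bandwidth = transfers per second * total bus width in bytes.
    transfers_per_s = 1600e6   # DDR3-1600 -> 1600 MT/s
    bits_per_chip   = 16       # assumed per-chip interface width
    num_chips       = 4        # assumed chip count

    bus_bytes = bits_per_chip * num_chips / 8   # 64-bit bus -> 8 bytes/transfer
    print(transfers_per_s * bus_bytes / 1e9)    # 12.8 GB/s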


This is like, very, very basic.


Ugh, this is getting embarrassing. Why do you guys keep doing this? Why do you put in the effort to fabricate these responses, but no effort towards the education required to know what this is about?

And the argument's over; the Nintendo Direct just proved this guy right.


 


#47 Nollog

Nollog

    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 28 January 2013 - 10:07 AM

No, he didn't; you just don't understand what this means. All three are directly linked. If either the bandwidth were a problem or the latency too high, he would be unable to move those textures out of that large capacity, into the game, and onto the screen. This directly states there is no bandwidth issue.

No, that's unrelated to what the developer was stating. He specifically mentions size, not latency or speed.
If something is large, it's just large.
You can compress a large texture to make it small, as the other developer mentioned.

What? This last sentence is nonsense; GPGPU isn't a 'design' for a GPU, it's a way of using a GPU.

But that's not even the point. Once again, this directly states bandwidth is not a problem; you just don't have the knowledge base to recognize it. Latency and bandwidth, as stated earlier, are directly linked. Bandwidth is how much can be moved at once; latency is how fast you have access to it, or rather, how long you have to wait until you can (typically measured in ns or even cycles). If the bandwidth didn't move enough data, then no matter how fast you could access it, it wouldn't be enough: you would have a bottleneck via bandwidth, because your LATENCY would be capable of far more, but your bandwidth would be holding you back.


Latency is how long an instruction is delayed, or takes to complete.
It's basic computing that buses are a limiting factor here; eliminating some of that bottlenecking is where the GPGPU design Iwata was proud of delivers significant increases in performance.


What he said is 100% true, Sherlock. CPU caches blow anything else away. They need to be the fastest. Sherlock.

True, but obvious. Hence the sherlock reference, sherlock.
But every CPU in your mom's closet and mine has cache; that doesn't make the Wii U anything special.

You do realize this does not happen. He reiterates many arguments, many of which are pro Wii U and not his own, and he proceeds to shoot many of them down. This is just horrible comprehension.

No, I comprehended it correctly.

Advancements in memory controllers are not imaginary.

Neither are cache speeds, but they still don't debunk the slow RAM speed.


He doesn't cut the bandwidth in half; he shows the numbers being bandied around are out of context. It's an aggregated rate for read and write, not actual real-world performance, which is how it gets presented once that specific piece of context is lost. And he does it with official 360 documentation. It's a similar situation to how the PS2 was said to do 100 million polygons on paper while the lowly GameCube could only do 15 million, except the Cube numbers turned out to be real-world, rendered, transformed, multitextured, lit polygons, while the PS2 number... wasn't, and never, ever came remotely close to the Cube numbers, let alone its own peak theoretical rate.

Yes he does. He splits the 20-whatever jiggas into in and out, read and write.
Cutting the bandwidth in half in practice, which is why all developers read and write at the same time when they're using RAM in games, because they're obviously too idiotic to see that could be an issue.

This is all made up for effect. I'm honestly disappointed in you.


k

That's not baseless speculation. GameCube and Wii had scratchpad memory (and they aren't alone). In fact, it would be rather impossible for the Wii U to have a Wii mode without it.

It's not used as the writer describes. Again they take something and ride it with their imaginary unicorns.

I'd suggest looking at the specifications.

Why bother? We have cache and edram!

The situation as you describe it is fictional; it was never described that way.

Yes it was. The writer clearly says Nintendo have a magic texture magic that shaves 100MB off Toki Tori 2, based on the developer making a tweet about a hardware feature, which could be anything.
Baseless speculation.

Advancements in texture compression technology aren't the dark magic you make them out to be, and have many times over the years been responsible for large performance and efficiency gains. It also wouldn't be the first time Nintendo capitalized on it: GameCube came equipped with hardware support for a then-new, advanced form of texture compression, S3 texture compression (S3TC). Sony, on the other hand, did not have S3TC until the PSP.

And of course, you are a subject matter expert.

I have an understanding of computers, based on many years of study, practical employment and such, but I don't think of myself as an expert.
I think of myself as someone who knows a little more than the average consumer of computing devices.
Perhaps you should do the same?


First off, he spoke to them directly, they specifically answered his bandwidth question, which kills the crap out of your argument.

Second, the PS3/360 DID have issues, notably with bandwidth, from the hundreds of light sources in the deferred rendering engine and the many reflections and refractions, which is why they got gimped versions while the Wii U got the PC version.

k, dat new graphics technology eh? something new sure is the bestest.

You've done an excellent job so far discerning what information is valuable and what isn't.

It already has, in this last Nintendo Direct, with X. This article was on the money.


No, the X video doesn't prove this article; it merely proves that the Wii U is next-generation hardware and that the Monolith developers are competent.
To claim anything else is plain stupid, I'm sorry.

No, it's not; bandwidth is a direct consideration when creating a memory organization.

And, as stated in this article, thanks TO said organization, the 12.8GB/s bandwidth of the Hynix main memory is NOT a bottleneck restricting the ability to stream the large data structures required for large open-world games, as Anandtech said it was, before X pimpslapped the rainbow out of them.

This is like, very, very basic.

Yes, but the sum of the whole isn't the individual chip now, is it?
Do we know how they've organised their bus flow? That magic new memory controller: what if they sync the data from each RAM chip so they flow concurrently, thus making the slow individual chips a non-issue in practice?

Ugh, this is getting embarrassing. Why do you guys keep doing this? Why do you put in the effort to fabricate these responses, but no effort towards the education required to know what this is about?

And the argument's over; the Nintendo Direct just proved this guy right.


Says the gal posting inside quotes so it's harder to reply to her...



#48 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 28 January 2013 - 10:23 AM

Yes, but the sum of the whole isn't the individual chip now, is it?
Do we know how they've organised their bus flow? That magic new memory controller: what if they sync the data from each RAM chip so they flow concurrently, thus making the slow individual chips a non-issue in practice?
Says the gal posting inside quotes so it's harder to reply to her...


What? No, no, guy, there are no hidden power levels in the main RAM. We know everything about the chips from the part number printed on them, and we can see how they're bussed to the MCM by literally looking at how they're bussed.

And the data already flows from all chips at once, man; the chips are sold as a unit, and performance is measured as one whole unit. It's not each individual chip; it's already a synchronized working whole. That bandwidth number is not going up. That's set in stone.

It's simply not an issue because of the memory hierarchy. That bandwidth doesn't go straight to the CPU and GPU. It's a holding pool that the eDRAM and CPU caches dip into (very quickly) to take chunks they can reuse a lot at blistering speeds. Which is what this article is about, and which X irrevocably proved.
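
A minimal sketch of that hierarchy argument, with an assumed (not measured) on-die hit rate: if only the misses go out to main RAM, the 12.8 GB/s pool only has to carry a fraction of the total request traffic.

    # Illustrative only: main RAM carries just the miss traffic.
    dram_bw_gb_s = 12.8   # main memory bandwidth figure discussed above
    hit_rate     = 0.9    # assumed fraction of traffic served by eDRAM/caches

    miss_rate = 1 - hit_rate
    # Total request traffic sustainable before main RAM becomes the limit:
    print(dram_bw_gb_s / miss_rate)   # 128.0 GB/s at this assumed hit rate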

Edited by 3Dude, 28 January 2013 - 10:25 AM.


 


#49 Nollog

Nollog

    Chain Chomp

  • Banned
  • 776 posts
  • NNID:Nollog
  • Fandom:
    Creepy Stalker Girl

Posted 28 January 2013 - 10:44 AM

What? No, no, guy, there are no hidden power levels in the main RAM. We know everything about the chips from the part number printed on them, and we can see how they're bussed to the MCM by literally looking at how they're bussed.

And the data already flows from all chips at once, man; the chips are sold as a unit, and performance is measured as one whole unit. It's not each individual chip; it's already a synchronized working whole. That bandwidth number is not going up. That's set in stone.

It's simply not an issue because of the memory hierarchy. That bandwidth doesn't go straight to the CPU and GPU. It's a holding pool that the eDRAM and CPU caches dip into (very quickly) to take chunks they can reuse a lot at blistering speeds. Which is what this article is about, and which X irrevocably proved.

Well then the Wii U's got the worst RAM speed in history :'(



#50 Alex Atkin UK

Alex Atkin UK

    Boo

  • Members
  • 528 posts

Posted 28 January 2013 - 12:11 PM

Riiiight. The CPU and GPU are able to rely on fast cache memory better than ever before, thus not needing a huge increase in [main] memory bandwidth over previous consoles, but it's the END OF THE WORLD!

Geez, if it were half as bad as you seem to be making out, how on earth did they get any ports working in the first place?

Edited by Alex Atkin UK, 28 January 2013 - 12:11 PM.

Sheffield 3DS | Steam & XBOX: Alex Atkin UK | PSN & WiiU: AlexAtkinUK

 

How to improve the Wii U download speed.


#51 golf1410

golf1410

    Paragoomba

  • Members
  • 23 posts

Posted 28 January 2013 - 03:41 PM

Probably will, just like last generation....>_>
I don't understand what more they can do in terms of graphics... it would be awesome if we had in-game graphics comparable to movies.


They can't do much more on graphics, as all TV displays have to improve first.

#52 Dusean17

Dusean17

    Blooper

  • Members
  • 184 posts
  • Fandom:
    PATAPON | LBP | DIGIMON | ZELDA | KIRBY

Posted 28 January 2013 - 05:39 PM

They can't do much more on graphics, as all TV displays have to improve first.


I don't think that's entirely true....



#53 Chaz

Chaz

    Red Koopa Troopa

  • Members
  • 50 posts
  • Fandom:
    Pikmin, Animal Crossing

Posted 28 January 2013 - 10:41 PM

Yeah yeah... eDRAM is blazing fast and on-die is even more efficient than the Xbox's configuration. None of this is news, and the reason I called out this article is because it offers nothing new and, worse yet, nothing in it is really substantiated. For the record, I never suggested that anything written here wasn't true and I don't throw my hat in the ring with all the people claiming the Wii U is crippled... I just think this article is fluff and fanboy rhetoric.

No, they aren't. The only devs who have spoken about the memory all state there is no bandwidth issue.


That is textbook anecdotal evidence. Should we also blindly accept that the Wii U's CPU is "horrible and slow" based on the experiences of Metro and DICE?

Just because these two developers had no problems doesn't mean it isn't a problem for others, especially considering that neither of those two engines is doing any heavy streaming of assets; Frozenbyte claims that the Wii U had more memory than they could use, and that they were simply loading everything into memory each level and keeping it there. You could make the argument that larger, open-world game developers might take the opposite view based on their games' framerate issues, or low-resolution textures (in the case of ZombiU).

That's what I mean by context. Out of context, everything in this article is entirely theoretical and might have little basis in reality.

Oh, yay, ad hominem. I'd actually suggest college; you can't exactly just jump into 'homebrew' when you can't actually do anything.


No thanks, I make a decent living doing visual effects for movies and games, so to not get paid to diddle around with a game console would be a rather large step backwards. Besides which, homebrew development is like this article: all theory and no practice. For you, I'd actually suggest maybe getting some more fiber in your diet. I think it could do wonders for your over-dramatic demeanor...think it over. Maybe switch out your brand of tampons while you're at it...you definitely appear to have a blockage somewhere.

But I have some bad news for those individuals: time has already proven this article correct.


Well, not really...to be fair, that game isn't even out yet. For all we know, that might be running on a PC in that trailer :)

#54 3Dude

3Dude

    Whomp

  • Section Mods
  • 5,482 posts

Posted 29 January 2013 - 05:04 AM

I just think this article is fluff and fanboy rhetoric.

Except this article systematically shoots down a large amount of fanboy rhetoric.

That is textbook anecdotal evidence. Should we also blindly accept that the Wii U's CPU is "horrible and slow" based on the experiences of Metro and DICE?

If you paid attention to context you would have found out the Metro devs never actually worked on Wii U and made the comment after a brief glance at v1 devkits; they later explained this and apologized. Likely after getting Orbis/Durango devkits at 1.6GHz and realizing they were about to look like completely incompetent morons if they didn't retract a clockspeed kneejerk before those specs leaked. DICE never touched Wii U hardware and simply commented on the Metro devs' comment.

You have.... seriously damaged your credibility here....

Just because these two developers had no problems doesn't mean it isn't a problem for others, especially considering that neither of those two engines is doing any heavy streaming of assets; Frozenbyte claims that the Wii U had more memory than they could use, and that they were simply loading everything into memory each level and keeping it there.

That was Shin'en.

You could make the argument that larger, open-world game developers might take the opposite view based on their games' framerate issues, or low-resolution textures (in the case of ZombiU).

You don't understand you've already lost this point, do you? Did you not see the Nintendo Direct? Skyrim is microscopic now.

That's what I mean by context. Out of context, everything in this article is entirely theoretical and might have little basis in reality.

No, you just don't have the knowledge base to use it, so you disregard it. That's a logical fallacy.

No thanks, I make a decent living doing visual effects for movies and games, so to not get paid to diddle around with a game console would be a rather large step backwards.

Yeah, I'm familiar with asset artists, and you act just like one. Your subject matter knowledge doesn't apply here. Or anywhere, really. You're an artist, not a programmer.

Besides which, homebrew development is like this article: all theory and no practice. For you, I'd actually suggest maybe getting some more fiber in your diet. I think it could do wonders for your over-dramatic demeanor...think it over.

Homebrew is our hobby, not our job, and lots of people have enjoyed our work, making your 'all theory, no application' remark completely idiotic. And you have just confirmed to everyone that you are a complete moron.

Maybe switch out your brand of tampons while you're at it...you definitely appear to have a blockage somewhere.

Oh joy, more ad hominem.

Well, not really...to be fair, that game isn't even out yet. For all we know, that might be running on a PC in that trailer

No, it's not. Why would a first-party Nintendo game be developed on a PC instead of devkits?


Whelp, there goes all your credibility.

Edited by 3Dude, 29 January 2013 - 05:06 AM.


 




