

Nintyfan86's Content

There have been 240 items by Nintyfan86 (Search limited from 15-June 20)



#214052 Is the wii u THAT underpowered?

Posted by Nintyfan86 on 31 May 2013 - 06:32 PM in Wii U Hardware

The GPU will not be a 7970; at the very most we're looking at a 7770. Sony is big on putting out numbers that tend not to hold up in real-life situations, so I wouldn't even count on that at this point. They claimed RSX had 1.8 TFLOPS as well.

 

You are right in assuming that they'll run into hardware bottlenecks long before memory bottlenecks; where those hardware bottlenecks are is anyone's guess at this point.

 

The reason MS went with a lower spec, and even Nintendo went with a lower spec, is that much of that additional power will be wasted. The PS4 having more power will not show on screen due to the diminishing returns we're seeing in the GPU space, which is why GPU makers just add more shaders linearly, along with more RAM, for their more powerful cards. Those cards are MUCH more power hungry and run MUCH hotter, yet produce maybe a few more frames than a midrange or budget card. The benefit of too much power in a closed system like this is limited. You have to find a balance that works.

 

Exactly, Microsoft is an OS company, and that shows through brilliantly on the XBONE.

 

Nintendo is a game company, OS's and online infrastructures aren't in their natural wheelhouse, though I'm sure they're learning very quickly to make it their wheelhouse. 

 

That is what perplexed me about the PS4 and GAF proclaiming this 7970M theory. Of course, the rumors of both machines using 7770 variants were out there for a long time before this, which had me believing that the Wii U would be a no-brainer with regard to multiplats. However, GAF, as you suggested earlier, appears to be something like GameFAQs for wannabe tech heads, with a microscopic percentage of actually knowledgeable people.

 

 

Your GPU comment appears accurate, especially with AMD's roadmap and Nvidia's releases. We are not really seeing quantum leaps anymore, only newer, slightly more efficient ways to do the same thing slightly faster. The power draw may drop slightly on some cards, yet increase on others; it depends on whether it is a rebrand or an actual reconfiguration. In either event, a new GPU in XFire or SLI is good for 3 years before there is even a reason to upgrade.

I'm going to go ahead and bring up the argument I used against a friend: we are not at the point of marginal returns. Console makers make their consoles less powerful so as not to lose so much money, and then they want you to believe that we are. Until games are simulating things at a molecular level, there will always be room for improvement.

But, the TFLOPS and cores! Honestly, the scary thing about the path of linearly increasing graphics is production costs, and an inevitable situation where you get a few clones of 3 genres. To EA's credit, they tried some new things with Mirror's Edge, Dead Space, and a few others in 2009, but that was probably not the best time to do it.

 

If anything, those iOS games and Nintendo's indie initiative are probably a very good thing. Treating tech as a prop is, quite possibly, the best thing the industry can do. After all, when you watch a movie, do you go on a forum and argue about the technicalities of one movie's special effects vs. another's? Games and systems should, hopefully, reach that same standard. Art.

 

 

 

 

First question: the OS, why was it in such bad shape? Honestly, Nintendo sucks at OS creation. The Wii was very basic, and the Wii U was an upgrade to its design. I think they should use Linux, but they haven't listened yet.

As for the developer tools, Nintendo had no excuse. I don't know why they didn't give proper help to the developers. They learned from that mistake, but it was too late.

Hopefully that helps.

After seeing routerbad's comments, I am not sure MS went with a lower spec. If the PS4 is a 7770 with the same CPU and GDDR5 memory, versus a similar GPU/CPU setup with embedded RAM and the same DDR3, I am not so sure how the PS4 has a real advantage. It would be like giving a budget card 8 gigs of RAM when it could not possibly use it. I do agree about diminishing returns. Sitting on a couch, 4-10 feet away, and with a controller will limit the need for a constant 60 fps.

 

I can see the potential with the OS. I really want to see the whole thing get as fast as it is when going from in game to the browser. I think this is possible. Do you? 




#211537 Is the wii u THAT underpowered?

Posted by Nintyfan86 on 26 May 2013 - 06:23 PM in Wii U Hardware

I believe it is reasonable to expect that a lack of support, rather than power, is the main concern. Other than that, my impression is this:

 

Ultra HD

High-end PC - Very High

Mid-range PC - High

 

1080p

PS4 - Medium

XBone - Low to Medium

 

720p

Wii U - Low to Medium

 

This is, say, 2 years from now. Of course, we need sales and third parties in order to prove any kind of theory. The current 'toss on something old since those Wii owners don't have other consoles' approach is not helping. However, anyone with a PC can try bumping graphics up and down, with a controller in hand, to see how large a difference it makes to the game experience. It just isn't the big deal it used to be.

 

So, underpowered, compared to what? Certainly not to the extent Wii was incompatible with the current generation. 

 

 

 

 




#213612 Is the wii u THAT underpowered?

Posted by Nintyfan86 on 30 May 2013 - 09:58 PM in Wii U Hardware

The original prototypes and the software environment were built around weaker hardware than what ended up in the system, that is true.

 

Because it took so long to get the hardware finalized, the OS itself was not optimized for it by the time they were preparing for launch. This is a big reason behind the utter lack of marketing: Nintendo are sandbagging the Wii U intentionally (they know early adopters will buy into it anyway, they just thought there would be more of them) because they themselves felt the software environment was not ready.

 

The real problem is that the GPU is so divergent from the original target hardware that all of the tools had to be rewritten, both to take advantage of the GPU at a basic level and to properly use the gamepad. For the gamepad they essentially needed to upgrade the target hardware in order to get the performance they were looking for, and AMD tech like Eyefinity fits perfectly.

 

It isn't just a customized 6760; it's the other way around, in fact. It's a custom chip that happens to utilize the AMD unified shader cores from the 6760 that they licensed. It's a brand-new animal that no one was prepared for.

 

I assume they still hadn't nailed down the best clock speeds until late, and the cache on die was based on their budget for the chip. They really thought of everything, and companies like Shin'en that really understand GPU tech through and through can speak authoritatively to the fact that there is plenty of power there if you know how to use the system properly, and everything is designed for extreme efficiency. No clock cycles wasted, as it were.

 

The great thing is that many of us are already satisfied with how the system basically operates, and that says something, because Nintendo isn't, and they will continue to improve all aspects.

I see, so it is not as simple as releasing specs for software and making the hardware engineers work around them (this was also playing into my thoughts, like how all different parts will work with Windows regardless of architecture changes, within the x86-64 spectrum, to a point). Wrapping your head around a closed system is very difficult. It is like capital budgeting, but locking yourself in without any alterations, for years. I never considered it from this angle.

 

The different levels of memory cache are interesting as well. What is even more pertinent is that Microsoft is using a similar strategy, yet Sony is going with GDDR5. I have read, but have not fully understood, the concerns with Sony's approach. I also have trouble understanding how 8 Jaguar cores and a 7970M are going to merge into an APU.

 

One aspect of Sony's system that seems apparent is that the power will be exploitable from day one, presuming a game exists that can take advantage of it. However, their approach seems unbalanced to me. I do not know how that CPU/GPU combo will be able to use half of that RAM without running into bottlenecks of its own. Not to mention the obvious reserves for the OS (which I suspect some of those cores/GPU compute units to be reserved for, and why MS went lower spec in comparison, for undoubted OS efficiency vs. Sony).
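Since the GDDR5-vs-embedded-RAM question is really a question about bandwidth, a back-of-the-envelope sketch may help. This is a hypothetical calculation, not a spec sheet: the bus widths and per-pin data rates in the comments are illustrative assumptions, not confirmed figures for any of these consoles.

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth: (pins * bits per second per pin) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# Illustrative numbers only (assumptions, not confirmed console specs):
wide_gddr5 = mem_bandwidth_gb_s(256, 5.5)   # a 256-bit GDDR5 bus at 5.5 Gbps/pin
narrow_ddr3 = mem_bandwidth_gb_s(64, 1.6)   # a 64-bit DDR3 bus at 1.6 Gbps/pin

print(wide_gddr5)    # GB/s for the wide GDDR5 setup
print(narrow_ddr3)   # GB/s for the narrow DDR3 setup
```

A small pool of on-die eDRAM changes the picture, because its bandwidth comes from a very wide internal bus rather than external pins, which is why raw GDDR5 numbers alone don't settle the comparison.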




#213605 Is the wii u THAT underpowered?

Posted by Nintyfan86 on 30 May 2013 - 08:57 PM in Wii U Hardware

No problem at all ^_^.

I agree with what routerbad said completely. Just want to add that, in a sense, it's good all the tech wasn't done by E3 2011. If it was, the Wii U would be two video-card generations behind instead of just one, and that could have created the kind of graphics gap people think exists today.

Although I am sure developers hated that.

 

 

No problem at all.

 

The real reason why they didn't have final hardware much earlier will probably never be known.  Technically, none of the components existed until Nintendo and IBM developed them.  They are based on previous architectures, like Power7, PPC, and AMD Unified Shaders, but the final product is anything but.  

 

The target hardware for the Wii U early on was rumored to be something like a high-end HD 4XXX series GPU with a pure Power7 CPU. Because of all of the difficult R&D that needed to be done to develop a wireless video standard with Broadcom that worked with zero lag and no tearing, artifacts, etc., the GPU had to be designed at the same time to work flawlessly with that standard. My guess is that it took them longer to develop the tech behind the gamepad than anything else, and that the GPU was finalized once they were settled on that.

 

Also, as 3Dude pointed out in another thread, they wanted Eyefinity, which debuted only with very high-end 5XXX series GPUs and was made more broadly available with the HD 6XXX series.

This may help others reading, and I know it will help my general understanding:

 

My impression of console development is that it gets finalized at some point. The foundation, so to speak, is there. In this case, Nintendo, AMD, and IBM knew they were going with a GPU, most likely custom from the 6 series (2010), and the tri-core Power7/750-based chip. Leaving the gamepad out for a moment, I would presume the only things left for the box itself would be the clock speeds, amount of cache, and so forth. Or, put another way, that initial dev kit, with the 4850 inside, should have been the target for the OS.

 

As they got closer with the gamepad and finalized specs, I would have thought that the OS would have been patched along the way. 

 

What I am having trouble understanding is how the OS was in such a state at launch, and how the dev tools were to the point where launch games were using 2 cores. 

 

Note: I am also under the impression that the MCM is simply a customized e6760 plus an IBM solution to fill in the gaps. Without a real target game from Nintendo to show off technical capabilities, it is hard for someone not really tech savvy (like myself) to rationalize the issues they have had outside of the gamepad development (which is an amazing achievement, along with the low power consumption).




#213094 Is the wii u THAT underpowered?

Posted by Nintyfan86 on 29 May 2013 - 08:06 PM in Wii U Hardware

Guys,

 

I honestly believe the following:

 

  1. Nintendo's exit with the Wii hampered Wii U excitement.
  2. Confusion over the controller, compounded with #1, has a lot to do with the lower enthusiasm.
  3. Nintendo honestly believed the gamepad would draw the same consumer response as the Wii controller.
  4. Item #3 shifted focus from graphical fidelity to gamepad integration. Hence the lack of tools (possibly one non-intentional reason, but it could explain why things were so poor at launch).
  5. Finally, items 3 & 4 led third parties to believe functioning software, with gamepad integration, would provide a decent ROI. Given the low install base and probable attachment ratio, the investment did not need to be too high to begin with.

Now, this is really item 6, and it deserves its own paragraph. Publishers have watched year-over-year decreases in game sales. They anticipate the PS4 and XBone will reverse this trend. Hence, they are spending as little as possible to bring ports to a new market (Wii U) while holding back until the 'big boys' launch. At that point, the Wii U's install base will be bigger, and they know that one of two things will happen:

 

  1. The new consoles will face shortages, which will increase Wii U sales.
  2. The Wii U install base will grow.

Why post this in a thread about the Wii U's technical prowess? I think everyone can see why, but if not, allow me to conclude:

 

Ports from the XBone are coming. The architectures are too similar, and the costs of developing for the PS4/XBone will be too great not to have the game on every platform possible (these new games will not be like PC versions of console ports running on Very High, but rather new, more ambitious titles). The game market will not explode to the levels we saw in 2007-2009. The tablet did not exist then, and the overall gaming options you can get for free on PC were not as sophisticated (for casuals). Plus, the market is always changing, as people age and demographics shift (my 7-year-old wanted an iPad Mini over a Wii U :().

 

The point? This is a business. Money is the final arbiter. If I have this figured out, you can bet Nintendo does too.




#213144 Is the wii u THAT underpowered?

Posted by Nintyfan86 on 29 May 2013 - 10:07 PM in Wii U Hardware

They didn't think that the Wii U controller would cause the same amount of excitement as the Wii did, but they were under heavy pressure by investors to release new hardware, and fast, because sales of the Wii were declining rapidly.
 
Now, they had a general idea of where they would like to go with the hardware, but they weren't finished revising it until mere months before launch. As a matter of fact, it was the week of E3 2012. This left them with little time to optimize and revise the OS, stress test and get a solid feel for the hardware and how it performed in real-world operation (theoreticals don't really help with the actual kit), or get decent tools out to developers who needed them to work on games with very little time to push to market.


I don't know, none of it adds up. Having a content-filled 2010, followed by killing the Wii in 2011 with very little support, and expecting a half-baked successor to fill the gap? I would think the board of directors would not approve.

However, they were not expecting the Wii to be a hit, and they had the HD console on standby in case it flopped. Yet it was a huge success.

With what you're saying, the only logical business strategy is to bank on the Wii brand name to stimulate early adoption. It seems as though it is the logical choice given the parameters.

But...on the technical end, it was not as simple as releasing a machine that played current ports better. Instead they had to optimize for games 2 years out and beyond.

So, I am left trying to reverse engineer their business strategy. The machine's power is not up for debate until we see actual next-gen titles on it, IMO.



#213353 Is the wii u THAT underpowered?

Posted by Nintyfan86 on 30 May 2013 - 01:18 PM in Wii U Hardware

Third party publishers like Activision expect the same thing to happen with the other consoles.

No, they expected the Wii to be a success. They launched the DS as a test for the casual market. When that sold, the HD console you speak of ceased development and they finished the Wii concept.
That's why the DS was called the third pillar, and they said the Game Boy brand wasn't dead. You seen any new Game Boys lately? (Still hoping for a return someday, lol.)
They didn't kill the Wii intentionally, it just ended up doing that. Nintendo made a lot of mistakes, but the graphics didn't suffer because of gamepad development. The only thing that hindered graphics is it was released in 2012. If it was released in 2013, it would be on the same level as the others and we wouldn't be having this conversation.
However, Nintendo released in 2012. And you know what? The difference between these three consoles isn't as huge a gap as Wii vs. PS3 & 360.

I forgot about the third pillar, lol. I figured then that it was a fancy way of saying transitioning into a different demographic before expanding the product across the entire demographic. Including GBA BC made me think this way.

I did not know they expected immediate success with the Wii. I presumed they expected to expand into the blue ocean while offering the established market something the competition was not. Hence all of those Wiis gathering dust next to the HD Twins, awaiting exclusives.

So, launching in 2012 makes sense with regard to the global economic recovery, yet 2011 fit better with Nintendo's product cycle.

What I have trouble understanding, from an R&D standpoint, is how the architecture could not be finalized by E3 2011, with dev tools and software being developed from then on. I believe everything in it existed at that point, though I may be wrong. Clock speed would be all that remained.

I appreciate routerbad and you for being so generous with your time in discussing these concepts.



#234593 Shin'en Explains Wii U EDRAM Usage

Posted by Nintyfan86 on 24 July 2013 - 07:37 PM in Wii U Hardware

Yeah, you go around the internet and most gamers praise Ubi for supporting the Wii U. I'm happy they are bringing all their heavy hitters this year also. I'm still not an idiot, though, and I can tell when a game isn't being promoted. Splinter Cell just isn't being promoted enough, and Ubi isn't saying much about the console other than PR damage control, saying stuff like 'oh, never count Nintendo out' and 'the Wii U is going to have a big Christmas.' I take those as PR just so that, if the console turns around, they can say, look, we were supporting Nintendo the whole time. Yet they aren't talking up the Wii U version of Splinter Cell and how it 'should' be an overall better experience than what can be found on last-gen consoles. The Wii U is next gen, let's face it, and I hope publishers and developers start acting like it.

No doubt. However, with limited resources, the publishers see 3 things:

 

1. The PS4/One Launch will have a drought just large enough for early adopters with larger wallets to buy the software in the wings. 

 

2. The PS3/360 has a massive user base of fans that will buy those games. 

 

3. The Wii U will have a new 'monster' game each month to compete with, and those games will not really attract an install base that will either buy third-party games en masse, or buy them on the Nintendo platform (if they are multi-platform owners).

 

So, we know it will not be as simple as cutting/pasting from the dev kits, but the PS/Xbox versions most likely have a much higher NPV that is more consistent with Ubisoft's required return. Tie this into their overall WACC and reinvestment rates (MIRR), and this explains why they are placing their bets (and capital) on the other platforms. This is an extreme summary, but, as a consumer, I feel insulted.

 

I will feel insulted as a PC consumer when the game takes a while to work properly as well, LOL. 




#234556 Shin'en Explains Wii U EDRAM Usage

Posted by Nintyfan86 on 24 July 2013 - 03:02 PM in Wii U Hardware

We will see, but with what I know the Wii U is capable of, and seeing that Bayonetta 2 demo from E3... it's just hard for me to go out and spend 60 dollars on a Wii U game that is barely up to 360 standards. Shin'en has gone on record multiple times now stating how to get the most out of the console, and if Ubi is having problems, they are big enough to go to or call up Nintendo techs and say, help us get the game running smooth on Wii U.

 

I agree. I have been doing a lot of PC gaming lately, and, honestly, consoles are the platform of choice for exclusives at the end of the day. Whether that is exclusive features or IPs, they should not be purchased for power IMO. 

 

My decision from this experience was to simply buy my third-party titles on PC from now on, and the exclusives on consoles. I hate having to do this, but the third-party games outside of Rayman are too much of a focus for me to risk playing the 360 version of at full price (when I can use my rig to play the same games beyond PS4 standards, at the same price).

 

Something is truly fishy here, especially considering that the Ubisoft games for Wii U are getting NO screen time outside of Rayman and Splinter Cell. 




#234841 Shin'en Explains Wii U EDRAM Usage

Posted by Nintyfan86 on 25 July 2013 - 02:16 PM in Wii U Hardware

The gigantic RAM banks for the shader units and other logic units would suggest the Wii U GPU could get a lot more per flop than conventional architectures. All that RAM around those logic units would sure give those flops a lot of bandwidth. There are many things that would get done much faster using double precision...

The amount of work that can be done per flop isn't the same between all architectures and, well... between flops.

'More flops = always more power' is a common myth, like the MHz myth, that is currently being exploited heavily in marketing.

For instance, PS3's Cell 'theoretically' gets 240 gflops. Or is it 100? Well, actually, it's both. Cell gets 240 single-precision gflops, but 'only' 100 double-precision gflops (one of the main bottlenecks of DP is local RAM bandwidth). Guess which one of those gets more done per flop?

That's why you see a lot of marketing around high gflops... But then sometimes real-world performance is lacking, and you find out the benchmark only used a small single-precision loop that made the architecture look really good in a benchmark... but isn't indicative of real-world performance.

Flops per cycle is only a small piece of the puzzle.
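The peak-vs-real distinction above can be sketched with the standard peak-FLOPS formula. The chip below is hypothetical: the unit counts, per-cycle issue rates, and clock are illustrative assumptions chosen to mimic the kind of SP/DP gap described for Cell, not measurements of any real processor.

```python
def peak_gflops(units: int, flops_per_cycle: float, clock_ghz: float) -> float:
    """Theoretical peak = execution units * flops issued per cycle * clock (GHz)."""
    return units * flops_per_cycle * clock_ghz

# Hypothetical vector processor: 8 units at 3.2 GHz.
sp = peak_gflops(8, 8.0, 3.2)  # single precision: wide SIMD with fused multiply-add
dp = peak_gflops(8, 2.0, 3.2)  # double precision issues far fewer flops per cycle

print(sp)  # the single-precision number that ends up on the marketing slide
print(dp)  # what the very same units manage in double precision
```

Both figures assume every unit retires its maximum flops every cycle; real code that stalls on memory or branches never gets close, which is exactly the 'benchmark loop' trap described above.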

I have a question for you involving https://twitter.com/...277106856370176 . 

 

I am seeing a lot of "Sony's first-party games will be amazing" and so forth, but the idea of cross-porting to the XBone, Wii U, and PC becomes an interesting gambit, presuming the PC becomes the lead platform again with the next-gen architecture taken into account.

 

Do you think this will still hold true? Are we looking at a PS4 that could replicate an 8350 with a GPU within throwing distance of a 7970 at this point (3.68 vs. 3.8 TFLOPS)? This would also double the Wii U's on-paper specs, obviously, while keeping all of the consoles in the same ballpark. I know TFLOPS are a small factor in comparing GPUs, but they are presumably from the same series.

 

Am I completely misunderstanding this?  




#235272 Shin'en Explains Wii U EDRAM Usage

Posted by Nintyfan86 on 27 July 2013 - 11:25 AM in Wii U Hardware

All programs that can be split up into separate parallel jobs will greatly benefit. Texturing and shaders, for example, fit very well into this group. So do 'pseudo physics' like the old Havok stuff popular on PS360, where everything, no matter its size or weight, flies around like a cardboard box.

These greatly benefit from more threads and cores. However, once you add so many, the overhead of keeping all these cores and threads synchronized and properly communicating will begin to erode the performance gains... That is, if you are attempting to use it all on just one process or application, like, say, a videogame.

For multitasking, doing something completely different at the same time, like processing who from your list is online, receiving messages, downloading a show, recording gameplay, maintaining other operations so they can instantly be switched to...

More cores are great, and to that end, it's likely why there are so many.

However, for many general-purpose tasks, like game code, AI, and things that aren't easily predictable or are simply 'going through the motions'... well, those can only be handled sequentially, so no number of cores will help speed them up.

Only powerful single-thread/core performance will help.

DICE created Frostbite to... completely avoid this, as the PS360 sucked at single-thread performance and excelled at parallelism.

It made sense, as Battlefield was multiplayer only, with no real need for AI or heavily structured game code at all.

It appears the rise of moronic bubblegum cinematic linear roller-coaster games gave them the confidence to make BF a single-player campaign no one wants. A very linear, on-the-rails experience (not implying it's an on-rails game, but simply that the same stuff will happen the same way, every time you activate the event). But super cinemarrific. The kind of thing you don't really need strong general-purpose processing for. Which is why DICE says they no longer even need a real CPU.

I appreciate your thorough response, and as always, your time in answering the question. 

 

I can understand how BF and COD are examples of the linear roller-coaster games; however, they all seem like that from some perspective, even as the possibilities expand. I just finished Far Cry 3 on PC. The same events would occur over and over, or at random, like cause and effect based on what you did. Obviously not scripted like the former examples, but I am seeing it as more like, "event A activates event B if parameter 3 is met." Is this what represents general game code, and thus faces diminishing returns from additional cores (I guess this is why they do not bother with hyperthreading)?

 

This is such great information for someone that has been misled through benchmarks and 'teh optimizations' for PC hardware.




#234916 Shin'en Explains Wii U EDRAM Usage

Posted by Nintyfan86 on 25 July 2013 - 07:56 PM in Wii U Hardware

It's true. Optimizations go a very long way, and it's something that really can't be done as well on PC, because everybody's is different, but the game has to run on as many PCs as possible, from uber powerful to weaksauce.

But I wouldn't expect many devs to make a game that requires 3+ TFLOPS as mandatory... for the reasons I stated above. PC games need to play on as many PCs as possible. It's why they are built around a minimum-spec system.

What does not make sense, though, is that 'they' are claiming the consoles are targeting 30 fps/1080p. If their true 8-core Jaguar chips are 1.6 GHz, and, in the PS4's case, the 'almost' 7950-7970 GPU with 100% optimization, I would guess it would be like having a 3.2 GHz Piledriver chip with 7 GB of unified memory for applications (1 GB for the OS).

 

So, minimum-spec PC games would, theoretically, be able to run on High/Very High for a very long time, considering the resources required to make these DX11 titles, the 1080p lock, and the supposed 30 fps cap (which does not make sense for all games).

 

I am very confused as to how the PC community is brushing the PS4 off as so outdated when, at 1080p, there is not much out there that it should not be able to max outside of a Windows environment. My suspicion is that it will take a while before minimum specs at 1080p leave the new consoles as outdated as they appear, yet I do not think we will see anywhere near the optimization we saw with the PS3/Xbox 360 vs. the PC.




#235218 Shin'en Explains Wii U EDRAM Usage

Posted by Nintyfan86 on 27 July 2013 - 09:21 AM in Wii U Hardware

Don't dwell too much on what 'they' say. Time will tell.

That being said, google 'Amdahl's law'. More cores can only get you so far; you still need powerful single-core and single-thread performance for many operations important to what videogames do.

OK, I have read enough about Amdahl's Law to potentially butcher a follow-up question (scratch potentially, insert probably).

 

If some operations are sequential and will take X amount of time to complete regardless of the time saved on the tasks completed in parallel, is it safe to suggest that general code will face diminishing returns with added cores?

 

I cite the Battlefield 3 example, and the use of a Bulldozer or Sandy/Ivy Bridge i5/i7 to play the campaign before moving on to multiplayer. Given what DICE has shown about the engine, the campaign couldn't care less until you get down to 1 core, yet multiplayer generally does better on the i5 over the Bulldozer (or Piledriver, etc.).

 

My follow up question is now this:

 

Since Amdahl's Law will always be in effect, will there always be that one operation that brings diminishing returns to an increased core count? I realize how ridiculous this sounds, given the definition of a law, but is there a comparable example, 1 core vs. 2 cores or otherwise, where a program benefited from more cores or hardware/software hyperthreading?
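Here is a minimal sketch of Amdahl's law to make the follow-up concrete. The 70% parallel fraction is an arbitrary illustrative assumption, not a measured figure for any game engine.

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: the serial part caps speedup no matter how many cores you add."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Suppose 70% of a frame's work parallelizes (purely hypothetical):
for n in (1, 2, 4, 8, 64):
    print(n, round(amdahl_speedup(0.7, n), 2))

# The speedup climbs quickly at first, then flattens toward the hard
# ceiling of 1 / (1 - 0.7), about 3.33x, which no core count can break.
```

This is why strong per-core performance still matters: it shrinks the serial term itself rather than chipping away at the already-parallel one.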




#222268 PS4 launch=Wii U price cut?

Posted by Nintyfan86 on 15 June 2013 - 10:49 AM in Wii U Hardware

You can buy a 500GB HDD for $50 on Amazon. I'm not using the Deluxe as an example; the reason it's 350 is because it comes with Nintendo Land. So in reality it doesn't close the gap, 'cause I'm using the Basic model.

Plus, your camera is included ;).




#231081 Lets analyze and compare Bayonetta 2' gameplay

Posted by Nintyfan86 on 12 July 2013 - 11:41 AM in Wii U Games and Software

I played an Xbox last weekend. People who claim that the Wii U is equal to, competitive with, or just above the Xbox/PS3 are either outright lying or haven't played a Wii U. The Xbox is sharp; the Wii U is clearly a step above.

I cannot find one PS360 game with the visual fidelity of that dragon battle. The game is ridiculous, especially when you consider 60 fps.

 

This is a day one purchase for me.  




#225132 The Wii U should have been as powerful as the Xbox one.

Posted by Nintyfan86 on 21 June 2013 - 04:54 PM in Wii U Hardware

Gameplay-wise it won't matter what you have. Graphics will be different but still overall good all round. So it only comes down to who has what + install base. Graphics will matter most to the early guys.

It is always a generational thing, where games are simply not possible on the previous platforms from a gameplay perspective. Look at Wii vs PS360. 

 

Now, we have this situation where the games, in motion, are not really giving a drastically different experience. If you play Battlefield 3 or COD, you are not really looking around and comparing graphics (if you are, the game is too easy to be immersing the player).

 

This is more general banter not directed at you Tboss:

 

On the Wii U, we know, on paper, we have 3 PPC cores with lots of cache, a lower clock, a highly custom GPU with eDRAM, and a gig of memory dedicated to games. On the PS4 front, we again have 8 mobile AMD cores, God only knows what exact GCN GPU (I suspect the 6670 rebrand), and 7 gigs of GDDR5. The XBone, which recently dropped the controversial DRM measures, is in between, yet has a similar architecture to the Wii U on the GPU front, and a closer, if not identical, CPU relationship with the PS4.

 

Next year, when we have a similar install base for each, why does anyone think the Wii U will not handle versions of the PS4 or XBone games? You do realize this is like a scalable PC spec generation, right?




#222408 The Wii U should have been as powerful as the Xbox one.

Posted by Nintyfan86 on 15 June 2013 - 05:08 PM in Wii U Hardware

Socialmuscle brings up a great point. If the tools are there, the PS4 is more or less an x86 machine with a different OS. Unless they handicap devs, they would theoretically have access to virtually everything the console can do, outside of optimizing for the specific CPU and GPU. The Wii U is on that traditional level of yearly improvements. I wonder which one will get maxed first?



#222265 The Wii U should have been as powerful as the Xbox one.

Posted by Nintyfan86 on 15 June 2013 - 10:45 AM in Wii U Hardware

I'm not so sure it would have mattered. Two important patterns show up in gaming history. One is that the most powerful console is never the winner. The Master System was more potent than the NES. The SNES had a bit more juice than the Genesis (I know that's a controversial one in the "who won" arguments department; most people would call it a tie, but having reviewed the Genesis and its library extensively, I'd argue the Genesis won in sales by a few hairs). The PS3 trumped the 360 and Wii in tech but couldn't overcome them in sales, and is probably the best example of this.

Secondly, perception is everything; tech means little when it comes to numbers. Sadly, Nintendo is such a polarizing issue that most people with a passionate stance one way or the other aren't going to be swayed by tech; the brand has a fixed perception for everyone, good or ill. If the Wii U had been the most powerful of the three, I think it would have done nothing but hurt them. On top of all the stigma people who swear by Sony or Microsoft give Nintendo, they would now add the wasteful price point for technology that will never be fully exploited.

 

On a side note, I've seen a couple of people say the Wii U should have had a normal controller. I can see a legitimate complaint in that it will be a hard controller to replace... Am I the only one, though, for whom the controller was actually the selling point? The Wii U controller, weird as that may seem, is my killer app. I love it. And I think it's a matter of perception; maybe we just aren't ready for it. To understand the Wii U, one must embrace the concept that the controller is just as much a part of the machine as anything else. It IS the console. Just curious, though, if the controller was something anyone else got excited about or if I am just nuts.

Very valid points! I could not agree more with the factual analysis in your first paragraph. 

 

Your side note: no, you're not the only one. I have been PC gaming with a little PS3/Wii on the side since 2010. Before that, I also had a 360. I love FPS/3rdPS games, but the games begin to run together, and you need something in the middle. The PC has some of that, and the PlayStation universe gives you variety, but you are, more or less, going to get better versions of similar concepts. It just gets tiresome. 

 

The Wii U controller, combined with the Wii controls (something Nintendo has not really exploited; where is that golf thing from E3 2011?), is a really cool concept. Don't get me wrong, I would have bought the console if it had an updated GameCube controller. However, the GamePad is pretty awesome. 

 

I wonder if we would be having this discussion if the games had looked like this at launch? Plus, has anyone noticed you can still preorder the machines from Sony and Microsoft online and get them this year? If you are old like me (27), you remember having to camp out for a preorder, and online preorders selling out within minutes. I wonder what's up? 




#223158 The Wii U should have been as powerful as the Xbox one.

Posted by Nintyfan86 on 17 June 2013 - 05:52 PM in Wii U Hardware

They would have priced it right around $399.  I have no doubt they are still taking a loss on the $500 price point.

Do you think they are receiving a higher licensing fee for their DRM model? This could subsidize the loss somewhat if the attach-rates hold.  

 

If they would promise backward compatibility in the future, I would be willing to get one (and if I could disconnect Kinect).

 

As it stands now, you can still get a console this year. This is very different from previous launches. I think those previous forecasts showing PC gaming growing while console market share shrank may have had a point. Younger people are all on tablets. Us old farts need buttons.  




#222459 The Wii U should have been as powerful as the Xbox one.

Posted by Nintyfan86 on 15 June 2013 - 07:40 PM in Wii U Hardware

How can you really compare the two in either case? Mario Kart vs Killzone? C'mon man! LOL. 

 

You really need, like, the same game, from the same people, optimized for each system, to see clear differences.

 

I don't know why anyone feels the need to prove the PS4's superiority. No one ever said it wasn't superior. Heck, I think everyone here wants one. 

 

However, you cannot go on specs alone, as there will always be the PC around to crush it in that department. 




#229984 What if in a few years Nintendo made a Wii U RAM Expansion Pak?

Posted by Nintyfan86 on 08 July 2013 - 05:05 PM in Wii U Hardware

They do it because it's cheaper, and because not everyone upgrades, so developers still have to keep the un-upgraded consoles in mind, effectively adding more difficulty to development.

And have you ever bought an add-on module for the Gamecube?

Only to play GBA games on the TV ;).

 

In all seriousness, the only add-on model that made sense was the SNES chip-on-a-cart approach. Even that was bad, since it increased the cost of the games in some cases. 

 

To answer the original question, an upgradeable console, or a PC-like device for dummies (so to speak), would be a disaster. Games would be fine, I guess, since you would have a baseline low spec, but you could only make so many parts interchangeable.

 

You would also need to make it possible for everyone to upgrade without much user error. Since it is all embedded, I suppose they could put the system on a tray, or in one spot, but at that point you are just keeping the shell. Otherwise, you have all of the problems already mentioned, and experienced first hand by Sega.  




#226231 What if in a few years Nintendo made a Wii U RAM Expansion Pak?

Posted by Nintyfan86 on 25 June 2013 - 10:31 PM in Wii U Hardware

Pure capitalism for you, getting flopped:).



#226165 What if in a few years Nintendo made a Wii U RAM Expansion Pak?

Posted by Nintyfan86 on 25 June 2013 - 08:12 PM in Wii U Hardware

This is all fun to discuss, and I am certainly no expert, but I will parrot what will be said when the forum mods come in:

 

  • You cannot compare consoles to Windows PCs. 
  • Laws of diminishing returns are in effect, even if you could. The same 7990 with 6 GB of VRAM is not meant for 1080p, but for multi-monitor setups. Heck, CrossFire is not really meant for 1080p. Take an old 5870 with 1 GB of VRAM, and you will still get great results at a true 720p. Making the jump to 1080p takes, for some games, more than a 5870, with more memory.
  • If you increase the target resolution, you dramatically increase the required resources.
  • Speaking of which, the vast majority of PS360 games are not 720p, but a lower rez upscaled. 
  • The Wii U currently has 1 GB for games, while the HD twins have 512 MB for EVERYTHING (the PS3 even has split pools). That is twice the RAM for games, and four times the total memory, for a target rez of true 720p. 
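To put the resolution bullet above in rough numbers, here is a quick back-of-the-envelope sketch. This is raw framebuffer pixel math only; real rendering costs also scale with anti-aliasing, extra buffers, and bandwidth, so treat it as a lower bound on the jump:

```python
# Raw pixel counts for "true 720p" vs. 1080p framebuffers.
# Ignores AA, multiple render targets, etc. -- just the
# growth in pixels the GPU has to shade every frame.

def pixels(width, height):
    return width * height

res_720p = pixels(1280, 720)
res_1080p = pixels(1920, 1080)

print(res_720p)                 # 921600
print(res_1080p)                # 2073600
print(res_1080p / res_720p)     # 2.25 -- 1080p pushes 2.25x the pixels of 720p
```

So even before any other quality improvements, targeting 1080p instead of a true 720p means shading more than twice as many pixels per frame.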

Now, let's look at the ram and specs for the PS4 exclusively for games (from what I have read on the netz, don't crucify me if this is off with ram and core usage).

 

7 GB available, 6 cores @ 1.6 GHz from AMD's low-power Jaguar mobile line, and a custom AMD GCN chip, supposedly on par with a 7850. 

 

So, 1 gig and 2 cores are available for the OS. 

 

Now, knowing that we need VRAM, and a good bit of it, for 1080p and beyond (don't get your hopes up for resolutions higher than that; think of 4K as this gen's 1080p), you can factor in as much as devs require to get wherever they are going outside of a Windows environment. 

 

You guys really only need to look at Watch Dogs and its graphical downgrades to get a sense of what to expect. Then factor in the state of the Wii U dev tools at launch. Look at NFSMW U, X, Mario Kart, and the other upcoming games to see that the gap will be close enough, as long as the Wii U starts selling and software follows. 

 

I still say get both;). Stay away from the VCR though. Even if it tells you it has changed its quality to reflect Betamax standards, it has still revealed the true integrity of the medium:P

 

I welcome corrections. 




#274943 Bayonetta 2 1080p REALLY?

Posted by Nintyfan86 on 27 February 2014 - 12:28 PM in Wii U Hardware

I have been playing DK:TF lately. That game has me thinking of actual gameplay, and not looking for graphical items to compare. 

 

Platinum makes the same type of game: the kind that takes you back to a time when you were oblivious to this stuff. Ikaruga and Assault Android Cactus do this for me as well. 

 

I guess I am suggesting that this concern over resolution is a bit like obsessing over how a bourbon was aged and becoming a snob about it. When you were younger, you drank and had fun, but once you became a liquor snob, the inherent joy was lessened. 

 

The same thing appears to be happening with the gaming community. 

 

Sorry for being off topic. I guess I can always play Bayo 2 on my PC to get my graphics fix-oh wait...




#278898 Assassins Creed Comet WiiU (among others) Rumor

Posted by Nintyfan86 on 28 March 2014 - 11:06 AM in Wii U Games and Software

I am excited for more of the Kenway story. That said, it would make a better book than game at this point.

Gameplay-wise, I would like to see more Deus Ex-style scenarios. This linear stuff, while story-driven, is terrible for replayability. The only reason to replay the game is to collect items and be better equipped. It can be fun, but it could also be a whole lot better.



