

Goodtwin

Member Since 05 Mar 2013
Offline Last Active Nov 07 2014 08:51 AM

#300473 Fast Racing Neo Screens and Discussion

Posted by Goodtwin on 15 October 2014 - 02:14 PM

AMEN!!! The Wii U is fine for what it is...Nintendo games in full HD resolution!!! Let the fools still think that the Wii U has "only 160 SPUs" and has fewer GFLOPS than the X360. The trolls will never see how good of a system it really is.  I for one have been enjoying Mario 3D World, MK8, W101, Pikmin 3, ZombiU (yes ZombiU), Child of Light, Rayman Legends, Trine 2, etc.  I am looking forward to picking up Bayo 2, Hyrule Warriors, Smash, and Captain Toad.

 

I'm not sure if you're saying it's not that big of a deal, or trying to say that the Wii U doesn't have a 160 SPU GPU?  It's been pretty well established that the 160 SPU theory is more than likely accurate.  All you need to do is look at the AMD HD 6450, a 160 SPU GPU with 8 texture units and 4 ROPs (the Wii U's GPU supposedly has 8 ROPs), which overall is very close to the Wii U's suggested specs, and as you can see from the testing, it can outclass the 360/PS3, and that's in the PC environment.  Take a look at this if you're skeptical.

 

http://www.techpower..._Passive/9.html

 

People, and I am no exception, got too hung up on the "flops" performance of the GPU.  The majority of developers reacted positively to the GPU and memory; it was the CPU that gave them fits.  Time constraints, cruddy dev kits, and poor support from Nintendo are what caused the underwhelming performance of the early ports, and then the low sales of the Wii U resulted in publishers devoting very few resources to Wii U ports.  Imagine if all the early ports had been as good as Need for Speed MW and Bayonetta; the perception of the Wii U might be a bit different throughout the industry.
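For what it's worth, the "flops" figures people argue over come down to simple arithmetic. Here's a quick sketch; note that the Wii U GPU's 550 MHz clock is the commonly speculated figure, not a confirmed spec:

```python
# Peak-GFLOPS arithmetic behind the "160 SPU" debate.
# The Wii U clock speed is the commonly speculated figure, not official.
def peak_gflops(shader_units, clock_ghz, flops_per_unit_per_cycle=2):
    # Each shader unit can issue a fused multiply-add (2 FLOPs) per cycle at peak.
    return shader_units * clock_ghz * flops_per_unit_per_cycle

wii_u = peak_gflops(160, 0.55)  # rumored 160 SPs @ 550 MHz -> ~176 GFLOPS
xenos = peak_gflops(240, 0.50)  # Xbox 360 Xenos: 240 SPs @ 500 MHz -> 240 GFLOPS

print(f"Wii U ~{wii_u:.0f} GFLOPS vs. Xenos ~{xenos:.0f} GFLOPS")
```

The raw peak numbers favor Xenos slightly, which is exactly why peak flops alone say so little; architecture, feature set, and the memory subsystem decide what actually shows up on screen.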




#264474 If the Wii U is not powerful...

Posted by Goodtwin on 02 January 2014 - 04:36 PM

To all these critics who use the lackluster ports as proof that the Wii U is a total dog, most of them believing that it's actually less capable than the PS3 and 360, I would say: do you honestly believe that if Bayonetta 2 were ported to the 360 and PS3 by an outsourced developer with a limited amount of time, the results would be great?  Anytime you have serious fundamental differences, software that is optimized for the strengths of one piece of hardware while avoiding its weaknesses may not transition very well to another piece of hardware, even if that hardware is a bit more capable overall.  Imagine if a game were developed for the Wii U that leaned heavily on the eDRAM.  I'm talking serious optimization to make the most of it; the game's fundamental foundation is built around it.  Now try to port that game to the PS3 and 360 in six months and let me know how it goes.  Would that make the Wii U significantly more powerful than the 360 and PS3?  No, of course not, but it goes the other way as well.  There are some serious fundamental differences, and it's hard to leverage software designed around the strengths of another console.




#264316 If the Wii U is not powerful...

Posted by Goodtwin on 01 January 2014 - 12:32 PM

The eDRAM is currently estimated at either 35 GB/s or 70 GB/s.  I'm guessing 70 GB/s, because that would be in line with how they are using it.  It's enough for the frame buffer/Z-buffer to never saturate bandwidth requirements, making bandwidth a non-issue for the GPU.  This allows the main memory to act as read-only memory for the most part, meaning the 12.8 GB/s to the DDR3 can be used for textures and assets without sharing bandwidth with Z-buffer traffic.  Whatever the real numbers are, it does not appear that the GPU is bandwidth starved.
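As a rough sanity check on whether those numbers are enough, here's a back-of-the-envelope framebuffer traffic estimate; the overdraw factor and 60 fps target are illustrative assumptions, not known figures:

```python
# Rough 1080p framebuffer bandwidth estimate (illustrative assumptions, not measured figures).
WIDTH, HEIGHT = 1920, 1080   # 1080p render target
BYTES_COLOR = 4              # 32-bit RGBA color
BYTES_DEPTH = 4              # 32-bit Z-buffer
FPS = 60                     # assumed frame-rate target
OVERDRAW = 3                 # assumed average shaded samples per pixel

pixels = WIDTH * HEIGHT
# Simplified per-sample model: depth read + depth write + color write.
bytes_per_frame = pixels * OVERDRAW * (2 * BYTES_DEPTH + BYTES_COLOR)
gb_per_sec = bytes_per_frame * FPS / 1e9

print(f"~{gb_per_sec:.1f} GB/s of framebuffer traffic")
```

Even this pessimistic model lands well under 35 GB/s, which supports the point that keeping the buffers in eDRAM takes framebuffer traffic off the table as a bottleneck.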




#264203 If the Wii U is not powerful...

Posted by Goodtwin on 31 December 2013 - 01:45 PM

The jump made last gen was insane.  Not only in outright power, but the addition of programmable shaders brought graphics to a whole new level.  Just adding more and better shaders doesn't deliver the same kind of leap.  The Wii U is a good design, and developers that really try will find ways to optimize their code for the hardware over time.  The PS3 and 360 have had over six years of optimization.  The Wii U may not see an enormous progression over its lifetime, but it's foolish to think it's maxed out already.

 

The numbers can make it sound worse than it is.  The PS4 and X1 may be 6-8 times more powerful than the Wii U, but the 360 was about 20 times more powerful than the original Xbox, and the PS3 was about 30 times more powerful than the PS2.  The cycle didn't repeat; the X1 and PS4 are nowhere near as cutting edge on launch day as their predecessors were.




#256031 COD Ghosts, how does it look?

Posted by Goodtwin on 13 November 2013 - 12:22 PM

Unless you're looking to play one of the less popular modes, it should be easy to get into a match.  TDM is the most popular by far, but Domination and Free for All usually have quite a few people playing as well.  Trust me, after Christmas the number of people online will triple, possibly more.  Some of the more obscure modes will always be a little light on players, but for the majority of modes it should be easy to find full matches after Christmas.




#255919 digital foundry: COD:ghost face off Wii U version?

Posted by Goodtwin on 13 November 2013 - 07:26 AM

Anybody who buys COD every year knows the first two months are the beta edition.  LOL.  It's sad but true; the games come unfinished every year.  It's just the way of things these days.  Patches are great, but they mean developers can get away with shipping less polished software at launch.  A_Trey_U is saying we should see a noticeable improvement with this upcoming patch.




#254795 COD Ghosts, how does it look?

Posted by Goodtwin on 06 November 2013 - 10:06 AM

Honestly, it looks pretty good.  What surprised me was how clean the image was.  I personally didn't notice any jaggies, and the resolution looks better than expected.  Whatever form of AA they are using makes for a very clean and clear image.  I think Black Ops 2 used post-processing AA, and that tends to blur the image a little bit.  I am guessing Ghosts is using true 2x AA, and the Wii U's scaler must do one heck of a good job converting it to 1080p.  This is not a gorgeous game; it's a good-looking game, but not great.  There are some lower-res textures, and the character models are lower poly than in BO2.  The maps are much bigger than BO2's, though, and have tons of assets in them.  Ghosts won't go down in the Wii U's history as one of its prettiest games, but it's not ugly.  It looks like Modern Warfare 3.




#253757 Assassin creed 4 is out on wii u if anybody noticed.

Posted by Goodtwin on 30 October 2013 - 06:48 AM

So far it sounds like a very respectable port.  It's not really outclassing the 360/PS3 build in any significant way, but at least it's another solid release for the Wii U.  Between Batman: Arkham Origins, Deus Ex, and Assassin's Creed 4, it looks like the Wii U is getting respectable versions of all these games this year.  The Wii U will only shine with exclusive software.  Hell, even AC4 on PS4 doesn't look that much better than the 360/PS3 build.




#253753 Nintendo Sales Report Q2 2013

Posted by Goodtwin on 30 October 2013 - 06:23 AM

It's still the best-selling next-gen system thus far. :P  It's bad, there is no denying that, but I think the only thing to really say is enjoy your Wii U.  There are tons of good games out on the Wii U right now, with tons of good games coming next year.  Does the fact that only 3.91 million people own a Wii U make it any less enjoyable for you than if 10 million people owned one?  Of course not.  The support this year is still good from third parties, and Nintendo has some solid first-party releases.  I'm not saying that the Wii U is only going to sell Dreamcast numbers, but even if it did, there are plenty of people who think the Dreamcast is one of the greatest consoles ever made.  Hardware sales won't change the fact that the Wii U will have some of the best games of this generation on it.




#249947 The Wind Waker HD only took 6 months to develop

Posted by Goodtwin on 06 October 2013 - 06:37 AM

Nah.

 

The original doesn't dip below 30 fps. 

 

:laugh:

 

You haven't played the original then.  The Wind Waker HD is definitely no worse than the original in the performance department, but it should have been 100% rock steady at 30 fps with no slowdown; that's my only gripe with the game.  Every piece of slowdown in WW HD is carried over from the original, so I have a feeling it has more to do with the game engine bottlenecking.  Either way, it's a minor blemish on a fantastic game.  The game isn't overpriced; on eBay the GameCube original still sells for $30+ used.




#248273 X1 and Wii U Actually Share a lot in common

Posted by Goodtwin on 27 September 2013 - 06:10 AM

I think Microsoft looked at it from a performance standpoint and determined that the performance would be very good by going this route: they would have 32 MB of SRAM for the bandwidth-hogging frame buffers, and then plenty of low-latency DDR3 RAM for the main memory pool.  People don't always understand that while the high bandwidth of GDDR5 makes it the memory of choice for GPUs, it also has very high latency, and that's not so good for the CPU.  In that article Microsoft talked about latency issues and how memory latency tends to be a bottleneck for GPUs.  I think that's why they stuck with the setup they chose.  The 32 MB of SRAM not only gives them tons of bandwidth, but also low latency, especially when compared to GDDR5.  Both designs yield very high performance, but it seems that the majority of developers prefer the more straightforward approach of one pool of GDDR5.  That doesn't mean it is more capable, just that it's less involved to get the performance.




#247640 What's the General Consensus on ZombieU?

Posted by Goodtwin on 23 September 2013 - 09:23 AM

I think it's the best launch title for the Wii U.  It actually uses the GamePad in some very interesting ways, and it's a true survival horror game.  It doesn't rely on a ton of cheap scenarios either, like a bunch of blind corners with a zombie waiting for you.  It's not going down in history as one of the greatest games of all time or anything, but it's a solid game for true survival horror fans, and it did an exceptional job with the GamePad.




#246424 Exactly how much of a downgrade would a PS4 game suffer on Wii U?

Posted by Goodtwin on 17 September 2013 - 05:35 AM

Your average gaming enthusiast only knows what their favorite news outlet tells them.  If it's trendy to talk about polygons per second, then that's what the media will talk about, and lately flops are all the rage, so that's what has taken center stage.  If people were to do any research at all, they would see that not only does 10x the performance not yield a linear improvement in graphics, but the jump from the PS2 to the PS3 in flops performance was far more significant than the jump from the PS3 to the PS4.  The PS2 was about 6 GFLOPS, compared to the PS3's Cell+RSX at 250-500 GFLOPS.  We were talking a roughly 50x jump in flops performance from the PS2 to the PS3, and now we have a PS4 that is about a 6x jump over the PS3.  In outright performance, the new consoles simply aren't the same enormous jump that we had the previous two generations.  So even the spec that has been singled out as the be-all indicator of performance simply isn't that much of a jump.  It sounds like it is, until you realize the PS3 was 50x the PS2.  All of a sudden 6x doesn't seem so significant.  The feature set on the modern GPU will be more significant: all those new DX11 shaders and features will be more pronounced than the increase in flops performance.
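Those generation-over-generation ratios can be checked against commonly cited (approximate) peak-FLOPS figures; exact numbers vary by source, so treat these as ballpark values:

```python
# Commonly cited approximate peak GFLOPS per console (ballpark, not official specs).
gflops = {
    "PS2": 6.2,     # Emotion Engine + Graphics Synthesizer, rough combined figure
    "PS3": 300.0,   # Cell + RSX, mid-range of the 250-500 estimates above
    "PS4": 1843.0,  # GCN GPU peak (1.84 TFLOPS)
}

jump_ps2_to_ps3 = gflops["PS3"] / gflops["PS2"]
jump_ps3_to_ps4 = gflops["PS4"] / gflops["PS3"]

print(f"PS2 -> PS3: ~{jump_ps2_to_ps3:.0f}x")  # on the order of 50x
print(f"PS3 -> PS4: ~{jump_ps3_to_ps4:.1f}x")  # on the order of 6x
```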




#245811 Bayonetta 2, is it an on rails shooter or open world play?

Posted by Goodtwin on 13 September 2013 - 09:46 AM

Ah, you just saw it, huh? Quite the difference from the crapped-out PS360 ports, isn't it?

I'd imagine it would be hard to believe if all you saw prior was the port poop.

That level is a Bayonetta vehicle level. Look up the Bayonetta clock tower level, or highway 666, to see what Platinum could do with the PS360, at half the fps... and with a free camera as opposed to a fixed camera (the camera AI rotates based on Bayonetta's and the enemies' positions, instead of staying still to ensure performance).

The game is neither on rails nor open world. It's a beat 'em up/hack and slash like DMC or God of War.

Now do yourself another favor and look up Monolith Soft's 'X' E3 teaser.

THAT one IS open world. Completely seamless open world. And it's just the beta.

 

 

Holy smokes, when you actually watch Bayonetta 1 vs. Bayonetta 2, it's a very noticeable jump.  I understand that Platinum would likely have been able to make Bayonetta 2 look better than Bayonetta 1 even on the 360 due to game engine improvements, but not the type of jump we are seeing on the Wii U.  Platinum is pushing the Wii U.  When you look at the whole package, not just the improved graphics but the improved framerate and likely native resolution, it's a very large bump from what they were able to do with current-gen hardware.




#245641 Exactly how much of a downgrade would a PS4 game suffer on Wii U?

Posted by Goodtwin on 12 September 2013 - 08:03 AM

Is it just me, or is everyone in complete denial?
Look at the day-one releases for the Xbox One and PS4. I'm a Nintendo guy all the way, and I know graphics aren't everything, but come on, they look incredible.
If the Wii U can easily keep up with games like that, why hasn't it?
It will be a noticeable downgrade, and I'd rather they didn't bother and just gave us original games made for the Wii U.
I mean, talk is cheap; go to YouTube and watch some of these new games.
Who remembers when we were all hyped up because Need for Speed: Most Wanted on Wii U had "PC textures"? LOL. I just saw the new Forza on Xbox One being played; now I feel a bit silly...



Need for Speed MW was ported by Criterion in about three months and looks nearly identical to the PC version.  The catch here is that the PC version isn't exactly a game that taxes high-end PC hardware.  So even if Criterion had gotten the game to run like the PC version on Ultra at 1080p, it wouldn't have looked much better than it already does.




So yes, the Wii U is going to be at an obvious disadvantage compared to the PS4/X1, but that isn't to say that developers aren't going to be able to leverage the Wii U's abilities in ways that aren't fully understood just yet.

Wii U eDRAM usage is comparable to the eDRAM in the Xbox 360, but on the Wii U you have enough eDRAM to use it for 1080p rendering.
In comparison, on the Xbox 360 you usually had to render at sub-720p resolutions or in multiple passes.
Even if you don't use MSAA (multisample anti-aliasing), you already need around 16 MB just for a 1080p framebuffer (with double buffering). You simply don't have that with the Xbox 360's eDRAM. As far as I know, Microsoft corrected that issue and also put 32 MB of fast RAM into their new console.
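For what it's worth, that 16 MB figure is easy to verify, assuming a standard 32-bit color format:

```python
# 1080p framebuffer size with double buffering (assuming 32-bit RGBA color).
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4   # 8 bits per RGBA channel
BUFFERS = 2           # front + back buffer

total_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL * BUFFERS
total_mb = total_bytes / (1024 * 1024)

print(f"{total_mb:.1f} MB")  # 15.8 MB, i.e. roughly the 16 MB cited
```

Adding a 32-bit Z-buffer on top pushes the total past 23 MB, which is exactly why the 360's 10 MB of eDRAM forced tiling or sub-720p targets while 32 MB does not.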


I wanted to revisit what Shin'en said because I believe it's relevant to the idea that there are likely resources going to waste.  I have a feeling developers are going to figure out that 1080p is actually pretty easy to pull off on the Wii U.  Because all the buffers are kept in the eDRAM, which is super fast with extremely high bandwidth, a lot of developers are going to find that resolution is less likely to be a bottleneck on the Wii U than it was on the PS3 and 360.  Optimizing the rendering pipeline to maximize the eDRAM's usefulness will likely pay big dividends.  This may be easy for a company like Shin'en because they are in complete control of their game engine.  Remember when Shin'en spoke about proper use of the eDRAM and CPU caches?  If you aren't using them effectively, there is a monumental amount of performance going down the drain.  This may be a very difficult task for the majority of game developers who use an existing game engine.  Is the developer even in control of the CPU caches and the eDRAM when using these middleware options?  Even if they are, does your average third-party developer who is simply porting software have the skills to fully understand how to optimize cache and eDRAM performance?  It's all about getting the most out of the hardware, and it's obvious that the Wii U's peak performance leans very heavily on the caches and eDRAM.  This is not a normal scenario for game engines developed around PC architecture; eDRAM on the GPU, for example, is not used in the PC world.

Again, I am not trying to say that the Wii U is in any way on par with the X1 and PS4; it doesn't match them in outright performance.  Only that there is still plenty of untapped potential in the Wii U.  Now the developers just need to go find it.



