The Wii U clearly doesn't use CUDA. But I have seen a few things. I know the Wii U has three shader varieties: vertex and pixel, of course, and then geometry. Geometry shaders are far more advanced than anything the PS3/360 generation has... and can actually be used fairly flexibly for compute... but they're a bit dated compared to actual compute shaders.
Geometry shaders can indeed be used for their namesake. In fact, they're particularly good at polygon subdivision (tessellation)... However, with how much of its hardware the Wii U has dedicated to them, using them for the whole geometry show would be a really bad idea. The large bulk of the Wii U's geometry should still come from the CPU.
What the GPU can do from there is further subdivide those raw CPU-made polygons, and smart use of this is going to be pretty key. Adaptive tessellation is a DUMB use of ANY manner of tessellation, and the fact that it's somehow become synonymous with tessellation itself irritates me to no end.
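To make "further subdividing those CPU-made polygons" concrete, here's a minimal sketch of one level of midpoint subdivision (each triangle split into four), written as a CUDA-style compute kernel purely for illustration. The Wii U's GPU isn't CUDA-capable and its real API isn't public, so every name here (Tri, subdivide_tris, etc.) is made up:

    #include <cuda_runtime.h>

    struct Vec3 { float x, y, z; };
    struct Tri  { Vec3 a, b, c; };

    __device__ Vec3 midpoint(Vec3 p, Vec3 q) {
        return { (p.x + q.x) * 0.5f, (p.y + q.y) * 0.5f, (p.z + q.z) * 0.5f };
    }

    // One level of midpoint subdivision: each input triangle becomes four.
    // Hypothetical names throughout; not any real console API.
    __global__ void subdivide_tris(const Tri* in, Tri* out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        Tri t = in[i];
        Vec3 ab = midpoint(t.a, t.b);   // edge midpoints
        Vec3 bc = midpoint(t.b, t.c);
        Vec3 ca = midpoint(t.c, t.a);

        out[i * 4 + 0] = { t.a, ab, ca };   // three corner triangles...
        out[i * 4 + 1] = { ab, t.b, bc };
        out[i * 4 + 2] = { ca, bc, t.c };
        out[i * 4 + 3] = { ab, bc, ca };    // ...plus the center one
    }

Launched as subdivide_tris<<<(n + 255) / 256, 256>>>(d_in, d_out, n), that quadruples the triangle count in one pass, which is roughly the job a geometry or tessellation stage does in fixed-function form.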
As for the PS4/Xbone being pretty wasteful, I have to agree... on the CPU end. Those netbook-class Jaguar cores aren't that great to begin with, and they rely exceedingly heavily on concurrency, when not everything (particularly things that show up pretty often in gaming) can actually benefit from parallelization. They also have at least one core taken away for non-gaming duties. The only thing they have going for them over that roided-out tri-core 750 is an actual SIMD engine... and I'm becoming less and less convinced there's a huge difference core vs. core. Whatever floating-point enhancements Nintendo made to the 750 for the Cube, and then evolved into what's on those Espresso cores, seem to be holding their own.
It can run Bink 2, which is, as advertised on the RAD Game Tools website, 85% SIMD, at full 1080p resolution. Whatever they did to that FPU is taking care of business just fine.
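On the "not everything benefits from parallelization" point, Amdahl's law is the back-of-the-envelope version: speedup = 1 / ((1 - p) + p/n), where p is the fraction of the work that parallelizes and n is the core count. If, say, 70% of a frame's work parallelizes (an illustrative figure, not a measured one for any of these consoles), six usable cores give 1 / (0.3 + 0.7/6) ≈ 2.4x over a single core; the serial 30% dominates no matter how many extra netbook cores you throw at it.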
On the GPU end, though, it's pretty overpowered by the Xbone/PS4. Those GCN cores don't just greatly outnumber the Wii U's units; they're also considerably more efficient than the VLIW5 design Nintendo is using... unless Nintendo has created some custom system, the likes of which the world has never seen, for swapping out VLIW workloads that are under-utilized or waiting on dependencies.
And putting the entire system on a single piece of silicon makes for some nice efficiency as well.
I was just trying to say that I hope Nintendo has moved in a direction that lets them use the newer GPU programming techniques that have become common in gaming, such as using CUDA to program the GPU directly instead of having to pass all the code from the OS to the GPU.
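For anyone curious, "programming the GPU directly" in CUDA looks roughly like the sketch below: a kernel runs one thread per data element, with no per-element round trip through the CPU. (CUDA is Nvidia-only, so none of this applies to the Wii U's AMD part; it's just the general shape of GPU compute.)

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread adds one pair of elements.
    __global__ void add(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        float *a, *b, *c;
        cudaMallocManaged(&a, n * sizeof(float));  // unified memory: visible to CPU and GPU
        cudaMallocManaged(&b, n * sizeof(float));
        cudaMallocManaged(&c, n * sizeof(float));
        for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

        add<<<(n + 255) / 256, 256>>>(a, b, c, n); // launch ~1M GPU threads
        cudaDeviceSynchronize();

        printf("c[0] = %.1f\n", c[0]);             // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }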
I totally agree that making a successful system comes down to software, not just hardware. Dreamcast, how I wish you were still here…
I will bow out of this conversation since you're going a little too far into the nuts and bolts of GPU/GFX tech for my limited knowledge. I will leave with just saying: let's cross our fingers that Nintendo starts releasing its big 1st-party titles and not just doling them out little by little.