

Stewox's Content

There have been 102 items by Stewox (Search limited from 20-June 20)



#84537 Rumor: WiiU Dev-Kit specs leaked ?

Posted by Stewox on 05 June 2012 - 01:38 PM in Wii U Hardware

I don't know what it means, but I am iffy on it being PowerPC based. POWER6 or POWER7, sure, but why use a six-year-old CPU? (I could be wrong, but they stopped using those around the time the 360 came out).


It's all custom hardware; it's based on that technology, but it's been updated and developed, and we don't have all the details yet. It is certainly not a 2003 design slapped onto the Wii U. We know it's modern hardware, 2009 and above.



#84520 Rumor: WiiU Dev-Kit specs leaked ?

Posted by Stewox on 05 June 2012 - 12:58 PM in Wii U Hardware

All in All ... this is way more than "slightly more powerful than X360/PS3" and crap like that.

Seems fake and weak to me. Current gen 1.5?


How much dumber can you get? You've posted pretty stupid things in the past and I haven't quoted you on those.


Validation:
NeoGAF moderator says it's "accurate": http://www.neogaf.co....79&postcount=2
bgassassin also confirms: http://www.neogaf.co...&postcount=149


It's like half of my post doesn't exist, sheesh.



#84500 Rumor: WiiU Dev-Kit specs leaked ?

Posted by Stewox on 05 June 2012 - 12:03 PM in Wii U Hardware

It's separate NAND memory, so it's not part of the RAM.

On graphics, games will probably look about 3 times better if taken advantage of properly.
I'm not an authority on game console technology; I'm just familiar with how the stuff works, what the numbers mean, etc.
So don't quote me on any of this. :P


Hehe, I had to quote you; I knew people were going to mix things up, as ideaman and his posts weren't totally sure. Not your fault; from what I've read, I had no idea which memory they meant, but from how it sounded, they meant RAM. This is pretty hard to believe technologically, though.

So they were apparently talking about the 512MB RAM reserve, or they thought they were.

There is a separate, non-ideaman rumored spec about a 512MB STORAGE reserve. Seems like you mixed these up.

Now it looks like the STORAGE OS reserve is not taking away from the 8 GB of Flash, but we still have no idea on the RAM side.

Better yet, there might be no 512MB RAM reserve at all, if ideaman on NeoGAF took the previous rumors totally out of context and the majority, who have no idea, followed.

People not knowing what they're talking about is a bigger source of false rumors and speculation than the rumors themselves. Perception, perception, perception.
--------------------------------------------------------------------------------------



WARNING: These specs are from the EARLY target specifications.

http://www.neogaf.co...6&postcount=149

This is pretty encouraging now; it might make the UBISOFT rumor of 2 GB RAM, a 4-core CPU, etc. a bit more believable.

This is based on the old dev kits. As we know, it was hinted that retail would have 1.5 GB RAM, because the dev kits had 3 GB; that was months ago, and even then it was weeks or months old information. Everything we've heard points to the final dev kit getting a bigger specs bump, and that ties exactly into the UE4 rumors of "Nintendo making sure UE4 runs". As silly as that headline is, taken completely out of context, it means: SPECS BUMP. It doesn't only benefit UE4; it obviously benefits everything.

SUBJECTIVE OPINION: To me, this spec sheet looks like it was constructed to PURPOSELY hide the KEY specifications that would determine total performance. These are only the features of the hardware, and ladies and gentlemen: tessellation unit, compute shaders ... this is very good.



#84484 Rumor: WiiU Dev-Kit specs leaked ?

Posted by Stewox on 05 June 2012 - 11:24 AM in Wii U Hardware

Why can't I believe that? Because the CPU the Wii U uses has L1, L2, and L3 cache...


That's why I added the Rumor note.

Oh wait, 512MB is for the Menu.
1.5GB of RAM is good then.

I guess the multiplier would be more like 2× PS3, maybe 2.5×


Wait ... they might adjust that 512 MB OS reserve limit later.



#84482 Rumor: WiiU Dev-Kit specs leaked ?

Posted by Stewox on 05 June 2012 - 11:22 AM in Wii U Hardware

Pretty disappointing.
Seems around PS3 x1.5


It doesn't mean anything ... these specs need to be analyzed. But to the point: the KEY specs are MISSING.



#84470 Rumor: WiiU Dev-Kit specs leaked ?

Posted by Stewox on 05 June 2012 - 11:09 AM in Wii U Hardware

Or it's the end of the NDA and Nintendo just doesn't bother releasing details.

Main Application Processor

  • PowerPC architecture.
  • Three cores (fully coherent).
  • 3MB aggregate L2 Cache size.
  • core 0: 512 KB
  • core 1: 2048 KB
  • core 2: 512 KB
  • Write gatherer per core.
  • Locked (L1d) cache DMA per core.

Main Memory
  • Up to 3GB of main memory (CAT-DEVs only). Note: retail machine will have half devkit memory
  • Please note that the quantity of memory available from the Cafe SDK and Operating System may vary.

Graphics and Video
  • Modern unified shader architecture.
  • 32MB high-bandwidth eDRAM, supports 720p 4x MSAA or 1080p rendering in a single pass.
  • HDMI and component video outputs.
Features
  • Unified shader architecture executes vertex, geometry, and pixel shaders
  • Multi-sample anti-aliasing (2, 4, or 8 samples per pixel)
  • Read from multi-sample surfaces in the shader
  • 128-bit floating point HDR texture filtering
  • High resolution texture support (up to 8192 x 8192)
  • Indexed cube map arrays
  • 8 render targets
  • Independent blend modes per render target
  • Pixel coverage sample masking
  • Hierarchical Z/stencil buffer
  • Early Z test and Fast Z Clear
  • Lossless Z & stencil compression
  • 2x/4x/8x/16x high quality adaptive anisotropic filtering modes
  • sRGB filtering (gamma/degamma)
  • Tessellation unit
  • Stream out support
  • Compute shader support
GFX API
  • GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.
Sound and Audio
  • Dedicated 120MHz audio DSP.
  • Support for 6 channel discrete uncompressed audio (via HDMI).
  • 2 channel audio for the Cafe DRC controller.
  • Monaural audio for the Cafe Remote controller.
Networking
  • 802.11 b/g/n Wifi.
Peripherals
  • 2 x USB 2.0 host controllers x 2 ports each. (2 front, 2 back)
  • SDCard Slot.
Built-in Storage
  • 512MB SLC NAND for System.
  • 8GB MLC NAND for Applications.
Host PC Bridge
  • Dedicated Cafe-to-host PC bridge hardware.
  • Allows File System emulation by host PC.
  • Provides interface for debugger and logging to host PC.
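The spec sheet's claim that 32MB of eDRAM supports 720p 4x MSAA or 1080p rendering in a single pass can be sanity-checked with simple framebuffer arithmetic. This sketch assumes a common 4-bytes-per-sample color format (RGBA8) plus a 4-byte depth/stencil buffer (D24S8); the leak doesn't specify the actual layout, so these byte counts are an assumption:

```python
# Sanity check of the leaked spec: can 32 MB of eDRAM hold a 720p
# 4x-MSAA framebuffer, or a 1080p framebuffer, in a single pass?
# Assumes 4 bytes/sample of color (RGBA8) and 4 bytes/sample of
# depth/stencil (D24S8) -- a common but assumed configuration.

def framebuffer_bytes(width, height, msaa_samples=1,
                      color_bpp=4, depth_bpp=4):
    """Total color + depth/stencil storage for one render target."""
    samples = width * height * msaa_samples
    return samples * (color_bpp + depth_bpp)

EDRAM = 32 * 1024 * 1024  # 32 MB eDRAM, per the leaked spec sheet

fb_720p_4x = framebuffer_bytes(1280, 720, msaa_samples=4)
fb_1080p = framebuffer_bytes(1920, 1080)

print(f"720p 4xMSAA: {fb_720p_4x / 2**20:.1f} MiB, fits: {fb_720p_4x <= EDRAM}")
print(f"1080p:       {fb_1080p / 2**20:.1f} MiB, fits: {fb_1080p <= EDRAM}")
```

Under these assumptions, 720p with 4x MSAA comes to about 28.1 MiB and a plain 1080p target to about 15.8 MiB, so both fit inside 32 MB, which is consistent with the spec sheet's wording.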


Validation:
NeoGAF moderator says it's "accurate": http://www.neogaf.co....79&postcount=2
bgassassin also confirms: http://www.neogaf.co....&postcount=149


www.vgleaks.com



#84456 1 thing about Nintendos E3 thats good is now I can say I TOLD YOU SO.

Posted by Stewox on 05 June 2012 - 10:53 AM in Wii U Hardware

It was 99% confirmed one year ago.

In the Mexican interview, Reggie said "we'll see" with a smile; that's totally pointing toward YES.



#84433 Hardware resource sacrifice for second GamePad

Posted by Stewox on 05 June 2012 - 10:32 AM in Wii U Hardware

I have been screaming about that for about 1 year, and nobody listened. And some people considered me a hater... take that now (the result)... I was screaming about it because I cared.

You know why I considered it bad? Because the resources will be locked for the second controller... It's not like "OK, I AM GOING TO USE 1 CONTROLLER AND GET BETTER GRAPHICS or performance"... nooooooooo, the resources are locked for the 2nd controller (dedicated CPU cores for controllers and dedicated CPU cores for the games, etc). Nintendo didn't listen to the old Nintendo generation (hardcore); they listened to the kids...


Why would they do that?

Developers would go ballistic ... we'll come to the point when they will have to unlock this.

This is a logically, totally stupid thing to do. Sooner or later they will be pressured by a developer. Even at the last E3 it was said that if it's a quiz game, who cares about the FPS; but you don't have to LOCK it to 30 if it's above that.

And he didn't even say whether that is for the CONTROLLER or the TV ... so ... guys, it's probably for the controller :P



#84396 Hardware resource sacrifice for second GamePad

Posted by Stewox on 05 June 2012 - 10:00 AM in Wii U Hardware

Please do not take this out of context
http://www.neogaf.co...=1#post38559141

People have completely taken it out of context:

I don't have time for a deeper explanation now.


- it does not literally mean "30 fps" ... silly ...
- it does not mean every game will be automatically locked at 30 FPS
- it doesn't mean you'll have "half" the FPS on the controller (e.g. 15)


And who the heck is talking about FPS at all? It's as stupid as point 1: the available FLOPS go down, so they'll have to adjust the graphics fidelity and effects.

Please cut the bullcrap; no developer will make games that run at 15 FPS, that's ridiculous. They will optimize and push the hardware to its limits, and they will make great games with one or two controllers; this will be addressed before you get the game into your hands.

Over and out.

EDIT: https://twitter.com/...042661158137858

Looks like they've started this; this is ridiculous. They won't lock the FPS like that for 3rd parties, this is complete bullcrap. Developers will probably revolt if this happens. FPS is content-dependent, and if a developer reaches great levels of optimization, why on EARTH would he want the FPS held at 30 if there's headroom?

I am confident Nintendo will not do something as stupid as an artificial lock, I have that much faith; or, if it's true, they might already plan to lift the restriction.

So ... simply don't take it literally, nor seriously. It's silly; as a matter of fact, it's the silliest thing I have ever heard. That small screen is not going to take 50% of the GPU's total performance, that's ridiculous, unless the GPU is total crap, but that shouldn't happen.



#84337 E3 HYPE THREAD

Posted by Stewox on 05 June 2012 - 09:13 AM in Wii U Games and Software

OMG OMG OMG

SUPERHYPED

LOOK METROID: "Back ... HE's baaaaaaaaaaaaaaack"


OIMGGGGGGGGGGGGGGGGGGGGG HHHHYPEEEEEEEEEEEEEEEEEEEEEEEEE



REGGIE SAID IT OMFG OMFG I EXPLODEDDDD


REGGIES SAID IT .... MY BODY IS READY !!!


YEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEES

I had to close the windows down since i TOTALLY exploded when he said it.

I have HUGE goosebumps .... I am cold ... this was Awe-freaking-some

GAF is probably going nuclear
------------------------------------


then

nothing

well ... it was pretty short

What I wasn't expecting:
- no first-party core games (just one, I think)
- no hardware
- no Iwata
- no big trailer showing actual graphics to the tons of non-tech people

------------------------
Seems like tons of stuff is coming after the conference ... well, I don't have that kind of time to sit around all day ... so ... later.



#83741 Confirmed: WiiU GamePad has independent subsystem and OS

Posted by Stewox on 04 June 2012 - 04:05 AM in Wii U Hardware

I'm not sure, but in some of your early posts you were saying that it wouldn't have a CPU. I remember you shooting my Android rumor down quickly, and one of your reasons was that the Wii U Pad didn't have a CPU. Is this not true? Are you being a hypocrite?


I meant GPU ... the ARM that's inside only displays the standard video output; the whole thing just does that. I have no idea if it's capable of anything more than showing menus, config options and a few animations; it has no power to run anything significant. But devs say the specs are upped, so we will see what the capabilities are, and if hacked the machine could do more. The point is, they haven't focused on anything independent at all; it's just not designed for that. So even if the horsepower might run stuff like Angry Birds, that's just what they want, since they have the console, and that's the genius thing. The tablet is not going to consume that much power: the LCD and the electronics, with the LCD obviously being the top consumer of battery. But I surely want a lot of battery there; I hope it's at least 2000 mAh or up. With an integrated battery and the size of the thing, I really want it to last longer, because decoding and wireless transmission will eat up a big chunk too, not to mention rumble.

Stuff is missing; even the RAM amount is not high enough to fit any sort of big app on there if we tried to force it, so cutting down those costs and putting them into better accessories on the controller and the battery is far more welcome.



#83708 Confirmed: WiiU GamePad has independent subsystem and OS

Posted by Stewox on 04 June 2012 - 12:33 AM in Wii U Hardware

It is logical, because not all big-screen menus would work well on a small screen.


That has really nothing to do with it.

Of course I knew it has a CPU and the other required stuff .. we just never knew if the subsystem was usable by the public in any capacity.



#83673 Confirmed: WiiU GamePad has independent subsystem and OS

Posted by Stewox on 03 June 2012 - 10:59 PM in Wii U Hardware

Confirmed by Iwata.


Just as I speculated there was a possibility for this, the controller definitely has an ARM chip inside.

Currently the only app it runs is the control center.

Don't expect any actual apps or games ... it's probably very limited and designed purely for initial config and maybe firmware updates.


Hopefully we get more helpful stuff at boot; some diagnostic stuff for checking the battery and whether the controller is in range would be cool ... and other things I cannot think of right now.



#83193 WiiU CPU Revealed by Patent: "Enhanced PowerPC 750" ?

Posted by Stewox on 03 June 2012 - 12:19 AM in Wii U Hardware

Well ... seeing actual specs in a patent was a bit ... too good to be true.



#83191 Whats the possibility of the Wiiu Running Unreal Engine 4 with this information?

Posted by Stewox on 03 June 2012 - 12:03 AM in Wii U Hardware

Unreal Engine 4 compatibility is more about architecture than raw power; I'm pretty sure it will run it :)


That's what I've been trying to say for a long time now. It's not just UE4; it's every engine. If idTech5 could run on the X360/PS3, what makes UE4 impossible on the Wii U?



#82776 WiiU CPU Revealed by Patent: "Enhanced PowerPC 750" ?

Posted by Stewox on 02 June 2012 - 06:36 AM in Wii U Hardware

I have a question, if we were to assume this is true: does this mean Nintendo can decide at a later date to unlock the full power of the system for developers? Or once they decide what they want, do they have to stick with that? Isn't this similar to what Sony did with the PSP, unlocking more CPU power for developers later in its lifespan, which is why the God of War game was visually stunning?


There is no reason for them to limit the performance artificially.

It doesn't run on batteries.



#82729 Developer: “We Won’t Be Working On Wii U Due To Its Complexities”

Posted by Stewox on 02 June 2012 - 04:13 AM in Wii U Hardware

1. IGNorants
2. stupid developer
3. low-skill developer
4. noob developer
5. the stupidest statement ever made

Listen to the NAMED developers who are already making games; their statements are the opposite.



#82723 WiiU CPU Revealed by Patent: "Enhanced PowerPC 750" ?

Posted by Stewox on 02 June 2012 - 03:32 AM in Wii U Hardware

Change the title, this is obviously fake, the PowerPC 750 is what the GC/Wii CPU was based on.


Really? Link me to it.



#82532 WiiU CPU Revealed by Patent: "Enhanced PowerPC 750" ?

Posted by Stewox on 01 June 2012 - 01:26 PM in Wii U Hardware

The GPU is way better than expected if VGA ZOL is true. It sounds too good to be true to a lot of people.

It would run the Samaritan demo at 1080p 60 FPS (almost; let's take 0.5 TFLOPS off that for the optimizations possible on consoles).

Remember, the Samaritan demo's 2.5 TFLOP requirement was on PC, a far less optimized platform than consoles, and in 2011... so that could all improve to make it doable.

If this 2 TFLOP thing is true, ... if ...

Sorry, I'm no CPU expert, at least not on IBM.

EDIT:
Going to sleep now; link this thread on GAF if the tech guys there haven't already.



#82517 WiiU CPU Revealed by Patent: "Enhanced PowerPC 750" ?

Posted by Stewox on 01 June 2012 - 12:55 PM in Wii U Hardware

You could have just kept the post and updated it afterwards, so people would not be confused...


I tried to avoid that ... because I reported it elsewhere. The problem was, I had no idea how powerful this IBM stuff even is; I mistook it for POWER7 and thought this was some old patent about something else. It seems PowerPC and POWER are not the same; I came across PowerPC being some kind of old 2001 tech, and in panic mode I just deleted the text.

So now it's okay, fine.

Looks like the 2011 rumor was correct: http://wiiuconcepts....pu-exposed.html

And Vga Zol was also correct on the FOX: http://vga.zol.com.cn/233/2336171.html


So you see, VGA ZOL might be correct on the GPU now also :) ... but I don't think it's 2 TFLOPS; I'll be cautious and give a range of somewhere around 1.5-2 TFLOPS, and let's see what happens.


So ... THIS IS SOME horsing GREAT NEWS GUYS! Other people will provide technical opinions on the CPU; I don't focus on this area much, but if VGA ZOL is to be believed on the GPU, this is MASSIVE NEWS!

What? A 68xx series VGA card? That's 2.7 TFLOPS if they modified it a bit.. WOW that's NEWS... Hopefully it is a 6850.

As for the CPU.. 4 cores and 8 threads.. NO WAY it's 200 GFLOPS.. JUST NO WAY..


You shouldn't take the TFLOPS from the AMD site; didn't I say that already? These names are meant as equivalents, not the actual cards being put in there.



#82510 WiiU CPU Revealed by Patent: "Enhanced PowerPC 750" ?

Posted by Stewox on 01 June 2012 - 12:29 PM in Wii U Hardware

What? A 68xx series VGA card? That's 2.7 TFLOPS if they modified it a bit.. WOW that's NEWS... Hopefully it is a 6850.

As for the CPU.. 4 cores and 8 threads.. NO WAY it's 200 GFLOPS.. JUST NO WAY..


Wait ... it might have been a false report, because the patent is still being read.

I will update the thread when I'm sure.



#82506 WiiU CPU Revealed by Patent: "Enhanced PowerPC 750" ?

Posted by Stewox on 01 June 2012 - 12:17 PM in Wii U Hardware

Hey guys.

Hey guys. I have also sent an extensive email to gonintendo.com and mynintendonews.com; neither has reported on the VGA ZOL Barts rumor spotted by a B3D member and posted by bgassassin on NeoGAF, which was quickly buried by an ignoring public. My point was about the range the GPU could be in, in terms of TFLOPS. I know all these HD 6XXX names are equivalents we use for comparison, but if they use them seriously then they surely have no idea what they're talking about.

VGA ZOL Rumor: http://vga.zol.com.cn/296/2968963.html

Translation here: (from the email I wrote)

Title of the article: Rumored new Wii U specifications, with Nintendo using a Barts chip

First sentence: We have previously reported that game developers asked Nintendo to raise the Wii U's hardware specifications. //COMMENT: Yes, this is true; this made it to NeoGAF, where it was reported as "Nintendo trying to make sure UE4 works" (not literally; Nintendo cannot influence 3rd-party engines directly unless it's a hack/backdoor), which technically means buffing the specs hardware-wise, which in turn means Epic probably pressured them to do so.

Second sentence: Now we have the latest hardware specifications of the console launching this Christmas.

Third: The hardware specification has been updated; the console, whose R&D code name is "Cafe", is due around Christmas 2012.

All about the specs: (I have fixed some of the wording; I changed only what I'm 100% sure of and familiar with, tech-wise)

Wii U processor: 32-bit IBM PowerPC, 4 IBM POWER7 cores, 8 threads, approximately 200 GFLOPS of computing capacity.

Wii U graphics: 40nm AMD Radeon HD 6800-class GPU based on the Barts core, VLIW5 architecture, 1120 stream processors, approximately 2 TFLOPS of computing capacity.

Built-in 2 GB of shared memory, no hard drive, integrated optical drive.
//COMMENT: AHA, no hard drive; Google Translate was bad here. I think this rumor is getting fairly believable ... no hard drive is what was rumored before and expected, and an external HDD will be possible, as Nintendo said.

Taking into account Epic's pressure on Nintendo to have the Wii U support UE4 and DX11, and the hearsay, it is clear that a Barts-level 6800-series graphics card can run UE4 at 1080p.

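For what it's worth, the rumor's "~2 TFLOPS" figure does line up with the standard peak-throughput formula for AMD's VLIW5 parts (stream processors × 2 FLOPs per cycle × clock), if you assume the desktop HD 6870's 900 MHz engine clock. The rumor itself gives no clock speed, so that clock is purely an illustrative assumption:

```python
# Rough check: does "1120 stream processors, VLIW5, ~2 TFLOPS" add up?
# Each VLIW5 stream processor can issue one fused multiply-add per
# cycle, i.e. 2 floating-point operations.

def peak_gflops(stream_processors, clock_mhz):
    # SPs x 2 FLOPs/cycle x clock (MHz) / 1000 -> GFLOPS
    return stream_processors * 2 * clock_mhz / 1000.0

# 1120 SPs as stated in the rumor; 900 MHz is the desktop HD 6870's
# clock, used here only as an assumption.
barts = peak_gflops(1120, 900)
print(f"Peak: {barts:.0f} GFLOPS")  # -> 2016 GFLOPS, i.e. ~2 TFLOPS
```

This suggests the "2 TFLOPS" number was likely read straight off desktop HD 6870 specs rather than measured on any console silicon, which fits the argument below that VGA ZOL copied PC figures.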


THE POINT IS now ... why VGA ZOL looks so unbelievable: because they don't know any of these details, they just looked up the websites of IBM and AMD and copied over the PC hardware specifications. That's stupid, and that's where the 2 TFLOPS, VLIW5 and all the other things came from. I don't know where all of these rumors get the idea of nerfed cores and threads ... compared to what the patent says ("enhanced"), the rumors might be correct on this one, since the patent is too vague ... "enhanced" could mean anything.

VGA ZOL took all of the GPU specs directly from the AMD site, which is very wrong; the console chips are custom ... almost everything is custom and has nothing to do with the PC parts. The sources who make these rumors most probably know they mean an equivalent in TFLOPS; they don't mean the ACTUAL card is in there. For the range ... I don't want to exaggerate, but we already know it's probably "over 1 TFLOP".

We will not know until we have the full specs what was really VGA ZOL's own info and what they copied off the existing specs from company websites. They're pretty stupid for doing this without saying so.


Here IBM: https://www-01.ibm.c..._Microprocessor

This is where VGA ZOL might have got their 32-bit thing ... which was very suspicious to me, but it contradicts what we have in the patent, which states "enhanced" ... I am no rocket scientist, but I don't see how 32-bit would be an enhancement over 64-bit. And it's ridiculous; the Nintendo 64 was called that because of its 64-bit CPU, DUH!



NEOGAF: http://www.neogaf.co...63#post38407963
Patent is here: http://www.google.co...e&q=112&f=false

I haven't seen anyone reveal this yet; someone might check around B3D. But yes, I found it when reading the patent, in the memory and spec stuff that was in the figures.

UPDATE: The DEC 2011 CPU rumors were accurate then (if this is true)
http://wiiuconcepts....pu-exposed.html

UPDATE2: Okay ... I knew something was too good to be true. I was hyped, blame me; I was too quick to report all of it. In the patent you can clearly see "(e.g. an enhanced IBM PowerPC 750)" ... that "e.g." might indicate an example, not the actual specs. So ... 50/50 now; who knows if they were serious or not. So I'll put a ? mark on the thread title.

Actually, the only thing that might back us up now is the previous rumors, but I wasn't focusing on CPU stuff ... the CPU is the least of my problems; I worry about RAM.

UPDATE3: Okay, a bit of a mixup here with the 32-bit VGA ZOL CPU stuff. I really have no idea about IBM's PowerPC line, so I had to go check and learn everything while writing this thread. Forget about that N64 64-bit thing, because it looks like it's not an argument; the Wii and GCN also got PowerPC tech in 32-bit ... I am not sure what's going on with these bit widths, and I don't even know what to expect; if it's 64-bit, good, but I don't really care anymore. It's late here and I'm going to sleep ... I'll see later.


UPDATE4: There seems to be some other spec stuff in the diagram too. "DRAM Main Memory" is very interesting (there might be a lot of it, and DRAM is fast, so it's great news if true), because MAIN memory is usually slower and plentiful. I'm not sure if we're meant to take these diagrams seriously, but still ... it can be true; maybe they meant "main" as in internal, or something. Rumors also reported 2 GB shared, but you know Nintendo, and DRAM is more expensive.



#82065 Rumor: Wii U to support 4 U pads?

Posted by Stewox on 31 May 2012 - 12:43 AM in Wii U Hardware

If this is true I am going to be one of the losers to buy all 4 of them.


3 ... you get one with the system, remember.

One is probably $60-70. The LCD itself probably wouldn't add that much, considering it's just a bigger piece of plastic and a few more buttons, but the NFC, the magnetic sensor, the lag-free wireless capability and the battery will take a big chunk, making it about twice as costly as the WiiMote.

In 1 to 2 years the cost will drop by about $20: the NFC hardware will get $4 cheaper, the LCDs will also get a bit cheaper, and the other misc chips inside will all add up a little. That's about it. Whether Nintendo will lower the price or keep that as profit margin we don't know, but they aren't evil; they always drop the price a bit.



#81918 Rumor: Wii U to support 4 U pads?

Posted by Stewox on 30 May 2012 - 01:19 PM in Wii U Hardware

After this(http://mynintendonew...by-ubisoft-dev/) recent leak, further review of the supposed GPU suggests that the Wii U MIGHT just support 4 U pads.

http://www.amd.com/u...overview.aspx#2

The Supposed GPU states that it has, and I quote, "

  • AMD Eyefinity multi-display technology
  • Native support for up to 5 simultaneous displays
  • Independent resolutions, refresh rates, color controls, and video overlays
  • Display grouping
  • Combine multiple displays to behave like a single large display"
While this isn't any confirmation in the slightest, it IS possible that if this is the GPU, 4 U pads can be used at one time.

What do you guys think? Possible, or not likely in the slightest?



Even if the hardware technically could do that, it's nothing, guys; it's not practical, at least not at the beginning, as they will not focus on the casual market, and 4-player quiz games can wait.

That's that ... the guy just took that info from AMD's site, bwahaha ... a classic mistake. That's a PC card; it has nothing to do with the Wii U ... every rumor has just been using it as a comparison equivalent. lol'd

Even if they retain most features the original chip had ... even with something on top, it's still totally custom.



#81784 Wii u is apparently not supported by unreal engine 4

Posted by Stewox on 30 May 2012 - 05:44 AM in Wii U Hardware

Let's not forget the multiple top-of-the-line PowerMac G5 computers linked together to show tech demos for the Xbox 360 launch...

Optimization cannot be overestimated.

Quickly thrown-together code, and even graphics assets, can be bloated so many times over that it becomes a mess in a hurry.

The Wii U can easily handle the Samaritan demo graphics. Rest easy.


You are correct. The big gap in game quality between the launch games and the truly developed software that comes later in the console's lifecycle will come down to the actual effort in utilizing the console's resources and smart solutions to the technical obstacles.

A consumer cannot possibly tell what is well optimized or not just by watching a demo. As a matter of fact, Frostbite 2 isn't nearly up to par with CryEngine ... let alone the code quality of idTech5, and the fact that BF3 is so buggy doesn't help either. I played the PC version for 3 weeks and I was overwhelmed; one of the buggiest games in years.


The PowerMac G5 is very slow, and pairing 5 of them doesn't make any graphical demo better. Pairing CPUs is only good for batch jobs such as pre-rendered stuff.

And why would Microsoft use Macs for their demo!?

For realtime stuff, it's the graphics card that matters.

The super-optimized version of BF3 runs at low settings, 720p 30fps, on PS3, while a PC with a midrange card runs it at high settings, 1080p 60fps. Optimization can't shrink something that takes 3 GTX 580s down to one 4870; that is just plain impossible. Of course the Wii U may be able to run it at 480p 30fps or something, but that would take a lot of optimization work.


You are wrong twice over. First of all, you don't have the BF3 source code, so you can't comment on how well it's optimized, and relative to the PC version it's not even the same build of the Frostbite 2 engine; different levels of quality exist in each of these versions, as different teams work on them and programmer experience on each platform differs. This is common in the industry and depends on how effectively the company adjusts resources between tasks; most companies only have one senior programming team that has to direct and oversee all of the different platform-specific builds/editions of the engine. It's not the same engine code pasted onto whatever platform; that's silly, it simply won't work. Those are the basics, and you can find it all on the net. Proprietary code is secret, and anyone can brag about how good it is without having to prove it.

But the source of your second mistake is quite simple ... the usual beginner assumption.
No, the Samaritan demo doesn't require 3 GTX 580 PC GPUs; that was just the system used to run the demo, which is a colossal difference in context.
Playable status for the Samaritan demo requires around 1.1 TFLOPS of performance; depending on the platform and optimizations this varies. Consoles allow game developers full hardware access; PCs don't. For Windows PCs to achieve the same level of performance in a test demo, they require more raw power to offset the overheads ... please stop comparing these two very different worlds like it's a walk in the park, and throwing numbers around like it's a horse race.
My estimates over time are that PC software loses about 40-60% of its performance into heat/thin air because of abstraction, software code layers, proprietary APIs, different memory architectures, proprietary drivers, and OS code quality. In some areas that percentage might be even higher. Why all this? Simply because hardware vendors don't put a lot of effort into their support software and/or firmware and lack updates; they aren't game developers. In the console space, game programmers are able to override and bypass the API and essentially write their own drivers and interaction with the hardware; otherwise nobody would buy consoles, they would run like crap.
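As a rough sketch, the two figures quoted in this thread, ~1.1 TFLOPS needed for a playable Samaritan on a console versus the ~2.5 TFLOPS the 2011 PC demo rig represented, imply an overhead inside exactly that 40-60% band. Both inputs are forum/rumor estimates, not measurements, so this is only a consistency check:

```python
# Implied PC-vs-console overhead from the two numbers quoted in this
# thread. Both are claimed estimates, not benchmarks.

console_tflops_needed = 1.1  # claimed console requirement for Samaritan
pc_tflops_needed = 2.5       # claimed PC requirement (3x GTX 580 rig)

# Fraction of raw PC throughput "lost" relative to the console figure.
overhead = 1 - console_tflops_needed / pc_tflops_needed
print(f"Implied PC overhead: {overhead:.0%}")  # -> 56%, within 40-60%
```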

Carmack from id Software didn't offer any calculations or details in the interview; he only used an adjective: if PCs had the same abilities as consoles, games would run "significantly faster".


That makes sense. I agree, it might come to the Wii U since it's powered by the same engine as Rage. There's no doubt that the Wii U will be at least slightly more powerful than current gen, so I assume it will run both Rage and Doom 4 just fine.
I'm also excited about Doom 4. I've always been a huge fan of id Software. They know how to make groundbreaking stuff.

May we expect to catch a glimpse of Doom 4 during E3?


idTech5 was updated past Rage's release date; Doom 4 is getting new features, and new rendering engine code is going to be written. In addition, Rage was not targeted at graphics but at performance, while Doom 4 is going to be 3 times more graphically rich and will run at 30fps in single-player on consoles; at the time there was no talk about the next generation.

Here's what we know about Doom 4; however, they haven't released any more info beyond that.
http://forums.bethsoft.com/topic/1334656-doom-4-list-of-known-technical-features/

This shows you how engine names are totally meaningless, as the code changes from game to game. It will still be called idTech5, but for Carmack it's still just a modification ... no big leap. idTech6 will have stuff like ray tracing and more, which is considered a big leap.



