I wrote about optimization... But that's what we got now. I'm not an idiot.
Can you please quote only the part you're replying to?
It's probably one of the better cards out there. My computer's got the 6750, and I can run a ton of different games with no problems. Of course, the 6770 isn't top of the line either, but it's an affordable card that will be good for the next couple of years at least.
Honestly, I don't believe this. Why would a dev risk losing their job and a lot of money, all to tell us this info?
It's the best-case scenario for the Wii U - it won't be better than this. People still act like the PS4 and X720 are going to have much better hardware ... the X720 probably won't, but we don't know about the PS4.
It's over 1 TFLOPS, just as the AMD insider said ... it's 1.3 TFLOPS ... 5+ times more than the X360 and 20% less than the X720; the X360 has 240 GFLOPS, or 0.24 TFLOPS.
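Here's a quick back-of-envelope check of those ratios (all figures are the rumored/quoted numbers above, nothing confirmed):

```python
# Back-of-envelope check of the rumored FLOPS figures.
# All values are from the leak/speculation above, not confirmed specs.
wiiu_tflops = 1.3          # rumored Wii U GPU throughput
x360_tflops = 0.24         # Xbox 360 (Xenos), ~240 GFLOPS

ratio_vs_x360 = wiiu_tflops / x360_tflops
print(f"Wii U vs X360: {ratio_vs_x360:.1f}x")            # ~5.4x, i.e. "5+ times more"

# "20% less than X720" would imply a rumored X720 figure of:
implied_x720_tflops = wiiu_tflops / (1 - 0.20)
print(f"Implied X720: {implied_x720_tflops:.2f} TFLOPS")  # ~1.63 TFLOPS
```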
Sometimes I feel it's too good to be true ... the list really does look like it was pulled together from all the web speculation.
In response to the bolded part? A CPU running at 4 GHz inside a small console box would melt the console. A Nintendo console priced at $450 when their most expensive home console ever was the Wii ($250)? Complete BS.
Those arguments are invalid.
I already explained about the price, read my posts.
I have yet to explain the only thing left, the CPU.
Again, pretty easy: GHz alone doesn't mean anything, it's not comparable with PC hardware, and there's nothing special about 6 GHz either. We know NOTHING about the chip yet except the possible family and what it was based on. The age when GHz meant something is over; architecture, build quality, stability, features, cache, bandwidth and process node are more important than raw GHz.
We have no details about the CPU. As long as it isn't overheating and isn't drawing more power than they would like, it can be 6 or 4 GHz; the number alone doesn't mean anything, and it doesn't prove anything either. Maybe one dev kit ran at 6 GHz for testing and the final chip will be clocked down to 4 GHz so it can be cooled better ... those are all theories, we don't know. I am building on the information we have, I am not making stuff up, and these are the most accurate speculations. You need knowledge, and most people don't have it and shouldn't even participate in these debates. I wouldn't even be on this forum if I could post on NeoGAF, and I'm posting on Beyond3D as well, where the actual tech community is.
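As a toy illustration of why clock speed alone says little about heat: dynamic CPU power scales roughly as P ≈ C·V²·f, so voltage and process matter at least as much as frequency. The numbers below are invented placeholders, not Wii U specs:

```python
# Toy illustration of dynamic power scaling: P ~ C * V^2 * f.
# All numbers here are invented placeholders, NOT real console specs.
def dynamic_power(capacitance_f, voltage_v, freq_hz):
    """Approximate dynamic switching power in watts."""
    return capacitance_f * voltage_v ** 2 * freq_hz

# A hypothetical chip at 4 GHz on a lower voltage can draw less power
# than a hypothetical 3 GHz chip on a higher voltage.
print(dynamic_power(1e-9, 1.00, 4e9))  # 4.0 W  (4 GHz, 1.00 V)
print(dynamic_power(1e-9, 1.25, 3e9))  # ~4.7 W (3 GHz, 1.25 V)
```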
Instead of focusing on the data and searching for additional clues, I'm wasting time trying to explain the basics to the noobs. Well ... this is my last post.
EDIT: The big point is ... you need to find other reasons to call it fake.
I can pretty much see it from the get-go, but the question is who made this list, because there simply is nothing too wrong with the raw info.
The overall scope of the info needs to be taken into account. The fact that he provides a release date and price is suspicious, but I just wanted to make it clear that it might not all be confirmed knowledge; it might be the developer's OWN speculation.
The other thing is ... it's a one-in-a-million case. It would have to be some pretty stupid employee to leak and blurt out this bunch of stuff, especially in this kind of language, or he was trying to be as believable as possible, no idea. He might have meant the equivalent GPU or something ... I don't really focus on trying to find out whether it's fake or not.
Still, I pretty much don't care about the validity of the poster. I don't care about the release date, I don't care about the price, I am all about the tech, and if the data is not suspicious and is in line with our findings and expectations, then I'm going to defend it regardless of the origin of the source.
If it is fake then so be it, but that DOESN'T MAKE the tech info less valid.
Fake or not: I never care about price or release date. The only people who care most about those are the non-techs; they care about those things and the games. I never care about it, it's not important to know: great games will come, no question about it, it's pretty clear from the get-go, and I know they've shown only 10%. You need to think, people, start using your brains. It was clear UE4 is coming to the Wii U, and it is clear that many 3rd-party games are coming to the Wii U, but personally I don't even care about 3rd parties; I won't play them on the Wii U. I have Nintendo for first parties and the PC for everything else. Tech-wise, the most important thing is how much RAM the machine has; that's life or death. Beyond that, it's all about the tech:

- The RANGE of the wireless controller: if you can carry it around the house it's perfect. I don't want to walk 5 meters into the kitchen and lose the signal.
- The battery shouldn't be el-cheapo and run out of juice in 3 hours.
- USB 3.0 support, so it won't take EONS to transfer data and load games from an external HDD. USB 2.0 is too slow: 35-40 MB/s max depending on chip quality, while USB 3.0 is 400-600 MB/s. A typical (external) HDD achieves 70-90 MB/s, and better models such as my WD1002FAEX can sustain 126 MB/s (see the rough transfer-time sketch after this list).
- Analog 5.1 sound output via the Wii U Multi AV out connector, not just HDMI, because I have my own 5.1 surround speaker set; over HDMI I can only connect to the crappy HDTV speakers, which are stereo.
- The Wii U Zapper bundled with the games that use it, or with the system at launch.
- An Ethernet port on the console instead of having to use a USB adapter.
- Being able to watch YouTube without the web browser crashing and running out of memory.
- A touch screen that is up to standard, so it isn't laggy and unresponsive like many tablets, with touch detection that isn't buggy or prone to missing gestures.
- The console and games supporting 16:10 aspect ratio, not just 16:9.
- The console displaying relevant system info, such as the resolution games are running at at any given time.
- The controller's LCD being able to be disabled completely, and its resolution being adjustable by developers. They could dynamically adjust it to whatever suits the gameplay at any given moment, so the controller won't draw resources when it's not needed: when you're reading something you can max out the LCD resolution because there is little activity on the main HDTV screen, but when you're in action you're looking at the main HDTV and don't need the controller's LCD to display maximum resolution, or it could be disabled temporarily (a black screen with some info or a "standby" message ... it's up to the developer, the possibilities would be endless).
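For the USB point, here's a rough sketch of what those throughput figures mean in practice. The 20 GB game size is just an assumed example; the speeds are the ones quoted above:

```python
# Rough transfer-time comparison using the throughput figures quoted above.
# The 20 GB game size is an assumed example, not a real title.
game_size_mb = 20 * 1024   # 20 GB in MB

for label, mb_per_s in [("USB 2.0 (~35 MB/s)", 35),
                        ("Typical external HDD (~80 MB/s)", 80),
                        ("WD1002FAEX over USB 3.0 (~126 MB/s)", 126)]:
    minutes = game_size_mb / mb_per_s / 60
    print(f"{label}: {minutes:.0f} min")
# USB 2.0: ~10 min, typical HDD: ~4 min, fast HDD on USB 3.0: ~3 min
```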