The GPU will not be a 7970, for starters; at the very most we're looking at a 7770. Sony is big on putting out numbers that tend not to hold up in real-life situations, so I wouldn't even count on that at this point. They claimed the RSX had 1.8 TFLOPS as well.
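For context, those marketing TFLOPS figures are just the theoretical peak: shader count times clock times ops per cycle, assuming every shader issues a fused multiply-add every cycle, which never happens in a real game. A quick sketch (the 7770 figures below are approximate public specs used for illustration, not measured performance):

```python
def peak_tflops(shader_units: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical peak throughput: every shader issuing a
    fused multiply-add (2 FLOPs) on every clock cycle."""
    return shader_units * clock_ghz * ops_per_cycle / 1000.0

# Radeon HD 7770: 640 shaders at roughly 1.0 GHz (approximate public spec)
print(round(peak_tflops(640, 1.0), 2))  # 1.28
```

That 1.28 TFLOPS is a ceiling, not a promise; sustained throughput in real workloads is always a fraction of it, which is exactly why paper numbers mislead.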
You are right in assuming that they'll run into hardware bottlenecks long before memory bottlenecks; where those hardware bottlenecks are is anyone's guess at this point.
The reason MS went with a lower spec, and even Nintendo went with a lower spec, is that much of that additional power will be wasted. The PS4 having more power will not show on screen because of the diminishing returns we're seeing in the GPU space, which is why GPU makers just add more shaders linearly, along with more RAM, for their more powerful cards. Those cards are MUCH more power hungry and run MUCH hotter, yet produce maybe a few more frames than a midrange or budget card. The benefit of too much power in a closed system like this is limited; you have to find a balance that works.
Exactly. Microsoft is an OS company, and that shows through brilliantly on the XBONE.
Nintendo is a game company; OSes and online infrastructures aren't in their natural wheelhouse, though I'm sure they're learning very quickly to make them their wheelhouse.
That is what perplexed me about the PS4 and GAF proclaiming this 7970M theory. Of course, the rumors of both machines using 7770 variants were out there for a long time before this, which had me believing that the Wii U would be a no-brainer with regard to multiplats. However, GAF, as you suggested earlier, appears to be something like GameFAQs for wannabe tech heads, with a microscopic percentage of actually knowledgeable people.
Your GPU comment appears accurate, especially given AMD's roadmap and Nvidia's releases. We are not really seeing quantum leaps anymore, only newer, slightly more efficient ways to do the same thing slightly faster. The power draw may drop slightly on some cards, yet increase on others; it depends on whether it is a rebrand or an actual reconfiguration. In either event, it is as though a new GPU in CrossFire or SLI is good for 3 years before there is even a reason to upgrade.
I'm going to go ahead and bring up the argument I used against a friend. We are not at the point of diminishing returns. Console makers spec their consoles lower so they don't lose as much money, and then they want you to believe otherwise. Until games are rendered down to a molecular level, there will always be room for improvement.
But, the TFLOPS and cores! Honestly, the scary thing about pushing graphics up on a linear scale will be production costs, and an inevitable situation where you get a few clones of 3 genres. To EA's credit, they tried some new things with Mirror's Edge, Dead Space, and a few others in 2009, but that was probably not the best time to do it.
If anything, those iOS games and Nintendo's indie initiative are probably very good things. Treating tech as a prop is, quite possibly, the best thing the industry can do. After all, when you watch a movie, do you go on a forum and argue about the technicalities of one movie's special effects vs. another's? Games and systems should, hopefully, reach that same standard: art.
First question: the OS. Why was it in such bad shape? Honestly, Nintendo sucks at OS creation. The Wii was very basic, and the Wii U was an upgrade to its design. I think they should use Linux, but they haven't listened yet.
As for the developer tools, Nintendo had no excuse. I don't know why they didn't give proper help to developers. They learned from that mistake, but it was too late.
Hopefully that helps.
After seeing routerbad's comments, I am not sure MS went with a lower spec. If the machine is a 7770-class GPU with the same CPU, and the difference comes down to GDDR5 memory on one side vs. embedded RAM plus the same DDR3 on the other, with a similar GPU/CPU setup, I am not so sure how the PS4 has a real advantage. It would be like giving a budget card 8 gigs of RAM when it could not possibly use it. I do agree about diminishing returns. Sitting on a couch, 4-10 feet away, with a controller will limit the need for a constant 60 fps.
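The "8 gigs a budget card can't use" point can be made concrete: capacity and bandwidth are different things, and peak bandwidth is just bus width (in bytes) times the effective data rate. A sketch with illustrative numbers (the bus widths and transfer rates below are assumptions for the example, not confirmed console specs):

```python
def bandwidth_gb_s(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Peak memory bandwidth: bytes moved per transfer times
    effective transfers per second (in GT/s)."""
    return bus_width_bits / 8 * effective_rate_gtps

# 256-bit GDDR5 at 5.5 GT/s vs. 64-bit DDR3 at 1.6 GT/s (illustrative figures)
print(bandwidth_gb_s(256, 5.5))  # 176.0 GB/s
print(bandwidth_gb_s(64, 1.6))   # 12.8 GB/s
```

The takeaway is that a narrow bus can't feed a big pool of RAM fast enough for the GPU to exploit it, which is why raw capacity alone doesn't settle the argument.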
I can see the potential with the OS. I really want to see the whole thing get as fast as it is when going from in game to the browser. I think this is possible. Do you?


