@3Dude - I don't think AMD is the problem. They should have lowered the graphics performance, gone quad core, and raised the clock speed to a crazy level.
AMD's not the problem, just their current product line after the Bulldozer snafu.
That's not going to be easy in a closed box. I can see why Sony and MS rejected that path this time, looking at the abysmal failure rates of PS3s and 360s, and the electricity bills that came with them... in today's economy.
Another problem with going for 'crazy clock speeds' is that you need to add CPU pipeline stages to get there, and that has its pros and cons.
Jaguar already lengthened its pipeline over Bobcat to support higher clock speeds, for a 10-15% performance improvement via clocks. That also increased the branch mispredict penalty to 14 cycles wasted every time a prediction misses. If you want to push clock speeds 3-4x higher, you're going to need to lengthen that pipeline even further, and the mispredict penalty skyrockets with it; soon you're looking at up to 100 cycles lost per miss. Out-of-order execution helps, but those misses stack up, and fast.
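To see why that matters, here's a rough back-of-envelope model of how the mispredict penalty eats into throughput as the pipeline gets deeper. The branch density, mispredict rate, and baseline CPI below are illustrative assumptions I picked for the sketch, not measured figures for Jaguar or any real chip.

```cpp
// Back-of-envelope: effective CPI vs. branch mispredict penalty.
// All constants are assumed/illustrative, not measured values.
#include <cstdio>

int main() {
    const double branches_per_instr = 0.20;  // assume ~1 branch every 5 instructions
    const double mispredict_rate    = 0.05;  // assume 5% of branches are mispredicted
    const double base_cpi           = 1.0;   // idealized cycles per instruction

    // Compare a Jaguar-like 14-cycle penalty against deeper, faster-clocked designs.
    const int penalties[] = {14, 30, 60, 100};

    for (int penalty : penalties) {
        double effective_cpi = base_cpi + branches_per_instr * mispredict_rate * penalty;
        printf("penalty %3d cycles -> effective CPI %.2f (%.0f%% slower than ideal)\n",
               penalty, effective_cpi, (effective_cpi / base_cpi - 1.0) * 100.0);
    }
    return 0;
}
```

Even with generous prediction accuracy, the deeper designs lose a big chunk of their clock-speed gains right back to wasted cycles.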
Oh, HERE!
I'm sure you will have fun with this.
http://develop.scee....GC2013Final.pdf
Sony actually did a pretty good job bypassing most of the bottlenecks that would otherwise appear. By removing the DX overhead and using multithreaded rendering, they've eliminated a lot of the bottlenecks they would normally run into. For example, they can push draw calls out the wazoo compared to PC.
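The multithreaded-rendering part basically means each worker thread records its own command list with no shared locks, and the main thread hands them all to the GPU in one go. Here's a minimal sketch of that idea; CommandList, recordDraw, and submit are hypothetical placeholders, not the actual PS4 (GNM) or D3D API.

```cpp
// Minimal sketch of multithreaded command recording.
// The types and functions here are hypothetical, for illustration only.
#include <thread>
#include <vector>

struct DrawCall { int meshId; int materialId; };

struct CommandList {
    std::vector<DrawCall> commands;
    void recordDraw(int mesh, int material) { commands.push_back({mesh, material}); }
};

// Each worker records into its own list, so there is no locking on the hot path.
void recordChunk(CommandList& list, int firstMesh, int count) {
    for (int i = 0; i < count; ++i)
        list.recordDraw(firstMesh + i, /*materialId=*/0);
}

void submit(const std::vector<CommandList>& lists) {
    // A real engine would hand these buffers to the GPU in order;
    // this stub just keeps the sketch self-contained.
    (void)lists;
}

int main() {
    const int numThreads = 4;
    const int drawsPerThread = 10000;

    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;

    for (int t = 0; t < numThreads; ++t)
        workers.emplace_back(recordChunk, std::ref(lists[t]),
                             t * drawsPerThread, drawsPerThread);
    for (auto& w : workers) w.join();

    submit(lists);  // one cheap submission once all lists are built
    return 0;
}
```

The point is that the per-draw cost on the CPU side stays tiny and scales across cores, which is why consoles can issue far more draw calls per frame than a driver-heavy PC stack.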