I doubt Sony would throw GDDR5 in there if the latency was going to pose a problem.
As for why PCs don't have GDDR5 for system memory: I don't think latency is the big issue, it's more that a PC has no use for system RAM with that kind of bandwidth. It's generally the GPU that needs high bandwidth, and since all the good GPUs connect over PCIe, they wouldn't have access to system RAM at full speed anyway, which is why they have their own pool of GDDR5 memory on the card itself.
As the PS4 is a fixed hardware design that doesn't need to cater to the wildly varied PC workloads, it can come at the problem from a completely different angle to PCs. Both the CPU and GPU are able to access that single pool of memory directly in a useful fashion, and since you don't have to worry about a bloated desktop OS running in the background, any latency issues can be mitigated.
A big bottleneck on PC has always been having to copy data from system RAM into GPU RAM before the GPU can do anything useful with it. The PS4 would seem to eradicate that problem entirely.
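To put a rough number on that staging copy, here's a back-of-the-envelope sketch. The 176 GB/s figure is the PS4's published GDDR5 spec; the 8 GB/s PCIe throughput and the 1 GB asset size are illustrative assumptions, not measurements:

```python
# Rough back-of-the-envelope numbers (assumptions, not benchmarks):
# a discrete GPU has to receive data over PCIe before it can touch it,
# which is the step a unified memory pool skips entirely.

asset_gb = 1.0        # hypothetical 1 GB of texture/asset data
pcie_gbps = 8.0       # assumed effective PCIe 2.0 x16 throughput
unified_gbps = 176.0  # PS4 unified GDDR5 peak bandwidth (published spec)

copy_ms = asset_gb / pcie_gbps * 1000
print(f"PCIe staging copy: ~{copy_ms:.0f} ms before the GPU can start")
print("Unified pool: no staging copy; CPU and GPU address the same RAM")
```

Even with generous PCIe numbers, that upload is pure overhead that the PS4's shared pool simply doesn't pay.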
There is every possibility that PCs would in fact be better off designed like the PS4. But in doing so you limit the user to the graphics chip and RAM that came with their PC, when the whole point of buying a PC is being able to pick and choose.
Yes, the RAM could be modular, but the standards for DIMMs are set based on what is logical for the market as a whole, not just the small segment that is gaming. Can you imagine how much it would cost if they had to make gaming-specific chipsets that support GDDR5 DIMMs? Then there's the cost of those DIMMs themselves; since they wouldn't be sold in anything like DDR3 quantities, they would cost many times more.
You're right, no PC has a need for system RAM with great bandwidth but crap timings. PCs would perform like crap if they came with GDDR RAM on the main bus. Think about the different types of operations a CPU and a GPU perform, and why one needs RAM with the least amount of latency possible, while the other cares most about raw bandwidth and will accept the latency that comes with a super high frequency and voltage. I could turn my system RAM into pseudo-GDDR5 by turning the frequency way up and the timings way loose in the BIOS, but that would be very bad for performance; anyone would rather take the MHz hit for tighter timings.
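The frequency/timings trade-off above comes down to simple arithmetic: timings are quoted in clock cycles, so the wall-clock latency is cycles divided by the memory clock. A minimal sketch (the CL values below are illustrative examples, not taken from any specific kit):

```python
# Absolute CAS latency in nanoseconds = CAS cycles / memory clock.
# Timings are specified in cycles, so loosening them at the same clock
# directly increases real-world latency; cranking the clock with much
# looser timings can leave you no better off.

def cas_ns(cas_cycles, data_rate_mts):
    clock_mhz = data_rate_mts / 2  # DDR: I/O clock is half the transfer rate
    return cas_cycles / clock_mhz * 1000

tight = cas_ns(9, 1600)   # DDR3-1600 CL9  -> 11.25 ns
loose = cas_ns(11, 1600)  # same clock, looser CL11 -> 13.75 ns
print(f"CL9  @ 1600 MT/s: {tight:.2f} ns")
print(f"CL11 @ 1600 MT/s: {loose:.2f} ns")
```

That's why "take the MHz hit for tighter timings" can win for latency-sensitive CPU work: the cycle count drops faster than the clock does.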
The PS4 (and the XBOX720) are going to be running a LOT of non-game code at all times, and those tasks are not suited to their RAM setup. No one is saying it isn't possible to run the system that way; it can and will be done, it just isn't optimal. DDR3 would perform MUCH better as system RAM. I'm actually more impressed by the rumored 720 memory setup, if that SRAM is indeed embedded on-die. DDR3 would cost less and perform better for general system tasks.
The PS4 OS will in fact be running a lot of overhead in the background, even during gaming. Latency will be an issue, but not one consumers will ever notice. It's great to have mega bandwidth until you lose half of it to wasted cycles, because the CPU is constantly sending fetch and park requests to RAM, making the GPU wait that little bit longer to use it. It wasn't a matter of Sony putting something in there that would be an issue; they took the easy route and probably lowered the frequency considerably so they could tighten the timings a little, because loose timings can hurt game performance.
So that would explain why it's so easy to port the PC versions of games over to the Wii U, like NFS Most Wanted.
Interesting concept, you have a point there.
Right, and why the 360 ports on Wii U don't run very well and freeze on occasion. They are CPU-centered, and while the Wii U CPU holds up quite well given that the system isn't designed for what they are doing, they would have seen much better performance if they had taken the time to do what Criterion did. As it was, though, Nintendo wasn't there for a lot of the third-party devs when they were doing the ports, the tools weren't mature enough, and the hardware wasn't finalized until shortly before launch.