Say a developer comes along and decides to port a PS4 game to the Wii U. How big a difference would there be between the two versions? Would there be framerate drops, screen tearing, longer load times, N64-like textures in some spots? How much would need to be sacrificed for the Wii U to handle it?
Exactly how much of a downgrade would a PS4 game suffer on Wii U?
#1
Posted 08 September 2013 - 07:30 AM
#2
Posted 08 September 2013 - 07:39 AM
I'll let you know the moment the PS4 comes out and if there's a port of one of its games coming out on the Wii U.
#3
Posted 08 September 2013 - 07:45 AM
- Szayel likes this
#4
Posted 08 September 2013 - 07:45 AM
Let's take Battlefield for example: you have a giant map filled with buildings, grass, detail, etc. The only real thing taken away would be the amount of grass on the field (the ground clutter), plus toning the detail down a tiny, tiny bit. Overall, it wouldn't affect your personal gameplay experience unless you're a graphics whore. The game would run just as well as on the other consoles, just with slight alterations.
But for less detailed games like fighting games, there wouldn't be any downgrade at all, since there isn't really much there to begin with.
Edited by Chrop, 08 September 2013 - 07:48 AM.
Well, I've finally found my Starfox, and I love it.
#5
Posted 08 September 2013 - 07:59 AM
But for less detailed games like fighting games, there wouldn't be any downgrade at all, since there isn't really much there to begin with.
That's not necessarily true. Look at games like Mortal Kombat 9 and Dead or Alive 5. The ports they received on the Vita took a massive hit in the graphics department. Fighting games are becoming very detailed.
#6
Posted 08 September 2013 - 08:05 AM
That's not necessarily true. Look at games like Mortal Kombat 9 and Dead or Alive 5. The ports they received on the Vita took a massive hit in the graphics department. Fighting games are becoming very detailed.
You do realize the Vita is a handheld, right? Ya know, handhelds are normally weaker than consoles and thus receive weaker ports. The Wii U is a console almost as good as the PS4/One hardware-wise.
The Vita is slightly more powerful than the Wii but a lot less powerful than the PS3.
- Structures and itsdavebaby like this
Well, I've finally found my Starfox, and I love it.
#7
Posted 08 September 2013 - 08:14 AM
You do realize the Vita is a handheld, right? Ya know, handhelds are normally weaker than consoles and thus receive weaker ports. The Wii U is a console almost as good as the PS4/One hardware-wise.
The Vita is slightly more powerful than the Wii but a lot less powerful than the PS3.
For a handheld that has been called a powerhouse by many, you'd think it'd be a simple step below a PS3. My point was that you'd definitely see differences if the game was made on the most powerful console, and ported down to the weakest. Fighting game or not.
#8
Posted 08 September 2013 - 08:18 AM
Most would probably be down-resed from 1080p to 720p, have lower draw distance, and have minor details removed (water quality, amount of particle effects). There will likely be longer load times, but it shouldn't be too big of an issue.
Now, if the port team is complete rainbow, then it will be a lot worse than the downgrades above, enough to actually affect gameplay.
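The 1080p-to-720p drop mentioned above saves more pixel work than it might sound like. A minimal sketch of the arithmetic (Python; the `pixels` helper is made up here just for illustration):

```python
# Sketch: how much per-frame pixel work a resolution drop saves.
# The resolutions are standard; the percentage is plain arithmetic,
# not a measured performance figure.

def pixels(width, height):
    """Total pixels rendered per frame at a given resolution."""
    return width * height

full_hd = pixels(1920, 1080)   # 1080p: 2,073,600 pixels per frame
hd = pixels(1280, 720)         # 720p:    921,600 pixels per frame

savings = 1 - hd / full_hd
print(f"720p renders {savings:.0%} fewer pixels per frame than 1080p")
# -> 720p renders 56% fewer pixels per frame than 1080p
```

Actual GPU cost doesn't scale perfectly linearly with pixel count, but it gives a sense of why dropping resolution is the first lever a port team reaches for.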
#9
Posted 08 September 2013 - 08:28 AM
For a handheld that has been called a powerhouse by many, you'd think it'd be a simple step below a PS3. My point was that you'd definitely see differences if the game was made on the most powerful console, and ported down to the weakest. Fighting game or not.
You make it sound like the gap between the PS3 and the Vita is comparable to the gap between the PS4 and the Wii U. It isn't: the Vita and PS3 are much further apart in power than the PS4 and Wii U, which barely have a gap at all. Yeah, the PS4 has somewhat better graphics and more RAM, but that won't be fully utilized for six or so years. They only just managed to get the best out of the PS3 this year; with the Wii U being more powerful than that, I'd say we'll get about five years of having pretty much the same game on each console before PS4 games start showing a noticeable difference from the Wii U versions.
- Szayel likes this
Well, I've finally found my Starfox, and I love it.
#10
Posted 08 September 2013 - 09:38 AM
It probably depends on how hard the dev is trying. Look at Skyrim on PS3: it sucks, but if they really tried there would probably be little change, unless it was something really intense, like an engine built specifically for the PS4/Xbox One.
- Rockodoodle likes this
#11
Posted 08 September 2013 - 09:39 AM
Tons of slowdowns, graphics drops, missing extras.
I say all of those because the devs don't even give their full effort when it comes to porting to the Wii U. Also, the code they use isn't at its best yet.
Take GTA V: it's at the end of the PS3/360's life and it looks great compared to GTA IV. Better coding, and you can work the system better.
With that said, I have been seeing a lot of devs being lazy with their ports.
Plus the PS4 and Xbox One will have more specs, so it will be a little easier for them to make stuff.
xile06 "N" ID
Cod bo2, AC3, Zombiu, NSMB, Pikmin 3, Wind Waker HD, Smash Bro wiiu,
Preorders : ..
Plan to buy : -
#12
Posted 08 September 2013 - 10:39 AM
I don't think the PS4/Xbox One versions would be much of an upgrade, if at all, unless the Xbox One/PS4 is excessively easy to program for and the developers either don't have the time or are simply too lazy to put the effort into the Wii U port.
#13
Posted 09 September 2013 - 07:08 AM
I would have to say that Battlefield 4 is showing exactly what kind of downgrades would need to take place to get the game running on the Wii U. Some people will claim it's not a good indication because Battlefield 4 is still being developed with current gen in mind, but think about it this way: if this game uses the new Frostbite engine that will be EA's next-gen engine, and it can't run in 1080p at 60fps even though it's scaled to still work on the PS3/360, that doesn't exactly give me the indication that the PS4 and X1 can increase both scale and graphical fidelity at the same time.
I think a lot of people are under the false assumption that the next-gen consoles are going to increase scale, bump resolution, and bump framerate all at once. This is simply not the case. You're going to see compromises in one or more of these areas whenever a developer tries to push the envelope in any given area. Sure, the scale of a game can go way up, but don't try to couple that with high graphical fidelity, 1080p, and 60fps; it simply won't happen.
If developers want a PS4/X1 game on the Wii U, they can do it. Sure, there will be compromises, but there is plenty of room to do it and still maintain the core experience. The easiest thing to compromise is resolution: if the game runs at 720p or 1080p on the PS4/X1, they could run it at 600p on the Wii U to free up resources. So there is no doubt we are talking about noticeable compromises on the Wii U, but it's not like it can't be done. The question is whether publishers see enough opportunity for profit on the Wii U to even bother. It comes down to dollars, and right now the Wii U doesn't look very lucrative to a lot of publishers.
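The 600p compromise described above is easy to put numbers on. A minimal sketch (Python), assuming a fixed 16:9 aspect ratio so pixel count scales with the square of the vertical resolution; the function name is invented for illustration:

```python
# Sketch: relative pixel-shading cost at a lower render height.
# Assumes a fixed 16:9 aspect ratio, so total pixels scale with height^2.

def relative_fill_cost(height, reference_height):
    """Fraction of per-frame pixel work vs. the reference resolution."""
    return (height / reference_height) ** 2

print(f"600p vs 1080p: {relative_fill_cost(600, 1080):.0%} of the pixels")
print(f"600p vs 720p:  {relative_fill_cost(600, 720):.0%} of the pixels")
# -> 600p vs 1080p: 31% of the pixels
# -> 600p vs 720p:  69% of the pixels
```

So a 600p target leaves roughly two-thirds of the 1080p pixel budget free for other work, which is the kind of headroom the post is talking about.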
#14
Posted 09 September 2013 - 05:13 PM
I will not die until I achieve something. Even though the ordeal is high, I never give in. Therefore, I die with no regrets~Ikaruga Prologue
http://fc05.devianta...ask-d5k49sd.jpg
#15
Posted 09 September 2013 - 06:59 PM
Let's take Battlefield for example: you have a giant map filled with buildings, grass, detail, etc. The only real thing taken away would be the amount of grass on the field (the ground clutter), plus toning the detail down a tiny, tiny bit. Overall, it wouldn't affect your personal gameplay experience unless you're a graphics whore. The game would run just as well as on the other consoles, just with slight alterations.
But for less detailed games like fighting games, there wouldn't be any downgrade at all, since there isn't really much there to begin with.
Seems about right, and most graphics-junkies usually go for ludicrous $900+ PCs and brag about them on /v/ anyways. But many fighting games do have quite detailed graphics from what I've seen.
#16
Posted 10 September 2013 - 09:30 AM
Why would games need to be downgraded to be on the Wii U?
- Robotic Sunshine Commander likes this
#17
Posted 10 September 2013 - 09:36 AM
Why would games need to be downgraded to be on the Wii U?
Less RAM, fewer CPU cores, weaker GPU.
I will not die until I achieve something. Even though the ordeal is high, I never give in. Therefore, I die with no regrets~Ikaruga Prologue
http://fc05.devianta...ask-d5k49sd.jpg
#18
Posted 11 September 2013 - 06:25 AM
Less RAM, fewer CPU cores, weaker GPU.
Just the fact they hid that the CPU is a beefed-up Wii processor... which was a beefed-up GameCube processor... is a problem.
Having to wait for the public to reverse-engineer the thing shows how embarrassed they were about their product.
Edited by Dharmanator, 11 September 2013 - 06:25 AM.
#19
Posted 11 September 2013 - 07:13 AM
Just the fact they hid that the CPU is a beefed-up Wii processor... which was a beefed-up GameCube processor... is a problem. Having to wait for the public to reverse-engineer the thing shows how embarrassed they were about their product.
No, Nintendo stopped releasing specs once they found out the public was too stupid to understand them. There are still morons convinced the PS2 is more powerful than the GameCube.
Using your logic, you might as well call i7s "beefed-up Pentiums". There are nowhere NEAR as many "new" CPU designs as you think there are. They are nearly ALL old designs "beefed up". In fact, older designs are revisited all the time to see if new technology can bring them back better than newer ones.
The G3 was a damn good processor core that was shelved because the technology of the time couldn't maintain cache coherency on such a short-piped architecture (no multicore) and couldn't increase clock speed, again because it was such a short-piped architecture. The G4 was a G3 with an AltiVec SIMD unit bolted on, and the G5 was a red-hot disaster.
Fabrication technology has shrunk enough to allow clock speed increases, and eDRAM technology improvements have made practical cache coherency for multicore possible.
The instructions per clock on this design are still damn impressive to this day, sitting just behind the Jaguars used in the Xbone/PS4: around 2.4 vs 2.7. And that's just using stock 750CL IPC. We have no idea what kind of IPC increases the MASSIVE core caches in Espresso have brought, but we do know the smaller cache increases from Bobcat to Jaguar improved IPC by 15%.
So yeah, it's sitting pretty in IPC even next to the PS4/Xbone.
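For rough intuition, per-core throughput is approximately IPC times clock speed. A sketch using the post's IPC figures and the commonly reported clocks (~1.24 GHz for Espresso, ~1.6 GHz for the PS4's Jaguar; neither clock is an official Nintendo or Sony number, so treat this as back-of-the-envelope only):

```python
# Sketch: per-core throughput = IPC x clock.
# IPC figures (2.4 vs 2.7) come from the post above; the clock speeds
# are widely reported community figures, not official specs.

def throughput_gips(ipc, clock_ghz):
    """Rough instructions per second for one core, in billions (GIPS)."""
    return ipc * clock_ghz

espresso = throughput_gips(2.4, 1.24)  # Wii U Espresso core
jaguar = throughput_gips(2.7, 1.6)     # PS4 Jaguar core

print(f"Espresso: ~{espresso:.2f} GIPS/core, Jaguar: ~{jaguar:.2f} GIPS/core")
print(f"Jaguar core ~{jaguar / espresso:.2f}x an Espresso core on this metric")
# -> Jaguar core ~1.45x an Espresso core on this metric
```

This ignores SIMD, core count, and memory bandwidth entirely, which is exactly why the comparison flatters Espresso less in real workloads.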
Its biggest weakness is its complete lack of modern SIMD, which is pretty huge today.
Yet the little McGuffin doesn't seem to care. It's chewing through SIMD resource hogs and surprising devs who actually try, left and right. This isn't just surprising; it shouldn't be possible.
RAD Game Tools just stated they were completely taken by surprise when they found Nintendo's custom 750 design can run, in full HD, the SIMD-or-die video codec Bink 2. They were expecting to have to use the GPU to make up for the lack of SIMD. But no, the little non-SIMD CPU just eats through it. Hell, Broadway could do it too! (At 640×480, of course.)
WTF? HTF?
SIMD is like a factory line: it works by having many little lanes doing parts of the same job at the same time to speed the job up. You can see why it would be considered so important.
Nintendo's custom PPC750 doesn't have the factory... but it's still doing all the work in time.
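The factory-line analogy above can be sketched in code: a 4-wide SIMD unit performs four of the same operation per step, so it finishes the same array in a quarter of the steps. This is only a toy simulation of the idea in plain Python (real SIMD is a hardware feature, and the 4-wide width is just an example):

```python
import math

def scalar_add(a, b):
    """One addition at a time, like a plain non-SIMD execution unit."""
    return [x + y for x, y in zip(a, b)]

def simd4_add(a, b):
    """Four additions per step: walk the arrays in 4-wide chunks,
    the way a 4-lane SIMD unit would."""
    out = []
    for i in range(0, len(a), 4):
        out.extend(x + y for x, y in zip(a[i:i + 4], b[i:i + 4]))
    return out

a = list(range(8))
b = list(range(8))
assert scalar_add(a, b) == simd4_add(a, b)  # identical results

steps_scalar = len(a)               # 8 one-at-a-time additions
steps_simd = math.ceil(len(a) / 4)  # 2 four-wide additions
print(f"scalar: {steps_scalar} steps, 4-wide SIMD: {steps_simd} steps")
```

Same answer either way; SIMD just gets there in fewer steps, which is why a CPU without it keeping pace on a codec like Bink 2 is so surprising.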
http://www.develop-o...nterview-Bink-2
Bink 2 simd, or go kill yourself.
Bink video change log.
http://www.radgameto...com/bnkhist.htm
Added Wii-U support for Bink 2 - play 30 Hz
1080p or 60 Hz 720p video! We didn't think
this would be possible - the little non-SIMD CPU
that could!
The notion of a non-SIMD CPU muscling through this is ridiculous... no wonder they were surprised. I still have trouble believing it. I could have sworn it would stay on Bink 1 because of its weak SIMD, or need GPGPU assistance. I always wondered why Nintendo didn't add SIMD to the design; I thought GPGPU was meant to pick up the slack, but there doesn't seem to be nearly as much slack as I thought.
It's just plain a great CPU core, and it's now possible to build on it again. Nintendo was actually pretty smart here. You can easily attach SIMD units, add more cores, and, after a die shrink, increase clock speeds to keep it going competently into the future. I don't think this gen is the last we've seen of Nintendo's custom G3.
And I don't think that's a bad thing. Little bastards earned it.
- Happy Monk, Hank Hill, Arkhandar and 5 others like this
#20
Posted 11 September 2013 - 07:48 AM
No, Nintendo stopped releasing specs once they found out the public was too stupid to understand them. [...] It's just plain a great CPU core. [...] And I don't think that's a bad thing. Little bastards earned it.
It's just so sad people still believe this is just three Wii cores taped together. The ignorance of the internet gets worse by the day, and gaming journalists are even more responsible for spreading the misinformation.