A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and PlayStation 4 variants of the game at 1600×900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded with theories about why that would be the case: was Ubisoft paid off by Microsoft?
For those of us who focus more on the world of PC gaming, however, an email sent the following week to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, in addition to addressing other issues such as the value of pixel counts and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the information:
The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. …With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.
What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.
We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts.
So, if we take this anonymous developer's information as true (and this whole story is based on that assumption), then we have learned some interesting things.
- The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920×1080 resolution with Assassin's Creed Unity.
- The Xbox One (after giving developers access to more compute cycles previously reserved for Kinect) is within 1-2 FPS of the PS4.
- The Ubisoft team sees Unity as "crazily optimized" for the architecture and consoles, even as we only now approach the one-year anniversary of their release.
- Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the remaining 50% that is left to power the AI and everything else.
It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots the needs of game developers who want to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have already reached its performance limits, that's a bad sign for game developers who really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom-built cores or a Cell architecture; we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that the more advanced development teams have already hit the hardware's performance ceiling.
If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:
| | PlayStation 4 | Xbox One |
|---|---|---|
| Processor | 8-core Jaguar APU | 8-core Jaguar APU |
| Motherboard | Custom | Custom |
| Memory | 8GB GDDR5 | 8GB DDR3 |
| Graphics (APU) | 1152 stream processors | 768 stream processors |
| Peak Compute | 1,840 GFLOPS | 1,310 GFLOPS |
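Those peak compute figures fall out of simple arithmetic: stream processors × clock speed × 2 FLOPS per cycle (one fused multiply-add per ALU per clock). A minimal sketch, assuming the commonly reported GPU clocks of 800 MHz for the PS4 and 853 MHz for the Xbox One:

```python
# Peak single-precision compute = stream processors x clock (GHz) x 2 FLOPS/cycle
# (each ALU can retire one fused multiply-add, i.e. two FLOPS, per clock).
# The GPU clocks used here are the commonly reported figures, not official specs.
def peak_gflops(stream_processors: int, clock_ghz: float) -> float:
    return stream_processors * clock_ghz * 2.0

print(f"PS4:      {peak_gflops(1152, 0.800):,.0f} GFLOPS")  # ~1,843
print(f"Xbox One: {peak_gflops(768, 0.853):,.0f} GFLOPS")   # ~1,310
```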
The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 1152 or 768 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance of Jaguar trails Intel's Silvermont/Bay Trail designs by as much as 25%, and Bay Trail is powering lots of super-low-cost tablets today, including the $179 ECS LIVA palm-sized mini-PC we reviewed this week. The 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.
If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).
Also note that if the developer is using 50% of the CPU resources for rendering computation, and the remaining 50% isn't able to keep up with its duties on AI and everything else, then we have likely hit performance walls on the x86 cores as well.
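To put that 50/50 split in perspective, here is a back-of-the-envelope frame-budget sketch. The 50% renderer share comes from the developer's comments above; the rest is simply arithmetic on a 30 FPS target:

```python
# Back-of-the-envelope CPU frame budget at a 30 FPS target.
# The 50% renderer share comes from the developer's quote; everything else
# about the split is an illustrative assumption, not a profiled figure.
TARGET_FPS = 30
frame_budget_ms = 1000 / TARGET_FPS           # ~33.3 ms of CPU time per frame

renderer_share = 0.50                         # per the Ubisoft developer
renderer_ms = frame_budget_ms * renderer_share
remaining_ms = frame_budget_ms - renderer_ms  # AI, physics, animation, audio...

print(f"Frame budget:      {frame_budget_ms:.1f} ms")
print(f"Render submission: {renderer_ms:.1f} ms")
print(f"Everything else:   {remaining_ms:.1f} ms")
# If "everything else" overruns its ~16.7 ms, the frame slips past 33.3 ms
# and the game drops below 30 FPS -- exactly the wall described above.
```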
Even if this developer's quote is 100% correct, that doesn't mean the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on the performance efficiency of current-generation hardware, will be coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next, which is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.
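For a sense of why API efficiency matters, here is a minimal sketch of how per-draw-call CPU overhead caps the number of draw calls that fit in a frame. The microsecond costs are invented round numbers for illustration, not measurements of DirectX 11 or DirectX 12:

```python
# Illustrative only: how per-draw CPU cost limits draw calls per frame.
# The microsecond costs below are assumed round numbers, not measurements
# of any real API.
cpu_budget_ms = 16.7  # the render-submission slice of a 33.3 ms frame, as above

for api, cost_us_per_draw in [("thick driver/API", 40.0), ("thin API", 10.0)]:
    max_draws = (cpu_budget_ms * 1000) / cost_us_per_draw
    print(f"{api}: ~{max_draws:,.0f} draw calls per frame")
# Cutting per-draw overhead by 4x quadruples the draw calls that fit in the
# same CPU budget -- the kind of headroom DX12-style APIs are promising.
```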
But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is an enormous discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, shipping a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same architecture as the PC.
Of course, we might just be treading familiar waters here. I know we are a bit biased, and so is our reader base, but I am curious: do you think Microsoft and Sony have put themselves in a hole with their shortsighted hardware selections?
UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case: regardless of whose hardware is inside the consoles, had Microsoft and Sony targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher-performance hardware, selling the consoles at a loss out of the gate and properly preparing each platform for the next 7-10 years. And again, the console manufacturers could have done that with higher-end AMD hardware, Intel hardware, or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.
Isn't this what ran through your minds the second their hardware configurations were confirmed?
I basically fell over laughing.
Everyone kept assuming it'd be an AMD APU.
Never did I assume it wouldn't even be the flagship chip of that lineup.
Instantly knew that was one of the last nails in that coffin.
Now you can just play Sony games on their new TVs… (The last nail struck in denial?)
It would make more sense to buy a Sony TV and just stream the new AC game (assuming they'd ALLOW you to, but you know… still trying to whore out these sub-par console systems), and it would likely be rendered on an epic high-end GPU farm, actually giving you back 60 FPS.
HAAAAAAAAA!!!
Their game streaming service can provide a better gaming experience (good connection assumed) than their dedicated console.
Sad.
Let me get this straight: the Xbox is maxed out because… what? Halo, the GAME that made Xbox something even with all the issues the original had, alone saved Xbox so many times it's insane, and suddenly devs are feeling disgruntled at the Xbox One, so the Xbox One is maxed? lol. Quad-channel DDR3 doesn't grow on trees, yet MS bothered putting it in; even Intel doesn't have many servers with it, it's meant for insane numbers, and it's so new most don't know what to do with it, so it's deemed useless. MS even went to the trouble of implementing a form of Azure Donnybrook. And devs think the console is maxed? lol! The Xbox is made from the ground up for the cloud; it isn't there to calculate, Azure and Donnybrook in the cloud are there for that. Think of the Xbox as what an Xperia Z3 is to the PS4 and you get the idea.
With a difference: Azure doesn't send a stream, it sends the pre-chewed numbers, likely compressed and then decompressed on the Xbox. Sadly, MS has probably not shown this yet, since the savior is meant to be Halo. TILL HALO IS OUT, NOTHING IS AS IT SEEMS.
Put another way: if MS's idea didn't have so many features, they could hook directly to your TV via the internet. Yep, the Xbox is just a fetch-and-mix; if you render from the Xbox, you aren't using the Xbox the way MS is clouding everything. And no, MS isn't streaming the game to the Xbox, lol, just the calculated data.
They are using APUs. Sony did have the initial lead out of the gate by having GDDR5 and slightly better specs, and by not messing about with junk like MS, but in the end they are using low, low cost solutions.
High settings at 50 FPS and 1080p for DX11 games are not really believable given the loadout, even with offloading a lot of the needed power onto server workloads and such.
And on consoles you cannot give someone the choice to set graphics settings.
I am actually expecting this generation of consoles to be the first to get a hardware upgrade, so wait another year or two and see what happens.
On the other hand, at least they did not rehire all those DX9 programmers and go back to coding in DX9 again…
It's so laughable to see Ryan's update to the article, where in a single sentence he claims the console makers could have "properly" set up their systems for a long life cycle. Because I'm sure Ryan and the crew know so much better than Sony how to design, build, market, sell, and support a console over years, within a proper budget.
The arrogance is amazing…
PS: the claim in the article that the PS4's GPU is weaker than even a 7790 is a flat-out lie too, or just complete ignorance of being able to use Google to look up specs.
@previous poster! No, reviewers are not exactly wrong, per se, with the info we have today. That being said, the issue devs have (though they are all likely never gonna say it) is this: they do not want to rely on MS's solution to make their game, or they were already too far into development to adopt MS's way with the Xbox One. But I will say this again: until customers have experienced the Halo meant for the Xbox One, nothing is as it seems. MS isn't gonna talk since Halo isn't ready, and no matter how much everybody trolls them, remember this: HALO IS NOT OUT! Nobody knows how the Xbox will perform, certainly not the devs. And if they do? Then they chose to ignore MS (as usual, with the results people are experiencing now).
Oh, one more thing, guys: tone it down a bit. We are beginning to sound like the rabid fanboys we abhor!
Ryan and the guys just gave their opinion; like it or not, it is based on hardware knowledge. Sure, there is still the wiggle room of coding, but Sony is having issues with Driveclub: it is beginning to look like the social experience it expected cannot be met with the hardware.
I think you will still get good games, but when the same IP hits the PC we will get better settings, and either the new consoles will get an early hardware refresh or a new model will come out sooner.
People are not willing to wait around for another 10 years for upgrades…
Is it possible (for the Xbox One) that all devs (including the Halo team) misunderstood the Xbox team? Is it possible that the Xbox is like how we work? For example, most people in companies have low-capability screens; the box is almost brainless, and everything is sent from the corporate server. Is it possible the Xbox One is the same thing (just a more powerful one)? If that's the case, all devs would have coded it wrongly from the start, right? I am sure that if Ubisoft jumped the gun, they aren't likely to trash the start; they'll adapt and do it properly on the next project. If I am right, wouldn't the devs pull every handbrake and yell at MS, given it is a very different way of coding than what devs are used to, be it on PC or PS4 (Sony couldn't adopt this since they don't have enough bandwidth on the cloud side)? Is it possible the Xbox is just a 2013 version of the brainless terminals every corp had in the '80s and '90s? If I am right, wouldn't this be overkill? In that manner, UltraHD is not only thinkable, it would actually be possible with Halo 5. Is it even possible to pre-chew everything, including the most basic rendering? I mean, if all coordinates are pre-rendered, the only thing left for the GPU is to put it all together.
People are pretty mental about Mantle.
MS has a tendency to put big delays in their OS (typically 200 to 400 ms). If they did the same here, with 3 OSes, all virtual by the look of it, we're probably talking seconds of delay, yet the refresh rate is still aimed at 16.7 ms per frame. So: 1000 ms of delay in a place where the user needs 16.7 ms of delay max, and you get the idea. I doubt MS's engineers, masters or doctors, even noticed this. If it affects my computer, which it does in WoW, I don't see how it wouldn't affect a 3-OS virtual setup (Hyper-V and/or Azure, etc.). So yeah, it's fixable, since it's a timing issue. As usual, MS keeps having timing issues: at times it's the universal clock, at other times it's HPET when MS should use the invariant TSC, and now it's the Xbox One. Sadly this isn't a core MS team, it's the Xbox team, so I doubt the Xbox team can fix this on their own. Hopefully Satya Nadella will notice and send doctors to help his engineers on the Xbox team.
PS: also, data execution prevention is still a nightmare for the majority of devs, be it on PC or console (yep, old-school diehards), and it's easy to understand why so many gamers have issues. Imagine: you disabled UPnP on your NAT router, enabled full DEP, etc., etc., pretty much locked down security, and now the gamer expects the dev to know about, or have taken notice of, all of these security settings. lol!
TC is a hopeless idiot, comparing SoC architectures to an open platform that wasn't designed for game production. For a programming-illiterate airhead who only learned how to put PC parts together: perhaps you don't understand that programming efficiency, a thin API layer, and a customized API platform are the true computing power developers want; how much raw power one has on a high-end GPU doesn't concern developers one bit.
Which PC indie exclusive from the last 10 years can even hold up to the 360/PS3 standard?
Let's pretend physically based rendering, superior human skin shaders, and 10x the key-frames in animation data don't matter; sure, you can have Counter-Strike-level graphics on all platforms at 4K/60fps.
Even Trine 2 on PS4 is 4K-ready. This ???p/???fps idiocy is entirely invented by the PC virgin community, believed by PC virgins, and spread across the internet by PC virgins. Most developers don't have the time and patience to address this full-bloom stupidity; when they do care to tell the truth in a couple of short statements, you morons manage to twist their words and rewrite them into your own theory. You are the reason this world needs birth control.
The whole world is laughing at Ryan. This is what he said:
“If you were to ask me today what kind of performance would be required from AMD’s current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).”
AC: Unity's benchmark:
http://www.overclock.net/content/type/61/id/2247446/width/500/height/1000/flags/LL
DA:I’s benchmark.
http://cdn.overclock.net/c/c5/900x900px-LL-c5f143d5_Benchmarks_Dragon_Age_1080p_-_Mantle_-_fixed-pcgh.png
Not only do both games completely put these cards out of the picture for above 30 FPS at respectable settings, but 95% of PC virgins around the world won't even run ACU on the lowest setting.
next 5 years?? lol!!
I don't know about the PS4 (hopefully Sony did not just copy and paste the way MS programs). For the Xbox One, though, I can tell you the Xbox team's issue is MS's security team, and probably the left hand not speaking to the right hand. Google this: the MSI-X patent; it's there, look for the drawbacks section. MSI-X's drawback: latency! What is the number one enemy of gamers? Latency! On user hardware MSI-X is useless. IRQs (and the fact that PCI allows sharing of IRQs) are OK for mobile, tablet, and PC (yep, even if you stream on Twitch). You don't see anybody mentioning this very important info; the only reason it's in the patent is likely because it's required to be. Another self-created issue on MS's side is DMA and DCA deactivation (I saw it on Windows 8 and up, but I suspect MS imposed this on the Azure and Xbox One teams). Without DMA the XB1 just cannot do its job; the XB1 was optimized by its team with DMA/DCA in mind. So Halo 5? It's not released because of these changes; the Xbox team is likely searching for a non-existent alternative. And now you begin to understand: the Xbox team got wrecked by MS's security team and MSI-X. MSI-X is great everywhere but one place: GAMING.
Lastly! Be it on console or PC, performance is not where it should be. Why? MSI adds latency, tons of it, compared to IRQs. On top of this, some Guru3D users tweak the OS (via BIOS and bcdedit) to only run on the invariant TSC; OK, but how are you gonna get MSI-X? MSI-X works on the crappy, slow LAPIC (Intel decided that). Yep, it's a chicken-and-egg issue. Google/Android is likely to find a fix for this before Microsoft or Sony's PS4 does.