A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying the team had decided to run both the Xbox One and the PlayStation 4 versions of the game at 1600×900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?
For those of us that focus more on the world of PC gaming, however, an email sent the following week to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, in addition to addressing other issues such as the value of pixel count and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of that email:
The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. …With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.
What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.
We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts.
So, if we take this anonymous developer's information as true (and this whole story is based on that assumption), then we have learned some interesting things.
- The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920×1080 resolution with Assassin's Creed Unity.
- The Xbox One (after giving developers access to more compute cycles previously reserved for Kinect) is within 1-2 FPS of the PS4.
- The Ubisoft team sees Unity as being "crazily optimized" for the architecture and the consoles, even as we only now approach the one-year anniversary of their release.
- Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is being limited by the 50% of CPU performance that remains to power the AI, etc. (a rough frame-budget sketch follows below).
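To put that last bullet in perspective, here is a minimal frame-budget sketch in Python. It assumes the developer's "about 50% of the CPU" figure applies evenly to every frame at a 30 FPS target, which is a simplification of what was actually described:

```python
# Rough CPU frame-time budget at a 30 FPS target.
# Assumption: the developer's "about 50% of the CPU" figure applies
# uniformly to every frame, which is a simplification.
TARGET_FPS = 30
frame_budget_ms = 1000.0 / TARGET_FPS            # ~33.3 ms per frame

rendering_share = 0.50                           # pre-packaged rendering work
render_ms = frame_budget_ms * rendering_share    # ~16.7 ms helping the GPU
everything_else_ms = frame_budget_ms - render_ms # ~16.7 ms for AI, crowds, etc.

print(f"Frame budget:           {frame_budget_ms:.1f} ms")
print(f"CPU helping the GPU:    {render_ms:.1f} ms")
print(f"Left for AI and crowds: {everything_else_ms:.1f} ms")
```

In other words, if that figure is accurate, the AI and gameplay systems are left with roughly 16-17 ms of CPU time per frame on cores that already trail budget tablet chips.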
It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots what game developers need to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have already reached its performance limits, that's a bad sign for game developers who really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't built on custom cores or a Cell architecture: we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that the more advanced development teams have already hit the hardware's performance ceiling.
If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:
| | PlayStation 4 | Xbox One |
|---|---|---|
| Processor | 8-core Jaguar APU | 8-core Jaguar APU |
| Motherboard | Custom | Custom |
| Memory | 8GB GDDR5 | 8GB DDR3 |
| Graphics Card | 1152 Stream Unit APU | 768 Stream Unit APU |
| Peak Compute | 1,840 GFLOPS | 1,310 GFLOPS |
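Those peak-compute figures fall straight out of the shader counts and the widely reported GPU clocks (roughly 800 MHz for the PS4 and 853 MHz for the Xbox One). A quick sanity check, assuming the usual two FLOPs per stream processor per clock (one fused multiply-add):

```python
# Peak single-precision compute = stream processors x clock (GHz) x 2 FLOPs/clock
# (one fused multiply-add per ALU per cycle). Clocks are the widely reported
# figures for each console, not official spec-sheet numbers.
def peak_gflops(stream_processors: int, clock_ghz: float) -> float:
    return stream_processors * clock_ghz * 2.0

print(f"PlayStation 4: {peak_gflops(1152, 0.800):,.0f} GFLOPS")  # ~1,843
print(f"Xbox One:      {peak_gflops(768, 0.853):,.0f} GFLOPS")   # ~1,310
```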
The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance of Jaguar trails Intel's Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super-low-cost tablets today and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. The 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.
If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).
Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.
Even if this developer quote is 100% correct, that doesn't mean the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on the performance efficiency of current-generation hardware, will be coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next, which is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.
But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 share the same architecture as the PC now.
Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?
UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case: regardless of which vendor's hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher-performance hardware, selling the consoles at a loss out of the gate and preparing each platform properly for the next 7-10 years. And again, the console manufacturers could have done that with higher-end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.
Consoles are hitting a wall and PC gamers are paying the price.
Every point from every angle has already been made, EXCEPT the simple fact that the greatest gaming experience of all time, OF ALL TIME was the Atari 2600 combined with lots of drugs. No matter how impressive the systems are, no matter how hard the developers work, no one will ever again hit the excitement and REALISM of mid 80’s tripping balls Galaxian. And that is simply a fact!
🙂
Maybe they should have used Intel instead of a lackluster AMD CPU.
Maybe devs should get their thumbs out of their asses and make/use more optimized engines.
Maybe the Xbone should get devs to use the Mantle API instead of that bottleneck DX11.
Maybe people should stop buying these shit games; then devs would stop pushing more shit.
The solution is that Microsoft needs to stop moneyhatting developers and publishers in order to make their console special or to make the Xbox One version equal in performance to the PS4. They should rather take that money and put it into their first-party studios and develop better first-party games.
But will they, even with Phil Spencer at the helm? No. Why? Because Spencer was in charge of their first-party developers for years. He doesn’t care as much as he professes to AND he was the last person who could have stopped Microsoft from turning Rare into a Kinect-centered studio.
Also, the CPU is fine and the Xbox won’t ever use Mantle. What’s the point, it has a DX12-like API already!
Damn, Tim Sweeney was right when he said that if game consoles cannot hit 2.5 or 3 Teraflops they would not be truly NEXT GEN.
However, for this generation the CPU is the biggest bottleneck. Too bad Intel doesn't license its Core architecture.
Wouldn't it be better to spend the money and just build a PC for the same price as these two consoles? I don't know what the debate here is; they both run on basically PC hardware! And no one brought up the fact that the Xbox One with Kinect hooked up uses half its CPU and GPU cycles when in use! Besides, you can do more on a PC! I don't know anyone in my circle of nerds saying "oh, got to boot up my console to check my email!"
WHERES THE STEAM BOXES THEY DO SAME THING DONT THEY LOL
http://www.maximumpc.com/no_bs_podcast_226_-depth_interview_amd_graphics_guru_richard_huddy
no one watched this ???
I think it's way too early to declare the consoles tapped out. Even if we take Ubisoft at their word that the game is super well optimized by known standards, that's only a barrier if you accept that contemporary optimization techniques are as good as they'll ever be. To their knowledge maybe it's as optimized as possible, but that assumes there's nothing left to be discovered. If anything, being forced up against a wall like this will only accelerate the pace of research into squeezing every last drop out of it using novel techniques.
For instance, think about how much performance AA had to burn when all they had was MSAA, and then along came FXAA and cut that burden *dramatically*. Maybe it's not every bit as good as MSAA, but you get a lot more relative quality vs. performance hit with FXAA. I don't believe every last trick has already been discovered, and it's not like they're writing hand-coded assembly for games anymore either. There's still lots of room left to grow and avenues to explore. And maybe, just maybe, some of that will filter down to PC.
This is excellent news.
Devs have to learn how to code now. There is no escape. PC gamers will benefit the most from this in the end.
The CPU takes up too much of the budget today when building a computer meant only for gaming. It should not be like that.
Before, they had to tweak for PowerPC hardware and stuff that could not translate to the PC.
Now they have to tweak x86-based stuff to the max, which can translate directly to the PC. Things could not be better 🙂
I am a bit more pessimistic on this. I feel they are now only going to shoot for the "max" they have hit on consoles, and regardless of the common PC architecture, PC games are not going to get much more than the console max they are shooting for.
Another developer from Ubisoft making excuses for them? Please, Ubisoft, don’t insult my intelligence. First you say that females are hard to animate, then you say that English voice actors with English accents are more appropriate for the time period, then they go on to say that when it comes to Unity, there’s only a 1-2fps difference between consoles when they’ve had the time to work on ironing out the 50% CPU workload issue. They should be using GPGPU for this, not dragging out the rendering process on a weaker processor.
As a reminder, Unity is made by a team of SEVEN Ubisoft studios pooling resources together to make what is a soulless four-person co-op Assassin's Creed game that is capped, hard-limited, on the PC to 30fps no matter what hardware you're running. This is the same developer/publisher that brought The Crew on PC down to 30fps because the physics is tied to the rendering engine (see: Skyrim, NFS Rivals). These are the same people who said 1080p 60fps on the PS4 with Black Flag couldn't be done due to "optimisation considerations" (read: Xbox One parity) and later released a patch to allow the PS4 version to run at 1080p 30fps for the main storyline and 60fps for the multiplayer once the launch was over and Microsoft got what they wanted.
I haven’t trusted anything Ubisoft claims for a while and neither should you. It is in their best interests to keep Microsoft happy (can’t market the games the same way if they’re not going to play the same, can they?) and they will lie through their teeth to make sure that their game sells well.
Lol! For all the MS Xbox naysayers, tell them this: go view a bit of MS R&D content. I suspect the issue isn't that it can't be done, but that most devs probably hate having to pre-chew everything in Azure (or whatever MS came up with) and then send the pre-chewed data to the Xbox. Yep, this means devs would likely need a license to do this. I suspect not one dev wants to do it that way, so they make do. But the Xbox is optimized to have everything done on the server and then just send the final data to the console. Done this way it would take very little resource, but either devs don't want to do it that way or MS hasn't released this yet (Halo).
So many comments for something so small. People really get passionate about brand names.
FUD. FUD EVERYWHERE. STOP LISTENING. STOP BELIEVING IN THIS UBICRAP GARBAGE. JUST. STOP. IT’S ALL UTTER LIES AND SHEER AUTISM.
Another idiotic article from a low-IQ PC virgin with zero hours of training in data mining and QA, pretending he understands how complex algorithms work on different types of hardware architectures and API environments.
It's like saying a high-end sports car can carry more cargo than a freight truck because his only reference is their engine horsepower. Try getting involved in even the simplest development of a 3D application, especially in a Windows/DirectX environment; see how many layers there are between the hardware and your middleware and how hard it is to get things working at an acceptable frame rate and resolution. Of course, that's unless you are already somewhat knowledgeable in C++/Objective-C, etc.
All these brain-dead PC virgins make it sound like they can replace PhD coding veterans any time, any minute, calling developers lazy because they have no time or patience to address this "blah, a console-specced PC can handle it" idiocy, which was invented on the internet by PC-only gamers. Hilarious.
Most indie games released recently can be handled at 1080p/1440p/4K and beyond on any mid-to-high-end rig today, and only a few of them even hold up to the 10-year-old 360/PS3 standard.
Video game production needs to find a sweet spot between resolution and fps so they can put more detail into their games, not because of this "wall." If anything, what stops you from gaming Counter-Strike at 8K/120fps instead of BF4?
I think you need to take English classes before you start abusing people. Perhaps then they might understand your point.
I think you need to take an IQ test before I can take you seriously.
I have, it was 145.
“I have, it was 45”
I fixed your typo for you.
Next.
Honestly guys we hear this story every new generation of consoles. There are two things to say:
1 – Every mainstream console that has been released in the past 20+ years or so has seen a significant improvement for the first 5 years of its life.
2 – Although graphics still have a long way to go to achieve 'real life', they are certainly good enough for entertainment, as it always boils down to a great game.
Oh and that Uncharted 4 treat at E3. Wow!
I consider a SNES less dated and more modernly relevant than a PS4.
You can’t really recreate that SNES raw experience through emulation, it’s truly beautiful, artistic and unique. Music quality unmatched in modern games as well.
What do you get with a PS4?
The same thing you do on PC, just shittier.
Just, crossed my mind as I read your comment.
I think all the platforms have their awesome factors. Really great video games are such a rare thing that one must embrace many platforms to feed that ongoing lust for excellent games.
Consoles have certain exclusives, PCs do too, and once in a while they get a technically advanced game that goes down in the history books. Portable machines are having a hard time of it except for the 3DS.
All in all I buy these technologies (fortunate to have Xone, PS4, PS3, PS Vita, 3DS, PC, iDevices and Wii U) to play the games I want to.
I love games and the technology that powers them. Here's hoping that Hollywood talent comes to gaming (in mass numbers) sooner rather than later, but I'm not holding my breath.
I remember being laughed at when I predicted this console generation would be heavily CPU bound. Going forward, this WILL affect PC gaming, as devs do not like doing PC-exclusive features; consoles have become their cash cows. I predict several upcoming games are going to show signs of being scaled back late in development and end up a lot less than what was promised.
One game I expect this to happen to is The Witcher 3. CDPR came out VERY early complaining about the lack of power in consoles, and there have been a few rumors that the project is running into problems. I hope I'm wrong; I love the series. But the signs point to this being a game primed for underachieving.
In any case, when you look at the decline of independent studios, the growing reliance on mega-games to generate revenue, and what appears to be a VERY weak console generation, we’re primed for a major 1983 style contraction of the industry.
Given the reported minimum specs for ACU on the PC, I place the blame solely on Ubisoft being lousy programmers.
There is a huge performance divide between the XBOne and PS4 with regard to the GPUs used and the memory they are tied to. I'm surprised you continue to push the narrative that the PS4 is "only a little more powerful than the XBOne" when you've had first-hand interaction with the GPUs they are based on.
The PS4’s GPU itself is closest to the Radeon 7850/R7-265. The PS4 version has 128 more shader cores than the 7850/265, but is also clocked slower. What helps keep the chip strong is the fact that it uses the memory native to the Radeon, GDDR5. The CPU may be underpowered in the machine, but the GPU in the PS4 is up to the task.
The XBOne GPU, on the surface, is a dead ringer for the Radeon 7790/R7-260X: same number of shader cores, clocked a little slower, but the GPU itself is still basically a Radeon 7790/R7-260X. HOWEVER, the memory used is DDR3, as opposed to the GPU's native GDDR5. The use of DDR3 is what collapses the performance of the GPU to Radeon 7750/R7-250 levels.
Basically, being a computer hardware site, just answer this. Would Radeon 7790/R7-260X using DDR3 memory as opposed to GDDR5 memory perform close to a standard Radeon 7850/R7-265?
Truth is, it wouldn't even be close, probably 50% of the performance. I implore you and all other hardware/gaming sites to stop pushing the false narrative that the PS4 is a little more powerful than the XBOne, when MATH and LOGIC say the polar opposite. The performance difference between the XBOne and PS4 is about the same as the difference between the Wii U and the XBOne.
It is also openly known that MS and Ubisoft have a co-marketing campaign with AssCreed Unity. They really had to limit the PS4 version to 900p, or look like assholes to MS.
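For what it's worth, the memory-bandwidth gap this comment is describing is easy to rough out. A back-of-envelope sketch using the commonly cited figures (256-bit DDR3-2133 in the Xbox One, 256-bit GDDR5 at 5.5 GT/s in the PS4), deliberately ignoring the Xbox One's 32 MB of ESRAM, which complicates any simple comparison:

```python
# Main-memory bandwidth = effective transfer rate (MT/s) x bus width (bits) / 8,
# converted to GB/s. Figures are the commonly cited ones for each console;
# the Xbox One's 32 MB ESRAM is intentionally left out of this rough comparison.
def bandwidth_gb_s(transfer_mts: float, bus_width_bits: int) -> float:
    return transfer_mts * bus_width_bits / 8 / 1000.0

print(f"Xbox One (DDR3-2133, 256-bit):           {bandwidth_gb_s(2133, 256):.1f} GB/s")  # ~68
print(f"PlayStation 4 (GDDR5 5.5 GT/s, 256-bit): {bandwidth_gb_s(5500, 256):.1f} GB/s")  # ~176
```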
Need to switch from x86 to VISC CPU, this would help drastically with CPU limitations.
How would that even help? You'll still have lousy CPU utilization: 99% on 1 core, 70% on 2, 40% on 4 cores.
Both the Xbox and PS4 have 8 cores.
You'll just be moving lousy programming from one CPU arch to the next.