I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing else in the enthusiast market is happening, and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3d.net that put some of the information in perspective.
Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and implement them into games quickly and easily. The problem, of course, is that GameWorks is written and developed by NVIDIA, which means that optimizations for AMD Radeon hardware are difficult or impossible, depending on whom you believe. It clearly doesn't benefit NVIDIA financially to optimize its software for AMD GPUs, though many in the community would like NVIDIA to make a better effort – for the good of said community.
Specifically, in regard to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures in the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome, and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.
An example of The Witcher 3: Wild Hunt with HairWorks
One of the game's developers has been quoted as saying:
Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.
There are several interpretations of this statement floating around the web. The first, and most inflammatory, is that NVIDIA is preventing CD Projekt Red from optimizing HairWorks by withholding source code. Another is that CD Projekt is choosing not to optimize for AMD hardware due to time constraints. The last is that optimization simply isn't possible because of the hardware limitations HairWorks exposes.
I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:
We are not asking game developers do anything unethical.
GameWorks improves the visual quality of games running on GeForce for our customers. It does not impair performance on competing hardware.
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
GameWorks licenses follow standard industry practice. GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license.
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.
Interesting comments for sure. The essential takeaway is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture handles tessellation better than AMD's GCN – often by a significant amount. NVIDIA developed its middleware to utilize the strengths of its own GPU technology, not – though some clearly disagree – to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs fall back more gracefully? Maybe.
Next, I asked Burke directly whether claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Projekt had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Projekt or AMD. AMD, without access to the source code, should still be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke stated that in these instances NVIDIA often sends engineers to work with game developers and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that its agreements with game developers in no way prohibit optimization for AMD GPUs.
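To make the developer-side option concrete, here is a minimal sketch of how an engine could detect the installed GPU's vendor through DXGI and fall back to a lower hair tessellation density on hardware with weaker tessellation throughput. To be clear, this is not CD Projekt Red's or NVIDIA's actual code; the tessellation-factor constants and the PickHairTessellationFactor helper are hypothetical, and a shipping engine would more likely key this off measured performance or a user-facing setting rather than a bare vendor check.

```cpp
#include <windows.h>
#include <dxgi.h>

// Hypothetical tuning constants -- not taken from The Witcher 3 or HairWorks.
// 64 is simply the maximum tessellation factor DirectX 11 allows.
static const float kHairTessFactorFull    = 64.0f;
static const float kHairTessFactorReduced = 16.0f;

// PCI vendor IDs as reported in DXGI_ADAPTER_DESC1::VendorId.
static const UINT kVendorNvidia = 0x10DE;
static const UINT kVendorAmd    = 0x1002;

// Pick a hair tessellation density based on the primary adapter's vendor.
float PickHairTessellationFactor()
{
    float factor = kHairTessFactorReduced;  // conservative default if detection fails

    IDXGIFactory1* factory = nullptr;
    if (SUCCEEDED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
    {
        IDXGIAdapter1* adapter = nullptr;
        if (SUCCEEDED(factory->EnumAdapters1(0, &adapter)))
        {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);

            // Keep full density where tessellation throughput is strong;
            // dial it back elsewhere rather than disabling the effect outright.
            factor = (desc.VendorId == kVendorNvidia) ? kHairTessFactorFull
                                                      : kHairTessFactorReduced;
            adapter->Release();
        }
        factory->Release();
    }
    return factor;
}
```

The vendor check itself is not the point – detecting a Radeon and simply lowering the geometry amplification is the kind of per-GPU tuning Burke says the developer was free to do, while AMD's equivalent lever would live in the driver rather than in game code.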
It would also have been possible for AMD to push for the implementation of TressFX alongside HairWorks; a similar scenario played out in Grand Theft Auto V, where several vendor-specific technologies from both NVIDIA and AMD were included and controlled through in-game settings.
NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well – they are not oblivious. CD Projekt knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. The studio was willing to accept performance penalties for some users to improve the experience for others. You can argue that was not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.
In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields and develop hardware and software in tandem, making sure all users get the best possible experience with all games. But that style of cooperation is only helpful (from a business perspective) to the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that approach, vote with your wallet.
Another, similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than that of comparable NVIDIA GPUs, even though the game does not implement any GameWorks technologies. In that case, the game's developer blamed AMD's drivers directly, saying it was a lack of outreach from AMD that caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.
All arguing aside, this game looks amazing. Can we all agree on that?
The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right – and you aren't necessarily wrong. Plenty of readers will see the comments made by NVIDIA above and continue to believe that the company is being disingenuous at best and straight-up lying at worst. As I mentioned in my own comments above, NVIDIA is still a for-profit company responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from competitors' customers. It's not fun, but that's how it works today.
Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be individuals who immediately side with one company or the other. But it's the much larger group in the middle – the one that shows no corporate allegiance and instead just wants to have as much fun as possible gaming – that will impact NVIDIA and AMD the most.
So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!
As long as NVIDIA has the majority of the market share they won't stop. It's time that AMD also did something like GameWorks, made it work only on AMD, and then got some AAA game developers to use it, giving them free stuff like graphics cards so they can test it and so on.
People complain so much about GameWorks and say that NVIDIA is BSing the gaming market, but in the end they still buy NVIDIA cards. Like I said, AMD needs to do its own thing and get huge developers to use its cards and technology. If done right, I can see many people shifting to AMD. I will move to AMD as well if many new games come out with GameWorks. I also have the feeling that once we get DX12 games things might turn around. I don't care about GameWorks much and will be happy to pay less for a card and have to turn features off. Once they implement DX12 I think this game will fly on AMD cards.
Only AMD can't afford to do so, and going proprietary like NVIDIA doesn't guarantee them success either. For example, before pushing for OpenCL they had Stream, which was supposed to compete directly with NVIDIA's CUDA; it went nowhere. Mantle is probably another, much more recent example. Last year, in one of his interview tours, Richard Huddy said that Mantle would continue to exist alongside DirectX and OpenGL and would always end up better than both APIs since Mantle would be tweaked specifically for AMD GPUs. In the end AMD canned Mantle and even asked developers to look to DX12 and Vulkan instead. Where were those (almost) 100 developers who signed up for Mantle that they mentioned late last year?
So then I think it’s because AMD lives in this dream world. They say or promise one thing and we don’t know if we should believe them or not. So all in all AMD did this to themselves.
Don’t need many more reasons to avoid the game at present.
Patches galore on day 1 and video is apparently broken with the latest patch.
Also, it seems to be a glorified console port, and hence I'll pass on this one. The images from 2013 to 2015 are hugely different and make it easy to spot the downgrade. Even knowing I'm playing something that is just mildly better than the PS4 version would annoy me – I can't support developers with a lowest-common-denominator attitude to development.
I just hope a Witcher 3 enhanced edition comes out at some stage – the way it should have been done if the Xbone and PS4 didn't exist. It might be worth picking up then. Plenty to play until then.
I don't understand the fuss here, mainly because there are AMD-optimized games and then there are NVIDIA-based games. For example, Hitman: Absolution ran like crap on NVIDIA cards, and another example, DiRT Showdown, was utter shit for NVIDIA-based GPUs. So basically it depends on which games are optimized for which GPU. But I am certainly hating on NVIDIA, because they have completely neglected their Kepler GPUs – hell, the GTX 750 Ti is beating the GTX 660 and GTX 660 Ti in newer games. Even the R9 265 is beating the old GTX 600 Kepler cards, and an HD 7870 is almost equal in performance to the GTX 670 in newer games. So yeah, next time I will go for AMD.
I will say that with an i7 4770K @ 4 GHz and a Sapphire Vapor-X R9 290, at 1080p I get 55-60 FPS with everything on Ultra except for HairWorks. With HairWorks I get around 30-35 FPS.
The game looks great, but I thought tessellation, lighting, and texture resolution would be a bit higher. It only uses 2.2 GB of VRAM, too!
Just do what I did: sell your AMD GPU and buy an NVIDIA GPU.
Don’t blame Nvidia.
http://img11.hostingpics.net/pics/704586TheWitcher3Analysis.png
It's an architecture limitation, not a lack of optimization in NVIDIA's drivers. Same issue with FC4. This is why Maxwell is more powerful in pixel fillrate and less so in texture fillrate. Actually, around 100 Gtexels is enough for now…
If the GTX 780 is behind or ahead of the GTX 960, it's because the GTX 780 comes in multiple configurations (32, 36 or 40 pixels per cycle), so its fillrate can be worse, equal or better. That's basically random. If you have performance issues with a GTX 780 in TW3 or FC4, this is the reason.
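For readers trying to follow that fillrate argument, here is a rough back-of-the-envelope sketch. The pixels-per-cycle figures come from the comment above; the 0.9 GHz clock is my own round-number assumption for illustration, not an official GTX 780 specification.

```cpp
#include <cstdio>

int main()
{
    // Pixels-per-cycle figures quoted in the comment above (32, 36 or 40,
    // depending on how a particular GTX 780 is configured).
    const int pixelsPerCycle[] = { 32, 36, 40 };

    // Hypothetical core clock in GHz -- a round number for illustration only.
    const double clockGHz = 0.9;

    for (int ppc : pixelsPerCycle)
    {
        // Theoretical pixel fillrate = pixels per cycle * clock frequency.
        std::printf("%d px/clk x %.1f GHz = %.1f Gpixels/s\n",
                    ppc, clockGHz, ppc * clockGHz);
    }
    return 0;
}
```

That works out to roughly 28.8, 32.4 and 36.0 Gpixels/s – about a 25 percent spread between the slowest and fastest configuration, which is the commenter's point about why two "identical" GTX 780 cards can land on either side of another card's performance.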
Claims that Nvidia intentionally cripples performance on its older generation cards:
https://forums.geforce.com/default/topic/806331/nvidia-intentionally-cri…
ludiqpich198 wrote:
“Before driver/geforce experience update:
http://i.imgur.com/w0PhEeQ.png
After:
http://i.imgur.com/o4vCtEj.png
gpu lost on
directx 9 simple -18 percent performance
directx 10 -23 percent performance
directx 11 -8 percent performance
a whole average 16 percent performance have been lost just so you can show your customers how much better your new generation is
STOP IT NVIDIA STOP CRIPPLING YOUR OLDER GENERATIONS STOP”
The GTX 980M is based on a Maxwell GPU (GM204); there is no reason for NVIDIA to cripple the current generation. It's just a bug.
The GTX 880M compared to the GTX 980M is his or her focus. The GTX 880M losing performance for no good reason is suspicious, possibly even damning! Yes, no?
I will keep it as civil as nVIDIA does everyday…
I will never EVER buy nVIDIA hardware… even if AMD one day goes out of business I will play games with Intel GPUs or stop playing games, period.
Regarding The Witcher 3: Wild Hunt… I will buy it when the price is 5 or 10 Euros… so F… you, CD Projekt!
Regarding nVIDIA brain-damaged fanboys… your dear nVIDIA is turning “PC gaming” into “CONSOLE gaming”… we are very close to having 5 major consoles on the market: Nintendo Wii U, Sony PlayStation 4, Microsoft Xbox One, PC nVIDIA and PC AMD…
So F… You nVIDIA !!!
Ryan,
Any intention to do an analysis on the Kepler GPUs as to why they are performing so badly in The Witcher 3 compared to Maxwell cards that are considered low end (960)?
I'd certainly like to see some investigation here, because something does seem to be amiss.
I keep hearing that AMD users shouldn't pay as much as NVIDIA users for the game because they can't effectively use HairWorks, and that it's unfair NVIDIA users didn't pay more. Well, you know what? They did – NVIDIA paid it for them, for it to be included in the game. Nothing was taken out because HairWorks was put in. The money it took to implement it was supplied by the people it was implemented for. So I don't really see where the “but we are paying the same amount and not getting the same amount” argument comes into play.
Wow, fanboys certainly are passionate about their graphics… you AMD guys need to quit thinking AMD deserves a free ride.
NVIDIA is deliberately crippling performance on its own 700 series to push Maxwell. This has been happening since the 980/970 launched. If you remember correctly, the 980 performed slightly above the 780 Ti in every game released up to that point, yet every new game released thereafter has performed very well on the 980 and very poorly on the 780 Ti. Even the 290X regularly performs slightly below the 980 while greatly outperforming the 780 Ti in new games. NVIDIA has dropped support for Kepler owners while AMD is still supporting cards all the way back to the 7970, allowing it to outperform the 780 in many new games. I own a Titan X, and while I'm happy with the performance I'm considering selling it on eBay. I don't want to be in the same position as the 700-series owners when my GPU is a year old.
Here is the video: http://youtu.be/qhaJJsNMbK4 – when the camera zooms in near Geralt's hair, the FPS drops a lot. It makes the framerate very unstable and the gaming experience bad the whole time… that shouldn't happen on GTX 980 SLI at just 1920×1080, right?
How bad is this damn NVIDIA HairWorks feature they bundle with their GPU cards?
Or does NVIDIA want me to buy 2 more GTX 980s for 4-way SLI?
Oh, I guess they'll tell me not to go 4-way 980 – go buy four of our Titan Xs instead!