AMD: Nvidia "Completely Sabotaged" Witcher 3 Performance on Our Cards


Pinky's Brain

New member
Mar 2, 2011
290
0
0
J Tyran said:
AMD throwing a strop again... This excuse is weak and old, and even if it has some basis in truth, should developers and consumers have to be limited by hardware with a minority share?
How do you mean? It's not like Hairworks is exactly revolutionary ... its AA is archaic, and its performance profile across multiple architectures is pathetic. Oh, and it actively constrains CDPR in developing its game because it's at the same time closed source AND intimately tied into the rendering process.

NVIDIA paid for Gameworks integration, so we probably got more content out of the deal ... but technological progression this isn't.
 

J Tyran

New member
Dec 15, 2011
2,407
0
0
Pinky said:
NVIDIA paid for Gameworks integration, so we probably got more content out of the deal ... but technological progression this isn't.
This tinfoilery again? Nvidia don't pay anyone to use it; developers just ask to license it. It's just a set of pre-packaged visual FX and software tools, and it's convenient for developers. AMD are just making excuses because their software development pipeline is rubbish.
 

Pinky's Brain

New member
Mar 2, 2011
290
0
0
"Buy a GeForce graphics card, get The Witcher 3 and Batman: Arkham Knight free"

Few things are ever really free, not least of which is the labour provided by on-site developer relations.
 

LordLundar

New member
Apr 6, 2004
962
0
0
Oh look, this song and dance again! Should we really go back through the history of the two companies and find out just how many of these kinds of accusations have been laid against each other? It's a pretty long list for both of them.
 

MonsterCrit

New member
Feb 17, 2015
594
0
0
CardinalPiggles said:
You can turn it off. Same with TressFX.

Then there's also the fact that it ruins the framerate on Nvidia cards as well.

Go home AMD, you're drunk.
Pretty much this... Why would they make something that also hits their own cards? It baffles the mind. Sorry AMD, you're clearly just being a little butthurt. I mean, the fact that they opted for Nvidia's hair thingie as opposed to yours must have been a real sting.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
Hey AMD users. Would you like to run Hairworks better than Nvidia users just for bragging rights? Here's how: http://gearnuke.com/witcher-3-run-hairworks-amd-nvidia-gpus-without-crippling-performance/

It actually works if anyone's interested.
 

CaitSeith

Formerly Gone Gonzo
Legacy
Jun 30, 2014
5,374
381
88
ShakerSilver said:
There are A LOT of details missing in this story, which makes it look like AMD is just name-calling.

It's not just AMD cards; anything that doesn't support Nvidia's GameWorks is completely screwed over when it comes to optimization of GameWorks-developed features and titles. This includes all ATi cards, Intel integrated graphics, and even Nvidia cards older than their GTX 900 series. A GTX 770 normally outperforms the GTX 960 by simply being more powerful, but with GameWorks features activated the 960 edges out the 770, because the 770 isn't properly optimized for these features.

To use GameWorks, devs make a deal with Nvidia that forces them to optimize these features only through GameWorks and only for GameWorks-supported cards. The incentive for game makers to essentially stop supporting much of the PC market is that the deal also has Nvidia paying the devs and offering help with development. Basically, Nvidia pay devs to fully optimize games only for their cards. Hell, I doubt the devs even have a say in many of these cases, or see any of this money; it's mostly just the suits making the money decisions and then telling the devs, "OK, you're going to use GameWorks because Nvidia paid us".

Nvidia is making sure that "the way games are meant to be played" is through Nvidia, even if it means screwing their existing customers because they didn't want to upgrade right away. This is all in complete contrast to AMD, who have made all their code open source and try to make it compatible with all hardware, even their competition's.
But in reality it wouldn't be such an issue if CD Projekt RED had chosen not to use HairWorks and thus rejected Nvidia's monopolistic strategy.
 

deadish

New member
Dec 4, 2011
694
0
0
Nvidia isn't called the "Graphics Mafia" for nothing.

This vendor's tools historically completely suck, or only work for some period of time and then stop working, or only work if you beg the tools team for direct assistance. They have enormous, perhaps Dilbert-esque tools teams that do who knows what. Of course, these tools only work (when they do work) on their driver.

This vendor is extremely savvy and strategic about embedding its devs directly into key game teams to make things happen. This is a double edged sword, because these devs will refuse to debug issues on other vendor's drivers, and they view GL only through the lens of how it's implemented by their driver. These embedded devs will purposely do things that they know are performant on their driver, with no idea how these things impact other drivers.

Historically, this vendor will do things like internally replace entire shaders for key titles to make them perform better (sometimes much better). Most drivers probably do stuff like this occasionally, but this vendor will stop at nothing for performance. What does this mean to the PC game industry or graphics devs? It means you, as "Joe Graphics Developer", have little chance of achieving the same technical feats in your title (even if you use the exact same algorithms!) because you don't have an embedded vendor driver engineer working specifically on your title making sure the driver does exactly the right thing (using low-level optimized shaders) when your specific game or engine is running. It also means that, historically, some of the PC graphics legends you know about aren't quite as smart or capable as history paints them to be, because they had a lot of help.

Vendor A is also jokingly known as the "Graphics Mafia". Be very careful if a dev from Vendor A gets embedded into your team. These guys are serious business.

- http://www.richg42.blogspot.com/2014/05/the-truth-on-opengl-driver-quality.html
 

Jiffex

New member
Dec 11, 2011
165
0
0
Steven Bogos said:
(including Geralt's dynamically growing beard [http://www.polygon.com/2015/3/24/8282437/the-witcher-3-geralt-dynamic-beard-growth-over-time]).
So if you have the option turned off, does the hair and beard stop growing?
 

tzimize

New member
Mar 1, 2010
2,391
0
0
I've got a 970 and my picture is as smooth as butter.

CDPR has set the bar again as far as I'm concerned.

Adam Jensen said:
BloodRed Pixel said:
Wait, we are talking about a 29% FPS drop because of hair?

Ridiculously Epic Fail! I'd say.
And it's not even that big a deal. Geralt's hair looks amazing even without it. And it's not like you'll spend a lot of time looking at other people's hair to justify the drop in frame rate on any graphics card ever. It's useless eye candy.
Weeeeelll....while I'll always be of the opinion that a game can have a small pixel count and still be great, there are some things that really help when you want to build a world for players to more or less live in.

Realistic water is great. Realistic cloth is also awesome; I'd rather have no capes than the "flakes" of yore. Hair/fur has been one of the last bastions of the "real" stuff to put into games, and I for one am thrilled to see technologies tackling it. The hair really looks fantastic imo, and while it's not strictly necessary... it's just one more thing to make you drown in the world.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
tzimize said:
I've got a 970 and my picture is as smooth as butter.

CDPR has set the bar again as far as I'm concerned.

Adam Jensen said:
BloodRed Pixel said:
Wait, we are talking about a 29% FPS drop because of hair?

Ridiculously Epic Fail! I'd say.
And it's not even that big a deal. Geralt's hair looks amazing even without it. And it's not like you'll spend a lot of time looking at other people's hair to justify the drop in frame rate on any graphics card ever. It's useless eye candy.
Weeeeelll....while I'll always be of the opinion that a game can have a small pixel count and still be great, there are some things that really help when you want to build a world for players to more or less live in.

Realistic water is great. Realistic cloth is also awesome; I'd rather have no capes than the "flakes" of yore. Hair/fur has been one of the last bastions of the "real" stuff to put into games, and I for one am thrilled to see technologies tackling it. The hair really looks fantastic imo, and while it's not strictly necessary... it's just one more thing to make you drown in the world.
You don't get it. I have nothing against Hairworks and TressFX. I'm saying that CD Projekt RED has made the hair look good even without it. There's no real reason to sacrifice fps for HairWorks. Fur is nice though, they could have done something about that while they were at it.

Also, new patch is up. It fixes Hairworks performance among other things. So if some of you really don't like the default hair you should have an easier time running the game with HW on.
 

rgrekejin

Senior Member
Mar 6, 2011
267
0
21
Steven Bogos said:
It's also not just AMD cards that are suffering performance drops from HairWorks, as reports are coming in that even Nvidia's GTX 980 drops down from 87.4 FPS to 62.2 FPS after turning on the feature.
Gosh, it's almost like adding thousands of tiny 8xMSAA objects is really graphically intensive or something. :p

Seriously, this is much ado about nothing. Hairworks isn't critical to anything - it's just a nice little graphical extra for those who have graphics cards beefy enough to support it, like Ubersampling in The Witcher 2. If it's killing your frame rate, just turn it off. Unless what AMD is saying here is that the mere presence of the GameWorks code in the game is somehow crippling their performance even when it's disabled (which would be a hell of a story if true), I'm not sure what their complaint is.

And if you really, really want to use Hairworks without such a big performance hit, you can just muck around in your Rendering.ini file and lower the tessellation value for Hairworks. I'm honestly not sure why that isn't an option you can access from the game menu.

http://pf.reddit.com/r/pcmasterrace/comments/36m595/psa_how_to_get_witcher_3_hairworks_running_on_all/
 

Ishigami

New member
Sep 1, 2011
830
0
0
I like AMD for their open-source approach and I'm a bit wary of Nvidia's proprietary solutions.
But I've got to admit that Nvidia's proprietary solutions kind of work. Have a look at FreeSync and its limitations, and then at G-Sync.
AMD is currently complaining about HairWorks; okay, I see the issue of FPS losses of up to 30%.
Nvidia is arguing this is due to a difference in tessellation performance.
Considering that adjusting the tessellation in the Catalyst Control Center actually improves the situation, this might very well be true.
Someone at Guru3D apparently figured this out before AMD.
So I really think AMD did a sloppy job at optimizing The Witcher 3 for their hardware. I mean, if a fan can come up with this easy improvement via a CCC profile only days after release, surely AMD could have come up with it in the months they apparently worked together with CDPR? HairWorks is not a last-minute addition to the game, after all.

HBAO+ is also an Nvidia proprietary solution, yet its performance impact is the same on AMD and Nvidia cards.
 

ShakerSilver

Professional Procrastinator
Nov 13, 2009
885
0
0
Ishigami said:
So I really think AMD did a sloppy job at optimizing The Witcher 3 for their hardware.
You mean CD Projekt optimizing it for AMD's hardware, which they can't do because of the GameWorks program - they're only allowed to optimize GameWorks features for GameWorks-supported cards (Nvidia's GTX 900 series and the Titan Black). For AMD to make it work well on their hardware, they have to wait until after launch and throw together their own patch, well after people notice how poorly those features perform on their hardware.
CaitSeith said:
But in reality it wouldn't be such an issue if CD Projekt RED had chosen not to use HairWorks and thus rejected Nvidia's monopolistic strategy.
True, which really makes me disappointed in CDPR. They would screw over a considerable chunk of their fanbase for the extra dough.
 

iniudan

New member
Apr 27, 2011
538
0
0
lacktheknack said:
And TressFX butchered Nvidia cards in Tomb Raider.

He who lives in a glass house...
Yes, but Nvidia had access to the code to optimize TressFX for their own hardware, while AMD cannot get access to anything GameWorks-related. Also, the poorer performance of TressFX on Nvidia is mostly down to Nvidia's weaker multicore performance: their GPUs have fewer, higher-clocked cores and lower bandwidth, while AMD GPUs have lower clocks but more cores and higher bandwidth.

But AMD's CCC has an override feature for tessellation to get around Nvidia's shenanigans, which currently lets you get better performance with hair physics on with AMD hardware: you can reduce tessellation to x8 and get something that still looks good (it looks odd with anything lower than x8) instead of being stuck at x16. That feature doesn't exist on Nvidia.
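For anyone who wants to try it, the relevant knobs in Catalyst Control Center look roughly like this - going from memory, so the exact labels and profile setup may differ a little between driver versions:

Gaming > 3D Application Settings > (add a profile for the Witcher 3 executable)
Tessellation Mode: Override application settings
Maximum Tessellation Level: 8x (or 16x if 8x looks too thin to you)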
 

fat american

New member
Apr 2, 2008
250
0
0
I'm running a 970 on high settings with a couple of ultra settings turned on, and I'm getting between 50 and 60 fps with the hair stuff turned all the way up, so it might not have to do with just your graphics card. I've got a 4770K @ 4GHz and 16 gigs of 2400MHz RAM. It seems to me like the CPU might be a pretty bad bottleneck for HairWorks. I could be completely wrong, though. I'm not an expert by any stretch of the imagination.
 

BloodRed Pixel

New member
Jul 16, 2009
630
0
0
tzimize said:
I've got a 970 and my picture is as smooth as butter.

CDPR has set the bar again as far as I'm concerned.

Adam Jensen said:
BloodRed Pixel said:
Wait, we are talking about a 29% FPS drop because of hair?

Ridiculously Epic Fail! I'd say.
And it's not even that big a deal. Geralt's hair looks amazing even without it. And it's not like you'll spend a lot of time looking at other people's hair to justify the drop in frame rate on any graphics card ever. It's useless eye candy.
Weeeeelll....while I'll always be of the opinion that a game can have a small pixel count and still be great, there are some things that really help when you want to build a world for players to more or less live in.

Realistic water is great. Realistic cloth is also awesome; I'd rather have no capes than the "flakes" of yore. Hair/fur has been one of the last bastions of the "real" stuff to put into games, and I for one am thrilled to see technologies tackling it. The hair really looks fantastic imo, and while it's not strictly necessary... it's just one more thing to make you drown in the world.
Weeeellll, while I see the necessity and challenge of developing new tech, realism has absolutely nothing to do with being able to drown in a game, aka immersiveness. ;)
 

iniudan

New member
Apr 27, 2011
538
0
0
rgrekejin said:
iniudan said:
But AMD's CCC has an override feature for tessellation to get around Nvidia's shenanigans, which currently lets you get better performance with hair physics on with AMD hardware: you can reduce tessellation to x8 and get something that still looks good (it looks odd with anything lower than x8) instead of being stuck at x16. That feature doesn't exist on Nvidia.
Actually, the default tessellation value for hairworks is 8x. You can turn it up to 16x or down to 4x, 2x, or 0x by editing your Rendering.ini file regardless of who made your card. 4x looks okay, but there are some graphical glitches at 2x and 0x.
It is the hair AA which is in the ini file, not the tessellation.
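For what it's worth, the line people keep pointing at (in bin\config\base\rendering.ini, if I remember right; the exact key name might differ between game versions, so take this as a rough pointer rather than gospel) is something like:

HairWorksAALevel=8

Dropping that from 8 down to 4 or 2 reduces the MSAA cost of the hair, while the tessellation level itself still has to be forced through the driver, e.g. the CCC override on AMD.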