Hair and beards... things I will totally care about when slaughtering my way through wherever....
How do you mean? It's not like HairWorks is exactly revolutionary... its AA is archaic and its performance profile across multiple architectures is pathetic. Oh, and it actively constrains CDPR in developing its game, because it's simultaneously closed source AND intimately tied into the rendering process.

J Tyran said:
AMD throwing a strop again... This excuse is weak and old. Even if it has some basis in truth, should developers and consumers have to be limited by hardware with a minority share?
This tinfoilery again? Nvidia don't pay anyone to use it; developers just ask to licence it. It's just a set of prepackaged visual FX and software tools, and it's convenient for developers. AMD are just making excuses because their software development pipeline is rubbish.

Pinky said:
NVIDIA paid for GameWorks integration, so we probably got more content out of the deal... but technological progression this isn't.
Pretty much this... Why would they make something that also hits their own cards? It boggles the mind. Sorry AMD, you're clearly just being a little butthurt. I mean, the fact that they opted for Nvidia's hair thingie instead of yours must have really stung.

CardinalPiggles said:
You can turn it off. Same with TressFX.
Then there's the fact that it ruins the framerate on Nvidia cards as well.
Go home AMD, you're drunk.
But in reality that wouldn't be such an issue if CD Projekt Red had chosen not to use HairWorks, and thus rejected Nvidia's monopolistic strategy.

ShakerSilver said:
There are A LOT of details missing in this story; it makes it look like AMD is just name-calling.
It's not just AMD cards: anything that doesn't support Nvidia's GameWorks is completely screwed over when it comes to optimization of GameWorks-developed features and titles. That includes all ATi cards, Intel integrated graphics, and even Nvidia cards older than the GTX 900 series. A GTX 770 normally outperforms a GTX 960 by simply being more powerful, but with GameWorks features activated the 960 edges out the 770, because the older card isn't properly optimized for those features.
To use GameWorks, devs make a deal with Nvidia that forces them to optimize these features only through GameWorks and only for GameWorks-supported cards. The incentive for game makers to essentially stop supporting much of the PC market is that the deal also has Nvidia paying the devs and offering help with development. Basically, Nvidia pays devs to fully optimize games only for their cards. Hell, I doubt the devs even have a say in many of these cases, or see any of this money; it's mostly just the suits making the money decisions and then telling the devs "OK, you're going to use GameWorks because Nvidia paid us".
Nvidia is making sure that "the way games are meant to be played" is through Nvidia, even if it means screwing their existing customers because they didn't want to upgrade right away. This is all in complete contrast to AMD, who have made all their code open source and try to make it compatible with all hardware, even their competitors'.
This vendor's tools historically completely suck, or only work for some period of time and then stop working, or only work if you beg the tools team for direct assistance. They have enormous, perhaps Dilbert-esque tools teams that do who knows what. Of course, these tools only work (when they do work) on their driver.
This vendor is extremely savvy and strategic about embedding its devs directly into key game teams to make things happen. This is a double edged sword, because these devs will refuse to debug issues on other vendor's drivers, and they view GL only through the lens of how it's implemented by their driver. These embedded devs will purposely do things that they know are performant on their driver, with no idea how these things impact other drivers.
Historically, this vendor will do things like internally replace entire shaders for key titles to make them perform better (sometimes much better). Most drivers probably do stuff like this occasionally, but this vendor will stop at nothing for performance. What does this mean to the PC game industry or graphics devs? It means you, as "Joe Graphics Developer", have little chance of achieving the same technical feats in your title (even if you use the exact same algorithms!) because you don't have an embedded vendor driver engineer working specifically on your title making sure the driver does exactly the right thing (using low-level optimized shaders) when your specific game or engine is running. It also means that, historically, some of the PC graphics legends you know about aren't quite as smart or capable as history paints them to be, because they had a lot of help.
Vendor A is also jokingly known as the "Graphics Mafia". Be very careful if a dev from Vendor A gets embedded into your team. These guys are serious business.
- http://www.richg42.blogspot.com/2014/05/the-truth-on-opengl-driver-quality.html
So if you have the option turned off, does the hair and beard stop growing?

Steven Bogos said:
(including Geralt's dynamic beard growth over time [http://www.polygon.com/2015/3/24/8282437/the-witcher-3-geralt-dynamic-beard-growth-over-time])
Weeeeelll... while I'll always be of the opinion that a game can have a small pixel count and still be great, there are some things that really help when you want to build a world for players to more or less live in.

Adam Jensen said:
And it's not even that big a deal. Geralt's hair looks amazing even without it. And it's not like you'll spend a lot of time looking at other people's hair to justify the drop in frame rate on any graphics card ever. It's useless eye candy.

BloodRed Pixel said:
Wait, we are talking a 29% FPS drop because of hairs?
Ridiculously Epic Fail! I'd say.
You don't get it. I have nothing against HairWorks and TressFX. I'm saying that CD Projekt RED has made the hair look good even without it. There's no real reason to sacrifice fps for HairWorks. Fur is nice though; they could have done something about that while they were at it.

tzimize said:
I've got a 970 and my picture is as smooth as butter.
CDPR has set the bar again as far as I'm concerned.
Realistic water is great. Realistic cloth is also awesome; I'd rather have no capes than the "flakes" of yore. Hair/fur has been one of the last bastions of the "real" stuff to put into games, and I for one am thrilled to see technologies tackling it. The hair really looks fantastic imo, and while it's not strictly necessary... it's just one more thing to make you drown in the world.
Gosh, it's almost like adding thousands of tiny 8xMSAA objects is really graphically intensive or something.

Steven Bogos said:
It's also not just AMD cards that are suffering performance drops from HairWorks, as reports are coming in that even Nvidia's GTX 980 drops down from 87.4 FPS to 62.2 FPS after turning on the feature.
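For what it's worth, those quoted numbers work out to the ~29% figure mentioned earlier in the thread. A quick sanity check (plain Python, nothing game-specific):

# Sanity check of the GTX 980 figures quoted above, HairWorks off vs. on.
fps_off = 87.4
fps_on = 62.2

drop_pct = (fps_off - fps_on) / fps_off * 100
print(f"FPS drop: {drop_pct:.1f}%")  # -> FPS drop: 28.8%, i.e. the ~29% quoted earlier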
You mean CD Projekt optimizing it for AMD's hardware? Which they can't, because of the GameWorks program: they're only allowed to optimize GameWorks features for GameWorks-supported cards (Nvidia's GTX 900 series and Titan Black). For AMD to make it work well on their hardware, they have to wait until after launch and throw together their own patch, well after people notice how poorly those features perform on their hardware.

Ishigami said:
So I really think AMD did a sloppy job at optimizing The Witcher 3 for their hardware.
True, which really makes me disappointed in CDPR. They would screw over a considerable chunk of their fanbase for the extra dough.

CaitSeith said:
But in reality that wouldn't be such an issue if CD Projekt Red had chosen not to use HairWorks, and thus rejected Nvidia's monopolistic strategy.
Yes, but Nvidia had access to the code, so they could optimize TressFX for their own hardware; AMD can't get access to anything GameWorks-related. Also, the poorer performance of TressFX on Nvidia is mostly down to hardware differences: Nvidia GPUs have fewer, higher-clocked cores and lower memory bandwidth, while AMD GPUs have lower clocks but more cores and higher bandwidth.

lacktheknack said:
And TressFX butchered Nvidia cards in Tomb Raider.
He who lives in a glass house...
Weeeellll, while I see the necessity and challenge of developing new tech, realism has absolutely nothing to do with being able to drown in a game, aka immersiveness.

tzimize said:
I've got a 970 and my picture is as smooth as butter.
CDPR has set the bar again as far as I'm concerned.
It is the hair AA which is in the ini file, not the tessellation.

rgrekejin said:
Actually, the default tessellation value for HairWorks is 8x. You can turn it up to 16x or down to 4x, 2x, or 0x by editing your Rendering.ini file, regardless of who made your card. 4x looks okay, but there are some graphical glitches at 2x and 0x.

iniudan said:
But AMD CCC has an override feature for tessellation to get around Nvidia's shenanigans, which currently lets you get better performance with hair physics on with AMD hardware: you can reduce tessellation to x8 and get something that still looks good (it looks odd with anything lower than x8), instead of being stuck at x16. That feature doesn't exist on Nvidia's side.
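If you want to script the ini tweak described above, here's a minimal sketch. The setting name (HairWorksAALevel) and the file location are assumptions on my part, not something confirmed in this thread; check your own Rendering.ini and back it up before editing.

# Minimal sketch of the Rendering.ini tweak discussed above.
# Assumptions (not confirmed in this thread): the setting is named
# "HairWorksAALevel" and the file lives under Documents\The Witcher 3.
from pathlib import Path

ini_path = Path.home() / "Documents" / "The Witcher 3" / "Rendering.ini"  # assumed location
text = ini_path.read_text()

# Drop the hair AA from the default 8 to 4 (reported above to look okay;
# 2 and 0 reportedly show graphical glitches).
text = text.replace("HairWorksAALevel=8", "HairWorksAALevel=4")
ini_path.write_text(text)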