AMD: Nvidia "Completely Sabotaged" Witcher 3 Performance on Our Cards




AMD's Richard Huddy says Nvidia's new "HairWorks" technology completely sabotaged The Witcher 3's performance.

The eternal mudslinging between graphics card giants AMD and Nvidia continues today, with AMD claiming that Nvidia went so far as to deliberately sabotage the performance of the recently released The Witcher 3 on AMD cards.

""We've been working with CD Projeckt Red from the beginning. We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned," AMD's chief gaming scientist Richard Huddy told Ars Technica.

"We were running well before that," he added, "It's wrecked our performance, almost as if it was put in to achieve that goal."

Additionally, despite Nvidia's claims that developers have free rein over its technologies' source code, Huddy says that this was not the case with CD Projekt Red and HairWorks. "I was [recently] in a call with one of the management at CD Projekt," said Huddy, "and what I heard then was that they didn't have access to the source code."

HairWorks, as the name would suggest, is a technology that allows games to simulate and render fur and hair (including Geralt's dynamic beard growth) realistically. AMD has its own competing technology, TressFX, which was most notably used in Square Enix's Tomb Raider.

It's also not just AMD cards that are suffering performance drops from HairWorks; reports are coming in that even Nvidia's GTX 980 drops from 87.4 FPS to 62.2 FPS after turning on the feature.
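For reference, those reported numbers work out to just under a 29% frame-rate hit. A quick back-of-the-envelope check (plain arithmetic, using only the figures above):

```python
# Reported GTX 980 figures: 87.4 FPS with HairWorks off, 62.2 FPS with it on.
fps_off = 87.4
fps_on = 62.2

drop_pct = (fps_off - fps_on) / fps_off * 100   # relative frame-rate loss
frame_time_off = 1000 / fps_off                 # ms per frame, HairWorks off
frame_time_on = 1000 / fps_on                   # ms per frame, HairWorks on

print(f"FPS drop: {drop_pct:.1f}%")                                      # ~28.8%
print(f"Frame time: {frame_time_off:.1f} ms -> {frame_time_on:.1f} ms")  # ~11.4 -> ~16.1 ms
```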

Source: Ars Technica


You can turn it off. Same with TressFX.

Then there's also the fact that it ruins the framerate on Nvidia cards as well.

Go home AMD, you're drunk.

Sounds like a bad deal all around. Hair simulation has always been a sticking point with PC graphics and I don't see that changing anytime soon.

CardinalPiggles, they can't send in people to fix it ... which is what NVIDIA did with TressFX. The two situations are fundamentally different (which is not to say what NVIDIA is doing is inherently wrong).

I've got a 780, and even running HairWorks on Geralt only can knock my frames down to as low as 20fps. Considering I can run the game on ultra at 60fps (though it's smoother and less jittery when locked to 30 for me), that's a massive penalty for such a tiny bit of extra flash. Don't get me wrong, it's pretty nice to look at, and seeing some of the combat vids with it enabled shows how it really shines, but it's more of a novelty really.

Anyone who shelled out for a 970 or 980 thinking it would drastically affect the quality of the game must be feeling kind of burned on this. The difference is noticeable when it's on, but it's not significant enough to warrant such costs. Though I imagine playing this game at a smooth 60 fps at 2K or 4K is a nice perk.

I'm pretty sure we won't see many games pushing that feature in the near future considering the resources it demands. Maybe when 970- and 980-class cards become the norm this feature will have some real relevance, but as of now it's just for bragging rights and that's about it.

Game still looks gorgeous without it on.

There are A LOT of details missing in this story, which makes it look like AMD is just name-calling.

It's not just AMD cards; anything that doesn't support Nvidia's GameWorks is completely screwed over for optimization in GameWorks-developed features and titles. This includes all ATI cards, Intel integrated graphics, and even Nvidia cards older than their GTX 900 series. A GTX 770 outperforms the GTX 960 by simply being more powerful, but with GameWorks features activated, the 960 edges out the 770 because the 770 isn't properly optimized for these features.

To use GameWorks, devs make a deal with Nvidia, forcing them to only optimize these features through GameWorks and only for GameWorks-supported cards. The incentive for game makers to essentially stop supporting much of the PC market is that the deal also has Nvidia pay the devs and offer help in development. Basically, Nvidia pays devs to only fully optimize games for their cards. Hell, I doubt devs even have a say in many of these cases or see any of this money, as it's mostly just the suits making money decisions, then telling the devs, "OK, you're going to use GameWorks because Nvidia paid us."

Nvidia is making sure that "the way games are meant to be played" is through Nvidia, even if it means screwing their existing customers because they didn't want to upgrade right away. This is all in complete contrast to AMD who have made all their code open-source and try to make it compatible with all hardware, even their competition.

Sounds like a shitty system that doesn't work well for anyone, less sabotage and more failure for Nvidia.

ShakerSilver:
This is all in complete contrast to AMD who have made all their code open-source and try to make it compatible with all hardware, even their competition.

That's what I like about AMD, their philosophy is far more consumer friendly and less greedy all around.

That is the second accusation this week... it's just starting to feel manufactured. I call bullshit on an optional feature fucking up your performance. It fucks up everyone's performance.

What's really messed up about all this is that these two companies compete with stupid shit that should be left up to the developers. The cards themselves should render whatever, and the developers should have the tools to make their product work with ANY card with ALL the features enabled. Instead NVIDIA has PhysX, and held up development on SLI for years after buying out 3DFX... I don't like NVIDIA very much, and have been an AMD guy for a long time, even before they bought into the GFX development area.
I don't necessarily need these hair features, but it's kind of sad that GPU devs are doing crap like this to take things away from end users (especially the ones who don't know tech very well). NVIDIA, like Intel, is overpriced for what you get; AMD has always offered quality for the price IMO, and I've always been happy with their technology, and I've owned both sides.
I've got my opinions and bias and I'm not afraid to say it. I owned NVIDIA before they were big and even then always had issues with their cards. ATI/AMD has, in my experience, had significantly fewer issues with their hardware. Though I may not have ALL the performance that NVIDIA gets, I'm also left with money in my pocket to put toward other things I want, just like buying an AMD processor over an overpriced Intel processor.
I've been building PCs since before there were 3D accelerator cards, and I've always found AMD to somehow work better in the long run, last longer and just make me feel I made the right choice as a consumer. The iterations of Intel processors I've bought have crapped out far quicker; they've either underperformed or I've had to RMA the chip (I've never RMA'd an AMD in my experience, and I've bought hundreds as a professional PC builder). My bias is prevalent, but it's not without experience. Same with NVIDIA cards: TNT cards had driver issues back in the day, but at the time ATI was hit or miss until AMD bought them out. There have been some driver issues with AMD cards, but they've been fixed relatively quickly, and I'm currently running the beta Catalyst drivers with no issues on any of my games, also using the AMD Turbo function on my 8-core, which boosts the clock speed (watercooling is so awesome). Had it up to 7.2GHz, but I'm unsure how accurate the reporting is... the Core Temp program said it was operating at that speed, so I'm inclined to believe it.

RicoADF:
Sounds like a shitty system that doesn't work well for anyone, less sabotage and more failure for Nvidia.

ShakerSilver:
This is all in complete contrast to AMD who have made all their code open-source and try to make it compatible with all hardware, even their competition.

That's what I like about AMD, their philosophy is far more consumer friendly and less greedy all around.

Plus AMD doesn't tend to make their cards as expensive as hell. They are affordable and in my opinion are just as good as what Nvidia offers. I'm running an XFX HD 7970 with The Witcher 3 on Ultra settings and the game runs great. I want to get two R9 290Xs later this year and crossfire them.

AMD are probably right on this, but of course you can turn off Hairworks.

Basically, whenever you see a video card maker's logo on a game, someone's being a huge arsehole about it. AMD didn't get access to the final code until too close to release to make a proper driver for it, even without the Hair nonsense.

The same happens the other way round with games made in association with AMD.

Anyone else figure that the feature is a bit of future-proofing for DX12? You'd think this would be something DX12 was made for.

CardinalPiggles:
You can turn it off. Same with TressFX.

Then there's also the fact that it ruins the framerate on Nvidia cards as well.

Go home AMD, you're drunk.

This, plus the fact that the game is still perfectly playable with HairWorks enabled. 30 FPS is completely playable, despite what some more zealous PC enthusiasts than I might say. If I hit 60 with any game, I'm living the dream. There's always a point where adding or subtracting values to your FPS counter does absolutely jack shit.

IamLEAM1983:

CardinalPiggles:
You can turn it off. Same with TressFX.

Then there's also the fact that it ruins the framerate on Nvidia cards as well.

Go home AMD, you're drunk.

This, plus the fact that the game is still perfectly playable with HairWorks enabled. 30 FPS is completely playable, despite what some more zealous PC enthusiasts than I might say. If I hit 60 with any game, I'm living the dream. There's always a point where adding or subtracting values to your FPS counter does absolutely jack shit.

You're entitled to your opinion, but I completely disagree. 30 FPS is "playable," yes, but 15 FPS was "playable" back in the N64 days. I would much, much rather have a game run at 60 FPS in native resolution with graphical bells and whistles (like HairWorks) turned off. More than anything, I value a smooth gaming experience, and the difference between 30 and 60 FPS is like night and day.

I do, however, agree that anything above 60 FPS isn't really necessary.
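To put rough numbers on the 30-vs-60 debate, here's a minimal frame-budget calculation (plain arithmetic, not tied to any particular game or card):

```python
# Per-frame time budget at common frame-rate targets: the game has this many
# milliseconds to simulate and render each frame to hold the target rate.
for fps in (15, 30, 60, 144):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")

# Output: 15 FPS -> 66.7 ms, 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 144 FPS -> 6.9 ms.
# Moving from 30 to 60 FPS halves the per-frame budget, which is a big part of
# why the jump in smoothness and responsiveness is so noticeable.
```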

I find 30 FPS to be unplayable, which is rather odd, really.

I understand that a lot of people play at 30. And therefore, it should be fine and playable, as most people seem to be able to do so.

But, when I try it, it looks so off, it's completely unresponsive and jarring to play.

It's a little frustrating, honestly.

Anyway, AMD blaming Nvidia for this? That seems a bit childish.

It's CD Projekt Red's product. It is up to them to ensure it works on everyone's systems as best as possible. If they choose to use software that, when tested causes a significant decrease in performance for those on a certain manufacturer's graphics card, the buck absolutely stops with them.

If Nvidia spends money, time and effort on the development of new technologies, I see no reason they're required to release it for free.

Considering my GTX 970 OC'ed (i7 3770 @ 4.3GHz, blablabla, it should have been "enough") is having serious problems keeping 40 FPS (hair, shadows, some other stuff = off), I really don't understand what they are on about. Performance in that game (for those that care about 60 FPS, which should be a pretty fucking large group around these parts) is a problem all around right now.

Unless of course high-end AMD cards specifically are getting sub-15 FPS. Is that happening?

My 780 Ti is running the game on Ultra with HairWorks off, not 100% consistent 60fps, but most of the time. HairWorks on, it goes to around 30. If AMD cards are having problems, that's kind of just a problem with AMD cards sucking. You get what you pay for.

The articles on this are poorly worded; they didn't make it clear whether the active hair feature made things worse or whether performance got shot the moment that API got added to the game. You need to be very clear on that when you make stuff like this known, otherwise you look quite foolish. Especially since your TressFX feature, when running, caused havoc on all but modern AMD cards.

But I have no trouble believing the mere API addition nukes the opposition; it has been known for some time that GameWorks stuff only ever runs properly on new Nvidia cards. That drives sales very specifically to their latest models, which is evil genius right there, so why wouldn't they do that given the opportunity...
Better still, as they update GameWorks for newer games they can keep pushing everyone's hardware out of the market, including their own older cards.

It's also kinda funny how Nvidia's GeForce Experience, when set to optimize The Witcher 3, automatically chooses to turn HairWorks off even on my GeForce 780.

The Witcher 3 runs on my old ATI 5770, with 4 GB of RAM and an i5 750.

And it runs very well. It takes a while to load and it is on MOSTLY low settings, but once in the game it runs great.

If a 6 year old mid range ATI card can do it, then the newer ones will do much better. Sorry AMD. Bullshit.

" This is all in complete contrast to AMD who have made all their code open-source and try to make it compatible with all hardware, even their competition."

True.

Also, Steven Bogos, you missed a joke. TrissFX and YeniFur...
Lol...

And TressFX butchered Nvidia cards in Tomb Raider.

He who lives in a glass house...

TressFX is pretty bad by itself. It completely butchers performance on Nvidia cards as well. This is the nature of hair and water physics - they are extremely performance intensive and really only perform well when written close to a specific GPU's architecture. AMD tech won't run well on an Nvidia card and vice versa.

It's not like Nvidia is much better, but the things that are tanking performance on your GPUs are your shitty drivers, your bad hardware configurations, and your sub-par software technologies. Come back to us when you fix those first.

Are we just going to pretend that AMD didn't introduce TressFX to do pretty much the exact same thing to Nvidia a few years ago?

HairWorks makes my GTX 970 have a pretty low frame rate compared to what it should pull. So I call it all AMD PR bullshit.

Nvidia crippled The Witcher 3 on anything other than 970, 980 and Titan X. They used GameWorks to sell more 900 series of cards. Nothing else can run the game on Ultra with 60 fps. I canceled my pre-order because of GameWorks, but ended up pre-ordering at the last minute to get the 20% discount for owning both previous titles. And you know what? The game runs like a dream on my R9 280x. I don't care about Hairworks. And other than that piece of tech, The Witcher 3 runs better on AMD. Nvidia users have been reporting some freezing and stuttering issues. Nothing like that on my end. Frame latency is also a dream. I can barely tell the difference between 60 fps and 40 fps. That's how good it runs on AMD (without hairworks).

Imperioratorex Caprae:
What's really messed up about all this is that these two companies compete with stupid shit that should be left up to the developers. The cards themselves should render whatever, and the developers should have the tools to make their product work with ANY card with ALL the features enabled. Instead NVIDIA has PhysX, and held up development on SLI for years after buying out 3DFX... I don't like NVIDIA very much, and have been an AMD guy for a long time, even before they bought into the GFX development area.
I don't necessarily need these hair features, but it's kind of sad that GPU devs are doing crap like this to take things away from end users (especially the ones who don't know tech very well). NVIDIA, like Intel, is overpriced for what you get; AMD has always offered quality for the price IMO, and I've always been happy with their technology, and I've owned both sides.
I've got my opinions and bias and I'm not afraid to say it. I owned NVIDIA before they were big and even then always had issues with their cards. ATI/AMD has, in my experience, had significantly fewer issues with their hardware. Though I may not have ALL the performance that NVIDIA gets, I'm also left with money in my pocket to put toward other things I want, just like buying an AMD processor over an overpriced Intel processor.
I've been building PCs since before there were 3D accelerator cards, and I've always found AMD to somehow work better in the long run, last longer and just make me feel I made the right choice as a consumer. The iterations of Intel processors I've bought have crapped out far quicker; they've either underperformed or I've had to RMA the chip (I've never RMA'd an AMD in my experience, and I've bought hundreds as a professional PC builder). My bias is prevalent, but it's not without experience. Same with NVIDIA cards: TNT cards had driver issues back in the day, but at the time ATI was hit or miss until AMD bought them out. There have been some driver issues with AMD cards, but they've been fixed relatively quickly, and I'm currently running the beta Catalyst drivers with no issues on any of my games, also using the AMD Turbo function on my 8-core, which boosts the clock speed (watercooling is so awesome). Had it up to 7.2GHz, but I'm unsure how accurate the reporting is... the Core Temp program said it was operating at that speed, so I'm inclined to believe it.

I don't think you know how GPUs work. The reason they are so much more powerful is that they're dedicated to very specific tasks. But that also means that you can't just tell a GPU what to do like you can a CPU. That's the reason why all these GPU developers have special features like PhysX and such. These are things a GPU would not normally be able to do, because it's not designed for that task. So in order to be able to do these things, they need to be designed to do them, even at a hardware level. Which is also why Nvidia cards suck at doing stuff developed by AMD and the other way around.
You can't just "leave it to the developer," because in that case we wouldn't have any of these special features.

And people wonder why I stay behind the graphics card curve... This nonsense is why really, I'd rather not have to upgrade to a certain manufacturer's ultra-awesome-super-amazing-hyper$$ card just to play new releases...

For now I will stick with my GTX 750 and play TF2, Kerbal Space Program, Space Engineers, Dark Souls, Killing Floor and War Thunder... And now back to trolling low level players with the M10 gun carriage from across the map.

lacktheknack:
And TressFX butchered Nvidia cards in Tomb Raider.

NVIDIA could and did go and fix it ...

Eh. This is par for the course.

Ever seen those 'best played with nvidia' things? (The way it's meant to be played or some such thing?)

That sounds like a good thing right?

Well, not really. It means the developer got a cash incentive and technical support from Nvidia.
That's all well and good of course, but Nvidia has a vested interest in ensuring that whatever that game implements kinda breaks AMD cards...

Now, the devs themselves don't actually like that because, obviously, if half the gamers who might buy your game can't play it (the code Nvidia handed you breaks the game on their systems), that's going to hurt you in the long run...

But, guess what? Nvidia likes it of course. Especially since gamers often fail to see what's really happened, and blame AMD for 'broken drivers', 'bad performance' and so on.

Now, that's not to say AMD is innocent. They certainly attempted to pull the same trick often enough (especially back in the ATI days). But they don't seem to have as much money to throw at it, so they often seem to get the short end of the stick.

There's also this: http://www.dsogaming.com/news/ex-nvidia-driver-developer-on-why-every-triple-a-games-ship-broken-multi-gpus/

There's more to that specific rant than the article is quoting, but what it comes down to is this:

Games don't work. They are, most of the time, released in a state that violates basic rules about how graphics cards actually work.
What happens is, Nvidia and AMD identify what game is running, then actively fix every mistake the game is making at the driver level before handing the fixed code to the GPU...

This happens partly by accident, but part of it is that DirectX and OpenGL are so abstract and convoluted it's almost impossible to code something that actually works correctly from the point of view of the actual 3d rendering hardware.

Which means... The drivers contain millions of lines of convoluted code to deal with a million different edge cases of things that just don't work right, and jiggling all the game code around into a state that's actually capable of running...

No wonder the open-source AMD linux drivers are so bad...
Driver writing is clearly not as simple as it sounds, and in fact sounds like the worst kind of convoluted nightmare you could possibly face as a programmer;
Messy, inelegant code, full of special cases, multiple code paths, conditional code execution and all other kinds of seriously nasty stuff...

It's a small miracle any of it works at all if it's that bad...
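Purely as an illustration of what that per-game special-casing can look like, here's a toy sketch in Python (the profile names and workaround labels are made up; real drivers do this in native code inside the user-mode driver, not like this):

```python
# Hypothetical table of per-game workarounds, keyed on the executable name.
# The entries below are invented for illustration only.
GAME_PROFILES = {
    "witcher3.exe":   ["clamp_tessellation_factor", "reorder_resource_barriers"],
    "tombraider.exe": ["patch_compute_dispatch", "skip_redundant_state_changes"],
}

def select_workarounds(process_name: str) -> list[str]:
    """Return the driver-side fixes to apply for a given game executable."""
    return GAME_PROFILES.get(process_name.lower(), [])

# The driver identifies the running game and quietly reroutes or rewrites the
# problematic API usage before anything reaches the GPU.
print(select_workarounds("Witcher3.exe"))
# ['clamp_tessellation_factor', 'reorder_resource_barriers']
```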

Steven Bogos:
...the difference between 30 and 60 FPS is like night and day.

I do, however, agree that anything above 60 FPS isn't really necessary.

It's necessary for VR. Essential, even. 60fps is a bare minimum. As in, letting the framerate drop below 60fps, ever, is not a good idea.
But... VR is an edge case.

Sure, framerates (or rather, input-display latency, to get at the heart of the issue) in VR are not just nice to have but critical to avoiding people getting sick while playing. As should be clear, though, VR is very much a niche thing (for now, anyway; who can predict the future).

This article and conversation are exactly why I was exclusively a console gamer for most of my life, and primarily one now. Though this latest generation's shenanigans have almost pushed me off of console gaming entirely. One or two more missteps, and I'm OUT.

ShakerSilver:
There are A LOT of details missing in this story, which makes it look like AMD is just name-calling.

That's the impression I'm getting too; AMD is just coming off as rather sulky here. And my household has one Nvidia machine and one AMD, so I'm not in a position to be judgemental; I use both for different things. But come on...

AMD throwing a strop again... This excuse is weak and old. Even if it has some basis in truth, should developers and consumers have to be limited by hardware with a minority share?

I'm no fanboy. I want AMD to be better; the PC community needs them to be better. But the focus on low- to mid-range hardware and APUs has been coming back to bite them, and now they can barely afford any R&D. The 300 series is just a re-badging of existing chips, with HBM memory slapped on in all but one of the cards - a chip first seen in 2013...

That says it all really, they have always been flaky with the drivers too.

Wait, we are talking about a 29% FPS drop because of hair?

Ridiculously Epic Fail, I'd say.

BloodRed Pixel:
Wait, we are talking about a 29% FPS drop because of hair?

Ridiculously Epic Fail, I'd say.

And it's not even that big a deal. Geralt's hair looks amazing even without it. And it's not like you'll spend a lot of time looking at other people's hair to justify the drop in frame rate on any graphics card ever. It's useless eye candy.

vallorn:
And people wonder why I stay behind the graphics card curve... This nonsense is why really, I'd rather not have to upgrade to a certain manufacturer's ultra-awesome-super-amazing-hyper$$ card just to play new releases...

For now I will stick with my GTX 750 and play TF2, Kerbal Space Program, Space Engineers, Dark Souls, Killing Floor and War Thunder... And now back to trolling low level players with the M10 gun carriage from across the map.

Yep, this is one reason why I'm still going strong on a GTX 465 and will probably jump to around your card's generation (or go with AMD if that camp is looking good at the time) in maybe a year. Nothing I really want to play has my 465 screaming in pain anyways.

I'll just let the two professional corporations make the 10-year-olds fighting the early '90s console wars look like the mature group.

Adam Jensen:
It's useless eye candy.

But, it's the FUTURE! /sarcasm

