AMD, Nvidia Slap-Fight Over DirectX 11


thiosk:
It's adorable that ATI is trying to smear Nvidia over PhysX being an unpopular format while simultaneously suggesting how important it is to be running DX11 RIGHT NOW.

PhysX being unpopular is entirely AGEIA/Nvidia's fault, for going from ridiculously proprietary to somewhat less proprietary. If you want PhysX in more games, then make PhysX more widely implementable. At least with DX11, anyone who wants to make a card, or a game, that is compatible with it, can. AGEIA made their silly "this card is way better than a graphics card, this couldn't possibly be done without a dedicated card" attempt, and failed, and Nvidia picked them up and said "we managed to implement their fancy stuff in CUDA, but it couldn't possibly be done otherwise." There's nothing so special about CUDA that ATI couldn't implement PhysX directly into their setups, but Nvidia is grasping tightly to what they perceive as a competitive advantage, all the while shooting PhysX in the foot.

I think the biggest problem is that Nvidia is trying to profit off PhysX in too many ways: as a licensor of the physics engine to game developers, and as a bullet point on their graphics cards. It only makes sense for game devs if a broad enough portion of the market can take advantage of it, and it only makes sense for buyers of video cards if enough games take advantage of it. If they'd pick one (and I think the licensing end is probably more easily profitable), they'd be far better off.

AMD/ATI taking advantage of Nvidia's GT300 yield problems, and using it to differentiate themselves from Nvidia? That just makes sense to me. I don't think it matters all that much, but ATI has the single-chip performance crown for the moment, and they've got a broader feature set in terms of DirectX support. Wouldn't you be crowing about that, if you were in their shoes? I would also point out that unlike the 10-10.1 conversion, which Nvidia chose to stay out of (because it obviously wasn't worth bringing out a whole new chip to support), their current posturing on DX11 is entirely because they've got issues with their new cards. They're not choosing to remain with their current chip longer because DX11 isn't worth it. If anyone's the hypocrite, it's them.

EDIT: Disclaimer - Everything I just said was not sourced; this is just my understanding of the AGEIA/PhysX/Nvidia/CUDA situation based on articles I've read over the last few years. My interest in hardware minutiae fluctuates based on chronological proximity to a new computer purchase, so everything above, if it doesn't sound right to you, could probably use some fact-checking.

Geoffrey42:

[quote snipped]

Sounds quite on the money, but DX11 doesn't mean anything until Windows 7 gets released, since DX11 will be added to Vista via a Windows Update. So Nvidia still has until the end of next month to pull their finger out instead of talking rubbish.

PhysX is a tricky one, as the SDK supports software physics (and is thus available on PC, Wii, PS3, and 360, and is the default physics of the Unreal engine), but it also supports hardware-accelerated physics on their 3D cards.
PhysX won't last, but the experience will have been extremely valuable even if CUDA isn't the programming language used to write it, plus they must have made some money off it.

The 5870 is a nice card, but at £300 it's a little pricey compared to their own 4890 (half the price), considering the games for 2010 are:

BioShock 2
StarCraft II: Wings of Liberty
Diablo 3
Guild Wars 2
Assassin's Creed 2
Left 4 Dead 2

Basically, nothing that's going to challenge today's £100 to £150 graphics cards. The 5000 series and Nvidia 300 series are dead in the water, IMO.

[H]ard OCP
Wolfenstein Gameplay Performance and IQ Article

State of the Game

Wolfenstein comes to us as yet another entry in a growing catalog of games which bear distinct signs of being console-focused titles. The lightweight graphics, memory, and processor requirements, the nonfunctional anti-aliasing, the clumsy menu system, and the mere handful of customizable graphics options all combine to show us that this chapter of B.J.'s World War II exploits was geared for the console and only adapted for the PC. While it can be argued that it makes sense from a business perspective for developers to focus on console development first, it does leave PC gamers wondering what exactly is next for us.

Further, it throws into sharp relief the dubious value of escalating the GPU arms race yet again. We are rapidly approaching the unveiling and launching of new silicon from both AMD and NVIDIA, and we are forced to wonder exactly what the point is. If a $200 video card will play so many new games (Wolfenstein, Call of Juarez: Bound in Blood, Ghostbusters, Demigod) so brilliantly on a 30" monitor with the highest resolution and graphical fidelity right now, and if this trend is going to continue (which it seems to be), why do we need to invest in new graphics cards with the frequency at which NVIDIA and AMD would have us believe we should?

Without a doubt, more demanding technology showcase games like Crysis and Arma II will continue to be developed. But does it really make sense in the current economy to invest in yet more graphics horsepower for such a small clutch of tech-heavy games among such a wealth of mediocrity?

http://www.hardocp.com/article/2009/09/01/wolfenstein_gameplay_performance_iq/10

Hell, I can't even see the need to upgrade my 9800GTX, and that's from April 2008, when it was released.

DX10 was a failed experiment, as was Vista. DX11 is a performance boost, not a performance hit like 10 was, and Windows 7 will be worth it.

Mmm, can't wait for these tech demo games! After all, it seems graphics are everything these days.

Since they make games for the consoles first (that's where the money is), advances in tech will be slow until the new consoles are released. That may be three years away.

