Nvidia's Dual-GPU Titan Z Video Card is Now Available for $2,999


So who exactly is this supposed to be for?

Waylon Smithers:
a healthy mix of the rich and the ignorant

And who is that?!? Those who lack the ability to set up an SLI bridge and are willing to pay double to have it done for them?

Captcha: It's super effective
Quiet, Captcha... you are NOT helping.

Edit: Seriously. I just ran through Risen 2 in triple-wide, while running Winamp MilkDrop visualization in desktop mode, on a non-linked single 1 GB 650 plus a dedicated Ion card for PhysX, and never dropped below 60 fps.

Does an ePeen that insecure even exist?

Also... for "industry" would they not go with Work station cards such as the quadro line long before looking at this?

Wowie! A pointless piece of PC garbage! What a good way to spend $3000. Buy a GPU that is absolutely unnecessary for 100% of games, and won't be necessary for at least the next 10 years.

No, but sarcasm aside, I know some loaded morons are going to buy this card, despite it being aimed at very large industries. Even then, it's not even really that important for industry either.

Three slots? Sounds like one beast of a heat sink. Why not just throw a water block in the box, too? Knowing the target market for cards like this, I'd say that 90% are gonna rip the fan and heat sink off and integrate it into their loop or make a new loop just for the GPUs as soon as they find a water block that fits this beast.

Anyone hearing Also Sprach Zarathustra when looking at the picture?

Captcha: flux capacitor. Yes, this thing doubles as a DeLorean time-travel processor.

$3000 for this is a pretty awesome deal if you have a good use for its primary strengths: compute number-crunching, CUDA and rendering.

Dexterity:
Wowie! A pointless piece of PC garbage! What a good way to spend $3000. Buy a GPU that is absolutely unnecessary for 100% of games, and won't be necessary for at least the next 10 years.

No, but sarcasm aside, I know some loaded morons are going to buy this card, despite it being aimed at very large industries. Even then, it's not even really that important for industry either.

I think you need to learn what this card is for. Nvidia is advertising it as both a compute/CUDA and gaming card, but they are only throwing "gaming" in there to spread market awareness faster.

EndlessSporadic:
You guys are bashing this card rather hard considering it wasn't targeted at you. Can we calm down a bit, stop the poop-slinging, and consider for a moment that not every video card known to man is targeted at you? These cards are targeted at game companies who want to do fancy displays at conventions. This card isn't worth it for you, but it is for the developers who run dual-monitor 4K displays with all settings maxed on the expo floor.

And yet, that won't stop people who show off their PC the way other people show off fast foreign cars from getting them. You're right though, this isn't aimed at gamers; it's aimed at studios, both movie and gaming.

>.>
<.<

Still not going to stop me from wondering what anyone who's not working for a studio of some flavor, but buys one of these anyway, is trying to compensate for.

Dexterity:
Even then, it's not even really that important for industry either.

It is. When you have a task that can run into hundreds or thousands of dollars per day (depending on the size of the team) and that can take 250 hours to render each individual frame, I am sure you can see why a $3000 investment is necessary and will more than pay for itself (rough numbers sketched below). The Titan Z helps even more with its physics processing, which enables the kind of simulations we see in modern 3D effects, and that's on top of the rendering itself.

Then you have medical imaging and the more heavily intensive AutoCAD applications, which can benefit from it. Rendering a full-body scan in coloured, beyond-HD 3D that can be manipulated, moved and zoomed through by a technician or doctor would cripple many graphics processors, and so would full blueprint drafts in AutoCAD. A $3000 GPU that can handle all of that at those speeds is actually great value compared to the $12,000 solutions from a few years back.
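To put a rough shape on the "pays for itself" point, here is a back-of-the-envelope sketch. The $3000 card price and the 250-hour frame time come from the post above; every other number is an assumption made up purely for illustration, so treat it as a sketch rather than a real benchmark.

#include <cstdio>

int main() {
    // Figures quoted in the post above.
    const double cardCost      = 3000.0;   // Titan Z price in dollars
    const double hoursPerFrame = 250.0;    // render time for one frame

    // Assumed, illustrative numbers -- not real measurements.
    const double costPerHour = 2.0;        // assumed cost of one render-node hour (power, amortisation, staff)
    const double timeSaved   = 0.25;       // assumed fraction of render time the card shaves off

    const double savedPerFrame = hoursPerFrame * costPerHour * timeSaved;  // $125 with these numbers
    printf("Saved per frame: $%.0f\n", savedPerFrame);
    printf("Frames until the card pays for itself: %.0f\n", cardCost / savedPerFrame);  // 24 frames
    return 0;
}

With those made-up numbers the card breaks even after roughly two dozen frames; change the assumed hourly cost or speed-up and the break-even point moves, but the overall argument stays the same.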

Aaron Sylvester:
but they are only throwing "gaming" in there to spread market awareness faster.

If you ask me, they only added that in to try and pee on AMD's parade with the R9 295X2 by shouting "we have the most powerful dual-GPU card on the market". I might be wrong, but I swear this "4K gaming" idea wasn't present before the launch of that card.

Saulkar:

Jingle Fett:

Normally you'd be right (about the separate 6 GB pools), but in this case I believe it does mean 12 GB! This is thanks to the new Pascal architecture's unified memory. Unless I'm mistaken (and admittedly I could be), unified memory means the two 6 GB Titans inside the Titan Z are able to share the memory, along with your computer's RAM.
Meaning, with non-Pascal dual cards, if you have two cards with 6 GB each you still only have 6 GB. But with unified memory it all gets pooled together and you get 12 GB instead.
I think this is why the Titan Z is more expensive than buying two separate Titans, and I'm pretty excited (I do VFX/3D work too).

http://devblogs.nvidia.com/parallelforall/unified-memory-in-cuda-6/

I will admit that I did not understand all of what I read in that link, but most of it came off as unifying the shared data and paths between CPU and GPU RAM, not the separate pools of memory on a single dual-GPU card. Still an interesting read, though.

Most of the sources I looked at state that the memory is split down the middle for each GPU.
http://hothardware.com/Reviews/NVIDIA-Outs-Pascal-GPU-Titan-Z-NV-Link-3D-Memory-Tegra-Updates-at-GTC/?page=2
https://en.wikipedia.org/wiki/GeForce_700_Series
http://www.guru3d.com/news_story/geforce_titan_z_has_12gb_graphics_memory.html
http://www.game-debate.com/hardware/?gid=2105&graphics=GeForce%20GTX%20Titan%20Z

Would be foaming at the mouth and robbing grannies and their cats on the streets if this card turns out to have unified memory.

Haha, that's my reaction too lol. Looking some more, I might have jumped the gun a bit and was partly seeing what I wanted to see. I'm pretty sure unified memory works the way I think it does, but the mistake I made is that the Titan Z isn't Pascal after all :( (which in my defense, almost every article out there is talking about the new Pascal GPU and the Titan Z in the same sentence lol). On the plus side though, we all know it's just a matter of time before Nvidia makes a Pascal-based Titan :D

I didn't understand a lot of the stuff in the link I posted either, but I could have sworn I'd read somewhere else that unified memory would work the way I was saying. The part that jumped out at me was the section called "Example: Eliminating Deep Copies".
If the memory is separate, then the computer needs to take, say, your 3D model that's in your RAM and copy it over to the GPU's RAM (so you only get 6 GB total, because the asset needs to be copied to each GPU), and if the original is larger than those 6 GB then you're stuck... but with unified memory the GPU is sharing its memory with the computer, so you only need one copy, not one copy for each GPU, and I'm guessing that applies no matter how many GPUs you have.

So THEORETICALLY at least, that's what I think we'll be getting with Pascal. Now I'm just bummed out we have to wait lol
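For anyone curious, here's roughly what the "Eliminating Deep Copies" section of that blog post boils down to, at least as I read it. This is only a sketch: the kernel and sizes are made up, error checking is left out, and I haven't run it. The idea is that with plain cudaMalloc you shuttle the asset to the GPU and back by hand, while with cudaMallocManaged (new in CUDA 6) the CPU and GPU share a single pointer and the driver moves the data for you.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Made-up kernel standing in for "do something to the asset on the GPU".
__global__ void scaleWeights(float *data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;                    // pretend this is our 3D model data
    const size_t bytes = n * sizeof(float);

    // The old way: one copy in host RAM, one copy in GPU RAM, explicit deep copies.
    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;
    float *dev = NULL;
    cudaMalloc((void **)&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);    // deep copy #1
    scaleWeights<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);    // deep copy #2
    cudaFree(dev);
    free(host);

    // CUDA 6 unified memory: one managed allocation, same pointer on CPU and GPU.
    float *shared = NULL;
    cudaMallocManaged((void **)&shared, bytes);
    for (int i = 0; i < n; ++i) shared[i] = 1.0f;            // CPU writes it directly
    scaleWeights<<<(n + 255) / 256, 256>>>(shared, n, 2.0f); // GPU uses the same pointer
    cudaDeviceSynchronize();                                 // wait before the CPU reads it again
    printf("shared[0] = %f\n", shared[0]);
    cudaFree(shared);
    return 0;
}

Worth stressing, and this matches what Saulkar dug up: unified memory unifies addressing between the CPU and a GPU; it doesn't pool the two 6 GB halves of a dual-GPU card into one 12 GB block.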

If you dish out $3,000 for a video card, I don't think it should ever be allowed to become obsolete.

Good Lord. So you can play every game at 120 FPS for the next five years. And you can play it at 4K, and all the textures will look like crap, because with a few exceptions (Crytek will probably love it) companies aren't going to optimize their games for 4K while the current generation (previously called the next generation) of video game consoles is struggling with 1080p.

Don't get me wrong, if someone wanted to give me one of these, I wouldn't turn my nose up at it. And then my power supply would no doubt look at me, laugh hysterically, and then explode in a shower of turquoise and amber flames. But unless you *are* doing some kind of Pixar-level 3D rendering work, there's really no way this is much more than a grotesque bit of conspicuous consumption.

So, twice the power for thrice the price? No thanks.

DrunkOnEstus:

Also, 12GB of VRAM? The fuck?! This GPU has 50% more RAM in it than the PS4 has available for everything, and Watch Dogs is supposed to be pushing the envelope (if only because of poor optimization) by requiring 3GB for Ultra textures. Everything else about this card will become "obsolete" by the time its 12GB of VRAM is considered worthwhile.

The difference is, the PS4 fails to even do 1080p properly, while on PC the people who can afford this card can afford setups where they run Eyefinity across three 4K screens. Three 3840x2160 panels is about 24.9 million pixels versus 1080p's 2.1 million, i.e. 12 times as many, so a game that would need 1 GB of memory for its frame data on the PS4 could need around 12 GB here.
Because yes, PC gamers can be that far ahead. Now, the average PC gamer is going to run a setup that isn't even twice the power of a PS4, of course, but they aren't going to run this card either; they'll use something like a $200 760.

Slegiar Dryke:
-Designed for 4K gaming
Yep, I'll never even consider one then. I still play too many older games that extreme resolution would wreck. Barring the price as well, yeeeeaaahh......

Um, what? Old games look glorious at large resolutions! Have you seen Zelda in 4K? It almost looks modern.

Charcharo:
Really?
TotalBiscuit has problems with SLI Titans...

I myself have problems on a GTX 760, which, even if weaker, is ENOUGH for Metro Last Light and Lost Alpha, games that are prettier/more advanced than Watch Dogs.

He doesn't have problems with SLI Titans; he has problems with badly optimized games that don't support SLI. There is no game out there that can't be run on ultra on SLI Titans if SLI is supported.

Tell me more about 760 performance with Watch Dogs though; I also have a 760 and want to play it.

Azure Knight-Zeo:
Who is this for?

This is for enthusiasts who need a lot of video memory to run MASSIVE resolutions and have enough money to afford it. It is also for companies that do a lot of graphical processing, as this turns out to be better at it AND cheaper than specific industrial GPUs.
A regular gamer will not buy this card; they're not the target audience.

viranimus:

Edit: Seriously. I just ran through Risen 2 in triple-wide, while running Winamp MilkDrop visualization in desktop mode, on a non-linked single 1 GB 650 plus a dedicated Ion card for PhysX, and never dropped below 60 fps.

Also... for "industry" would they not go with Work station cards such as the quadro line long before looking at this?

I bolded the part that shows why your experiment is irrelevant: heck, a 260 runs that game on ultra.

And for industry, it turns out this card has a better price/power ratio than workstation cards.

Strazdas:
He doesn't have problems with SLI Titans; he has problems with badly optimized games that don't support SLI. There is no game out there that can't be run on ultra on SLI Titans if SLI is supported.

Tell me more about 760 performance with Watch Dogs though; I also have a 760 and want to play it.


Bad optimization is EXACTLY what I was talking about.
Watch Dogs is badly optimized, and even ONE Titan should be more than enough.

As for 760 performance:
It is good. Plays on Ultra with HBAO+ and SMAA (fuck TXAA; TotalBiscuit does not know his tech). The problem really is stuttering, and it happens on all settings.
It also happens on my weaker 5770, where I use medium settings (again, considering how the game looks, it should be higher).

I do hope it's just some stupid bug/driver issue and it gets fixed.

Charcharo:
As for 760 performance:
It is good. Plays on Ultra with HBAO+ and SMAA (fuck TXAA; TotalBiscuit does not know his tech). The problem really is stuttering, and it happens on all settings.
It also happens on my weaker 5770, where I use medium settings (again, considering how the game looks, it should be higher).

I do hope it's just some stupid bug/driver issue and it gets fixed.

Ah, if it plays on ultra, it's all good then. Anti-aliasing is not mandatory for me; it's just icing on the cake.

Yeah, driver updates and game patches will try to patch some things up; no idea if they will fix the stuttering, though. They never did in Mafia II, for example, and even on a 760, where you can easily power through the stuttering out of sheer GPU power (it's very visible on the recommended GPUs), it's still kind of annoying, especially when driving at high speed.

Looks like more than a few people here didn't bother to read who this is targeted at: scientists and researchers, plus gaming enthusiasts, particularly those looking at small-form-factor cases. Give it 5 years and it will be half the price, 7 and it will be less than that; how will consoles compare then? Still wringing their hands, praying for the magical software fix to save them from the jaggies and stutter? Or maybe waiting for Microsoft and Sony to start allowing corner-cutting to mask poor performance, just like consoles last gen?

Strazdas:

Yeah, driver updates and game patches will try to patch some things up; no idea if they will fix the stuttering, though. They never did in Mafia II, for example, and even on a 760, where you can easily power through the stuttering out of sheer GPU power (it's very visible on the recommended GPUs), it's still kind of annoying, especially when driving at high speed.

Mafia II by default enabled advanced cloth physics on every character and background NPC in view of the player. It looked amazing when you noticed it, but it was bringing powerhouse systems to their knees. There was a tweak you could do in the .ini file to disable the cloth physics on all the women's dresses; the effect was very subtle anyway, so there was not really a huge difference with it disabled.

..... Fuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuck.

That's all I can say. What a monster. If only I could blow $3k into the wind, I'd pick one up along with a 4K monitor. Would never see sunlight again.

So anyway my birthday's coming up...

Well I'll never have one...and rightly so because I'd probably take it out of the case every night just to sleep beside it.

(3 grand, y'know... gotta get my money's worth)

Strazdas:

Um, what? Old games look glorious at large resolutions! Have you seen Zelda in 4K? It almost looks modern.

Uhh, yeah, dunno what you've been playing it on, but A: I don't have anything 4K-capable, B: last time I tried playing Skyward Sword it took some fighting with my TV so it didn't artifact around the textures on a 1080p TV, and C: I don't particularly care if it looks "modern", because "modern", imho, looks like crap most of the time. And yes, I am massively nostalgic, so color me subjectively biased ^^;;

Slegiar Dryke:

Strazdas:

Um, what? Old games look glorious at large resolutions! Have you seen Zelda in 4K? It almost looks modern.

Uhh, yeah, dunno what you've been playing it on, but A: I don't have anything 4K-capable, B: last time I tried playing Skyward Sword it took some fighting with my TV so it didn't artifact around the textures on a 1080p TV, and C: I don't particularly care if it looks "modern", because "modern", imho, looks like crap most of the time. And yes, I am massively nostalgic, so color me subjectively biased ^^;;

A 4K TV and an emulator does that. The point wasn't whether you are 4K-capable or not; the point was that increasing resolution does not make games look any worse, but rather the opposite. Not sure what kind of artifacting around the textures you saw; I never saw anything like that.
The point wasn't "modern" or anything; the point was that a large resolution on older games does not make them look worse, but in fact makes them look better.

Here's a couple of screenshots of Wind Waker in almost 4K:
http://i1.minus.com/ibb6jzr1xwloqe.png
http://i7.minus.com/igi9rAXi0WpIp.png

P.S. Personally I don't own a 4K TV/monitor either, but I am lucky enough to know someone who does and can compare it to my regular 1080p (because I'm poor :( )

Strazdas:

A 4K TV and an emulator does that. The point wasn't whether you are 4K-capable or not; the point was that increasing resolution does not make games look any worse, but rather the opposite. Not sure what kind of artifacting around the textures you saw; I never saw anything like that.
The point wasn't "modern" or anything; the point was that a large resolution on older games does not make them look worse, but in fact makes them look better.

Here's a couple of screenshots of Wind Waker in almost 4K:
http://i1.minus.com/ibb6jzr1xwloqe.png
http://i7.minus.com/igi9rAXi0WpIp.png

P.S. Personally I don't own a 4K TV/monitor either, but I am lucky enough to know someone who does and can compare it to my regular 1080p (because I'm poor :( )

If I remember correctly, at particular settings on the TV, there was tearing around the edges of objects, like a rotating rupee leaving a static trail in the wake of its rotating edge, and overall it just didn't look as good. Think I solved it partially by almost rock-bottoming the sharpness and fiddling with a few other things.

And... I'll be honest, I can't see any difference between those pictures and GameCube level =/ Granted, maybe in part it's something you actually have to see on a screen capable of such... but I'd imagine it would still provide a noticeable increase in picture quality, and I can't see it. Thanks for trying to show me though =)

Slegiar Dryke:

If I remember correctly, at particular settings on the TV, there was tearing around the edges of objects, like a rotating rupee leaving a static trail in the wake of its rotating edge, and overall it just didn't look as good. Think I solved it partially by almost rock-bottoming the sharpness and fiddling with a few other things.

And... I'll be honest, I can't see any difference between those pictures and GameCube level =/ Granted, maybe in part it's something you actually have to see on a screen capable of such... but I'd imagine it would still provide a noticeable increase in picture quality, and I can't see it. Thanks for trying to show me though =)

Such an artifact is called ghosting, and it is caused by your TV trying to guess where the image is going (which is why low-framerate games look better on high-framerate TVs: the TV tries to guess where things are going and you actually see the in-between frames as interpreted by the TV). When that goes badly, you see ghosting from the TV overshooting or undershooting its guess for the next frame. You can see this if your monitor has response-time options: turn it to "high" and try scrolling, and all the letters will start ghosting as the monitor guesses their position and overdoes or underdoes it. For the best gaming experience it is advised to always turn off such guessing, but not all sets allow that. Setting it to "Average" seems about right for my monitor personally, but of course each person's TV/monitor may act differently.
This is not caused by the resolution, but by the TV and its (bad) (default) settings.

Yeah, seeing it in action is a whole different game from seeing screenshots. And yeah, zoom in to the "original" quality, because your browser is likely compressing it down if your monitor isn't 4K, to see how it would actually look. Sadly I cannot show any video: for one, I do not know where to find one, and for two, you're not going to find uncompressed video on the internet, and compressed video really does not do graphics justice (which is why, for example, a 1080p game in reality looks vastly better than on YouTube).

I see people here going back and forth about whether the Titan Z is targeted at gamers or not. Well, spec-wise and feature-wise, it completely shouldn't be... BUT it is in the "GeForce" line of cards, which is completely gamer-focused. So no, it doesn't make sense for it to be aimed at gamers. There's one class of gamers it is aimed at: the ones with money to spend on bragging rights. Otherwise it doesn't make sense.
