ATI CARD ROUNDUP

A comic that fits your budget.

They're cost-effective as well. Just leave your PC case open and it'll double as a space heater!

TopazFusion:
Oh wow, this won't rustle people's jimmies at all!

Also, I bet you were tempted to drop another penis joke into this strip somewhere.

They did, it's just too small to see with the naked eye. Hurr hurr.

In addition to that, their CPUs can heat up your house on those cold winter nights.

EDIT: I just realized you called them ATI. Why not AMD?

HEY!

nah, I lol'd. At least consoles have less performance.

Meh, in the budget-to-medium range (£100 to £200), the manufacturer scarcely matters. Just go to some gamer site and buy whatever has the longest bars on the chart.

That's a burn you can normally only get from a card that's "meant" to be running at 95 degrees Celsius!

Meanwhile, us console peasants have no fucking idea what you're talking about.

My 270X runs about 160-170F when playing Skyrim with the hi-res texture packs.
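For anyone comparing that to the Celsius figures being thrown around elsewhere in the thread, a rough conversion: (160 − 32) × 5/9 ≈ 71 °C and (170 − 32) × 5/9 ≈ 77 °C, so comfortably under the 90-95 °C range being joked about.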

Sure, it's saved me about $50 a month on home heating costs, but it plays every game in my Steam library on the highest settings with no problems at all.

Then again, I'm not a 360noscoping CoD-Strike FPSer. So it's not like I need 240 frames-per-second just to remain competitive.

Hey, I live in a cold country and happen to like having two graphics cards running at 90°C. I don't have to run any additional heating in my home; I even keep a door or window open to keep the room temperature down ^_^

You know, I don't think I'm ever going to be able to get into custom-built PCs. Too much hassle, and I am far too lazy. The day I finally trade in my Mac, I'll just buy a laptop with decent specs.

Ouch, that hurts. I really do like AMD CPUs over Intel, for the sole fact that for the price of a low-end Intel chip I can get a top-of-the-line chip from AMD. As far as GPUs go, though, I haven't taken the time to learn about AMD's, so I stick with what I know.

Equal opportunity hate please.

bahaha. Give it a few years and it'll all reverse again.
Just like it has been doing on and off for the last 15 years or so...

Doom972:
In addition to that, their CPUs can heat up your house on those cold winter nights.

EDIT: I just realized you called them ATI. Why not AMD?

I've had ATI cards in my system so long I call them that out of force of habit. Just feels absurdly weird calling them anything else.

That heat stuff is also unbelievably ironic, and again shows how much things change.

I still have a 5770 in use. At the time of its release it had the lowest idle temperature of any graphics card in its class, and the lowest temperature-to-performance ratio (and best cost-to-performance ratio) of anything that existed at the time.

Of course, if you go back even further you find a time when ATI cards were so terrible nobody wanted them anywhere near their computers...

There were also several years running (from about 2006) when the highest-performance cards were always ATI's, and NVIDIA struggled to win out over them...

Things just go round and round in circles.

Though I guess nothing quite matches a Titan in terms of price tag, so if that's something to brag about... Sure. XD

erttheking:
You know, I don't think I'm ever going to be able to get into custom-built PCs. Too much hassle, and I am far too lazy. The day I finally trade in my Mac, I'll just buy a laptop with decent specs.

These days, putting the hardware together is the easiest part. Users don't have to worry about IRQs or setting DIP switches anymore. You'll likely spend far more time picking out the hardware you want and need than it will take to assemble it.

The biggest part of building a custom PC is installing the OS and making sure you have all the drivers loaded. My last PC took 1-2 hours for the hardware and 12 hours for the software (not including reinstalling my Steam library, which is still ongoing).

If you're really lucky, you might even get correctly-working drivers!
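One quick sanity check, once everything boots, is to ask Windows what driver the card is actually using. Something like this little Python sketch works, assuming a Windows box with Python installed (wmic ships with Windows, and the property names below are standard WMI ones):

import subprocess

# Ask WMI for each video controller's reported name and driver version.
result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
    stdout=subprocess.PIPE,
    universal_newlines=True,
)
print(result.stdout)

If the name comes back as a generic "Microsoft Basic Display Adapter" rather than your actual card, the vendor driver typically hasn't loaded yet.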

P.S. Thanks

Looks like I've been relying too heavily on my friend from IBM for computer-related stuff. All the recent jokes have been flying way over my head.

erttheking:
You know, I don't think I'm ever going to be able to get into custom-built PCs. Too much hassle, and I am far too lazy. The day I finally trade in my Mac, I'll just buy a laptop with decent specs.

Buy a desktop with decent specs. That's what I did. You usually get a lot more bang for your buck with desktops and you can upgrade the graphics card 2-3 years down the line.

"Oh! I just realized! This month I don't have enough money to support your Patreon." - ATI card users

CaitSeith:
"Oh! I just realized! This month I don't have enough money to support your Patreon." - ATI card users

Maybe the goal is to sink the Critical Miss ship by burning all readers bit by bit in order to elevate readership of their Patreon comic? Wouldn't that be a hilariously devious bit of PR?

Joking aside, everyone gets the burn when it comes to Critical Miss. At least they're equal opportunity burners. I'm sure the Patreon supporters know that this is how they operate. It has just been a bit much lately, but Critical Miss goes through waves of stuff like this.

As a poor person who uses a Radeon card (a really old one that sorely needs to be replaced but I'm too poor to do so), I'm offended!

And this is in no way related to the fact that I don't have a five-foot-long twin-shafted dongle.

shrekfan246:
As a poor person who uses a Radeon card (a really old one that sorely needs to be replaced but I'm too poor to do so), I'm offended!

And this is in no way related to the fact that I don't have a five-foot-long twin-shafted dongle.

I'm with you on that one, on both counts.

Oh, to have a flashy Nvidia card and a five-foot-long twin-shafted dongle... a man can dream.

I loved it :D

I've used both brands in the past, but these days I don't trust Nvidia as far as I can throw them. That 970 fiasco was hilarious, and the fact that G-Sync is proprietary just rubs me the wrong way. Since AMD has fixed their frame latency issue, I see no reason to use Nvidia anymore. I'm getting a FreeSync monitor soon too!

You either admit you spent a shitload of money on fancy graphics, or admit that you can't afford to.

...

I spent a shitload of money. The fact that I gave it to Nvidia is new, however; my last few flagship cards were AMD, back when they were still called ATI. In fact, the 5850 in my old machine was probably one of the last cards to still bear the ATI branding, before AMD retired the name entirely.

Oh my. I could feel the smug in this one.

And I loved it. I'm off to enjoy my GTX980 and this literal monster-cock of mine.

Nvidia makes better cards. They have way shittier business practices. Any software that AMD designs is pretty much open source. So I'll be waiting to see if they come out with something new.

WouldYouKindly:
Nvidia makes better cards. They have way shittier business practices. Any software that AMD designs is pretty much open source. So I'll be waiting to see if they come out with something new.

Sometimes... But one thing I learnt from the dev side of things (well, two things, actually) is that ATI, and now AMD, struggle to write effective drivers.

And, crucially, even when ATI has had technically better hardware, the fact that it isn't identical to Nvidia's at a low level has hurt them.

Remember games with that "The Way It's Meant to Be Played" nonsense? That's Nvidia basically bribing game devs (with free hardware, in-depth technical assistance, and sometimes even cash) to put in a lot of work optimising their games specifically for the way Nvidia cards work, and to include proprietary, non-standard features that ATI couldn't match, not for technical reasons, but purely for legal ones.

Both companies tried to do this, but Nvidia managed it way more often.
The end result: large numbers of games optimised for the specific quirks of Nvidia cards, which then ran faster on Nvidia than on ATI, not because of better hardware, but because of better optimisation (which Nvidia paid for).

Some things were also bad predictions. The X1000-series cards had too little texture fill performance and too much shader performance, because ATI expected shader-heavy games to be the future trend. That never happened within the lifetime of those cards, so you ended up with something that was weak in the most important capability for games of that era, and strong in something that wasn't used much...

Still, they've always had better luck in the midrange. My 5770 was absurdly powerful for what it cost. It easily matched anything from the generation before it, even though it was only a midrange card...

Ha! Shows what you know. I'm very happy with the 4 GB of VRAM I got for a mere £200. You're probably looking at almost half that again for just 3.5 GB going Nvidia. Oh, I'm sorry, 4 GB. That's what it says on the box, right?

And all AMD asks in return is for me to put up with a little intermittent screen flicker at 60Hz.

Now ask yourself, who's really winning here?

Hi, I just spent a shitload of money.

Just ordered parts and am waiting for them to arrive; I'll see how well I spent my money once they show up. From the research I did, I went AMD for the GPU (a Radeon R9 295X2, liquid-cooled to keep it from melting into a pool of slag) and Intel for the processor (an i7-5930K). I'm normally the opposite of this - my current rig is an AMD Phenom and a 9800 GT - but I consider myself an equal-opportunity parts shopper, and everything I've read puts the 295X2 ahead of GeForce. The Titan series is marketed as a thinking man's video processor, more of a supercomputer in a slot than a gaming device, so while it's still advertised for gaming, that's not its sole purpose. Even with the Titan Z's dual-GPU setup, the X2 outpaces it, and at a fraction of the cost. Maybe a dual Titan X setup would... but again, that's $2k just for the graphics.

So, I can't wait to put the new machine together. I've got a few games that are simply unplayable on a 9800 GT and others that I'd like to see at max settings. Next week is going to be interesting.

As a poverty-line PC gamer, I approve of this comic.

Well, that, and as I tell people who brag... my AMD/ATI machine can match their Intel/Nvidia machine in performance benchmarks and cost me like £500 less.

I think I went through five different cards before I just switched to Nvidia, and I've never looked back.

Seriously though, all five cards had the same problem: after about a year or so, the cooling fan started to come loose and the card overheated constantly. I don't really care whether this is a common problem for others or not; it's my experience and it sucked. That sound the fan makes when it's about to come loose... it's bad, and you know it's just a matter of time before the card is dead.

So yeah I think mocking these cards is completely fair.


erttheking:
You know, I don't think I'm ever going to be able to get into custom-built PCs. Too much hassle, and I am far too lazy. The day I finally trade in my Mac, I'll just buy a laptop with decent specs.

Don't ever do that... I mean, at least don't do it if you're using it as your main computer. Laptops these days suffer from some pretty bad cases of obsolescence, due to the fact that you have to literally dismantle them just to maintain them (or use compressed air, which only delays the inevitable).

I hear that some brands are starting to get better about this, but I wouldn't jump heavily into laptops until it becomes the standard. Right now I do have a laptop, but only as a convenient complement to my desktop, which I use for just about everything.

I recently switched from an Nvidia card to an ATI card (mainly because the retail store I went to didn't have the Nvidia card I wanted yet), and the mid-to-higher-range stuff seems to be in the same league as Nvidia, in both performance and price. If anything, Nvidia offers a bit more for their cards with the release of the 900 series. AMD CPUs sort of have the "good quality for the price you're paying" thing going on.

The only problem I have with ATI (or AMD, but I don't like to confuse things) is their naming conventions. The R9 270X, where the X actually means something compared to the R9 270? Are you kidding me?

Ouch. My e-peen. It hurts. I have no preference, but I do currently have an ATI card installed, as it was the best bang for my buck. It's five years old now, though, so an upgrade is not out of the question...

Well it is funny. A negative spin on consumers who get way more performance for less money. It's funny.
