Experienced Points: What Makes Gaming Hardware Become Obsolete?
What Makes Stuff Obsolete?
Computer evolution is generally depicted as a roughly linear thing: The hard drives get a little bigger, the memory gets bigger, the CPU gets faster, and the GPU gets faster. But in reality it's not nearly this simple, which is what makes Daniel's question so complicated.
Last week I mentioned that your processor is actually several processors (called cores) all packed together. The game developer needs to divide up their game into several threads to put those extra cores to use. Where it gets tricky is that different processors have different numbers of cores, and those cores run at different speeds. Is it better to have two really fast cores, or four medium-speed cores? It depends on the game you're trying to run and how many threads it has.
Consider two games: Shoot Guy is a shooter with static scenery. The walls and decorations don't move, there's very little in the way of physics objects flying around, and you're never facing more than a couple of foes at a time. There's not a lot of processing to be done, and most of it will be fairly serial. The other game is ShootCraft, an RTS with tons of flying debris, destroyable environments, and dozens or even hundreds of combatants running all over the place.
Shoot Guy might benefit from a processor with only a couple of fast cores. ShootCraft might do better on a machine with lots of cores, with less of a need for raw processor speed. If Shoot Guy runs beautifully on your machine but ShootCraft runs like an asthmatic mule, then some people will claim the ShootCraft programmers are bad at their jobs, but that's not always the case. Even if two games look visually similar they can have very different needs. They differ not just in how much power they need, but in what form that power takes: Cores or clock speed.
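To make the tradeoff concrete, here's a toy back-of-the-envelope model (my own simplification, not anything from a real engine): the serial part of a frame can only run on one core, while the parallel part divides evenly across all of them. The numbers are invented purely for illustration.

```python
def frame_time(work, serial_fraction, cores, core_speed):
    """Rough frame-time estimate: the serial slice runs on a single core,
    the parallel slice splits across all cores; core_speed scales both."""
    serial = work * serial_fraction
    parallel = work * (1.0 - serial_fraction)
    return (serial + parallel / cores) / core_speed

# "Shoot Guy": mostly serial work, so two fast cores win.
print(frame_time(20, 0.8, cores=2, core_speed=1.5))  # 12.0
print(frame_time(20, 0.8, cores=4, core_speed=1.0))  # 17.0

# "ShootCraft": mostly parallel work, so four medium cores win.
print(frame_time(20, 0.1, cores=2, core_speed=1.5))  # ~7.33
print(frame_time(20, 0.1, cores=4, core_speed=1.0))  # 6.5
```

Same two machines, opposite winners, and neither game is "badly programmed". That's the whole problem in four lines of arithmetic.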
Over the past decade or so, the number of cores and the speed of the processors have climbed gradually, but not always at the same time, so it's hard to say when a particular CPU will be officially Too Old to play the latest games.
The other thing that might keep a game from running is memory. Usually the programmer will pick a target for their game: "Shoot Guy will have a three gigabyte memory footprint", meaning the game will require that much memory to run properly. This is more of a hard limit. If your processor isn't up to the job, the game might be sluggish but playable. But if you don't have the memory the programmer intended, then the game will either refuse to run outright, or will try to run and crash.
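A well-behaved game can at least check this limit up front and tell you what's wrong, rather than crashing mid-level. Here's a minimal sketch of that idea; the 3 GB figure comes from the hypothetical "Shoot Guy" target above, and the `sysconf` call is Linux-specific (a real launcher would use per-platform queries).

```python
import os
import sys

# The footprint the programmer targeted for "Shoot Guy" (hypothetical).
REQUIRED_BYTES = 3 * 1024 ** 3

def physical_memory_bytes():
    # Linux-only sketch; Windows would use GlobalMemoryStatusEx instead.
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")

def preflight_check():
    if physical_memory_bytes() < REQUIRED_BYTES:
        sys.exit("Shoot Guy needs at least 3 GB of RAM to run.")
```

Note how binary this is compared to the CPU case: you either clear the bar or you don't, which is why memory is the "hard limit" of the two.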
Even more complex is the question of when a graphics card needs to be updated. Graphics cards aren't just getting faster, they're becoming more complex and robust.
Let's say game developers devise this sexy new rendering effect called "bling mapping". It's a bit of a strange hack that involves a bunch of crazy math, a few rendering tricks, and some special art, but when it comes together it looks amazing. It catches on. Even though it's a complex pain in the ass, other developers copy the technique because it looks so good.
Then NVIDIA sees this, and they add a feature for their next round of graphics cards that supports bling mapping directly. Developers don't have to do this goofy hack and mess around with all of that extra work and CPU-hogging math. With just a couple lines of code they can have bling mapping.
The developers of Shoot Guy 2 are grateful. They remove the complicated old hack they came up with and implement NVIDIA's new bling mapping.
This is great for the developer, because code complexity is a real problem and simplifying things is a great way to speed up development, make the game run faster, and cut down on the number of bugs. But this sucks for the poor gamer at home. Now you need one of NVIDIA's new cards for bling mapping to work. Even though bling mapping worked on your old graphics card in Shoot Guy 1, this new game will require an upgrade. Note that this has nothing to do with how fast the graphics cards are.
If you're lucky, the developer will make bling mapping an optional thing so you can still run the game. If you're not lucky, you'll need an upgrade, even if Shoot Guy 1 and 2 look roughly similar and your old graphics card should technically be fast enough to handle the rendering load of Shoot Guy 2.
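The "lucky" case boils down to a capability check with a fallback path. Here's a toy sketch of how that decision might look, using the column's running example; every name in it ("bling_mapping" and friends) is hypothetical.

```python
def choose_bling_path(gpu_features):
    """Pick a rendering path based on what the GPU reports it supports."""
    if "bling_mapping" in gpu_features:
        # New cards: one hardware-supported call, no CPU-hogging math.
        return "hardware"
    if "shader_model_2" in gpu_features:
        # Older cards: fall back to the complicated hack from Shoot Guy 1.
        return "software_hack"
    # Oldest cards: turn the effect off entirely, but let the game run.
    return "disabled"

print(choose_bling_path({"bling_mapping", "shader_model_2"}))  # hardware
print(choose_bling_path({"shader_model_2"}))                   # software_hack
print(choose_bling_path(set()))                                # disabled
```

Writing and testing that middle branch is extra work for the developer, which is exactly why it sometimes gets cut and your "fast enough" card ends up locked out anyway.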
You could see this in the mid-aughts with things like normal mapping and HDR lighting. You'd see it in the system specs of games, which warned that you'd need a card that "supports Shader Model 2.0 or later". Those requirements marked points in graphics evolution where new features were added to the hardware, cutting off the older generations.
As complex as all of this is, remember that this is a super-simplified view of things. Technology isn't a straight line, but a meandering climbing path with cutbacks, loops, and dead ends. This is one of the reasons consoles are so popular. People who already know this stuff tend to take it for granted, and underestimate how much of a hassle it is for the uninitiated to figure out what they need to buy for their PC and when they need to buy it.
(Have a question for the column? Ask me!)