If you have any interest in PC gaming beyond watering virtual rhubarb in FarmVille, you probably need to equip your rig with a graphics card (also called a GPU). Modern titles like Crysis 2, Battlefield 3 and Call of Duty: Modern Warfare 3 will chug along on the integrated graphics built into your motherboard, if they load at all.
Yet GPUs are one of the most complex, jargon-filled products available. How can you make sure you get the most bang for your buck without getting misled by marketing claims, or worse, by evangelical enthusiasts who believe everyone needs to drop $500 on the latest and greatest? Just follow along with your resident Vigilant Geek.
What does a graphics card do?
Think of the graphics processing unit (GPU) at the heart of a graphics card as a sort of translator: It takes data coming from the CPU and transforms it into imagery. More complex visuals, such as high-definition games with all the bells and whistles turned on, require more powerful GPUs to keep up with the firehose of data generated by modern games. High-end graphics cards can also significantly cut the time it takes to complete photo- and video-editing tasks: More expensive GPUs have more processing cores, which lets them perform more computations at once.
Do you even need a graphics card?
If you don’t play games or edit graphics and have a modern computer, the answer is likely “No.” The processors found in modern-day PCs pack integrated GPUs that can handle casual 2D gaming and even high-definition video playback just fine. As a general rule of thumb, most computers bought in the past two years with an AMD APU or an Intel Core i3, i5 or i7 processor should be up to snuff for average graphical loads. Older computers may need a graphics card to watch HD movies, but a low-end model costing $100 or less should do the trick.
Modern integrated graphics may be able to play 1080p HD movies, but they don’t hold up well during 3D gaming. You might be able to pull off playable frame rates in older, less intense titles like World of Warcraft, but if you want to play more recent games you’re going to need a discrete graphics card.
And if you do decide to buy a discrete graphics card, keep two things in mind to make sure you’re not overpaying for your needs: whether you truly need the newest generation of cards, and whether the card’s memory specs hold up under scrutiny.
Do you need the latest and greatest graphics cards to play games?
Each generation of graphics cards stays at the forefront for a year or so before being replaced by newer, better, faster cards. The current generation of graphics cards — the AMD Radeon HD 7000 series and the Nvidia GeForce GTX 600 series — sport a new GPU architecture and vastly enhanced energy efficiency over previous models. Does that mean you should focus on modern cards alone in your quest for kick-ass graphics?
Not at all.
Buying last-generation cards is a great way to save some cash on a graphics card purchase. Now that the GTX 600-series is out, you can pick up a GTX 570 for between $250 and $300, or a former top-of-the-line GTX 480 for under $200. Those cards run hotter, draw more power and aren’t quite as speedy as, say, the newer GTX 670 (which costs more than $400), but they are still more than capable enough to play modern games at very good frame rates on a single display. Power gamers might not be completely satisfied, but average gamers should be.
I’d get nervous buying video cards more than a generation or two old, however.
One issue with buying older cards is that they’ll be even longer in the tooth when the next generation of graphics cards and top-end games appears — but they’ll also be even cheaper (assuming they’re still available). If performance becomes a concern, you can drop some cash on a second graphics card of the same type and run a dual-card setup, which increases frame rates — and generated heat — by a significant amount. Dual-card setups can run into stability issues from time to time, however.
Another possible issue is that older cards may not have the same features built into current cards. If you have a special need, like multiple monitors, do some homework and make sure the card you’re looking at supports it.
A potential pitfall when buying graphics cards
For the most part, graphics card specs scale fairly linearly: The bigger the number, the better the performance. The one spec that can give gamers a headache relates to the GPU’s memory (also called RAM or VRAM). First off, you’ll want at least 1GB to 1.5GB of GPU memory to play modern 3D games, and more is preferable if you’re playing top-end games like Battlefield 3.
That part’s simple; the tricky part is making sure you have enough memory bandwidth to process and display those games smoothly. Memory bandwidth is how quickly the GPU can shuttle data to and from the card’s RAM, and it depends on two specs: the card’s memory clock speed and the width of its memory interface.
Think of memory bandwidth like a highway. Wider lanes — a wider memory interface — mean more traffic can stream down the highway simultaneously. A higher memory clock speed means that the traffic travels faster. In a nutshell, a higher overall memory bandwidth means that traffic is less likely to jam up and more likely to reach its destination quicker. That’s a good thing.
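To make the highway analogy concrete, here’s a rough sketch of how peak bandwidth falls out of those two numbers. The card specs below are hypothetical examples for illustration, not real models.

```python
def memory_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s.

    effective_clock_mhz: the data rate in mega-transfers per second
    (GDDR5 moves multiple transfers per clock cycle, so its effective
    rate is far higher than its base clock).
    bus_width_bits: the width of the memory interface.
    """
    bytes_per_transfer = bus_width_bits / 8
    return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

# Hypothetical budget card: slow DDR3 on a skimpy 128-bit interface
print(memory_bandwidth_gbs(1800, 128))   # 28.8 GB/s

# Hypothetical gaming card: fast GDDR5 on a 256-bit interface
print(memory_bandwidth_gbs(4800, 256))   # 153.6 GB/s
```

Note that the second card moves data more than five times as fast as the first, no matter how much RAM either one carries — which is exactly why a narrow, slow memory interface can choke a card that otherwise looks generous on paper.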
Gamers should stick to cards with at least a 192-bit memory interface, though 256-bit (or higher) is even better for demanding games, especially if you plan to turn on the bells and whistles and jack up the detail settings. Large textures and antialiasing utilize a lot of memory. Also, look for a card with GDDR5 memory, which is far faster than the DDR3 RAM found in older graphics cards.
If you’re looking for a card for a home theater PC that won’t be playing games, it’s OK to look for lower specs. For instance, the $110 Radeon HD 7750 is a great low-power HTPC card with just 1GB of memory and a 128-bit bus. It will struggle to play anything but the most basic games, however.
The memory interface and bandwidth are crucial specs that many everyday buyers don’t even know about, and manufacturers sometimes take advantage of your ignorance by offering cards with a decent amount of RAM, but a skimpy interface and clock rate. All the RAM in the world won’t make a difference if the information hits a bottleneck during transfers. Virtual traffic jams suck just as hard as physical ones.
What graphics card should the average person buy?
If you’re sticking to a single monitor of standard resolution, a mid-range graphics card costing between $200 and $250 (such as the Radeon HD 7850 or GTX 560 Ti, especially if you can find a GTX 560 Ti with 448 CUDA cores) is the price-to-performance sweet spot. One of these cards should be capable of playing virtually all games with smooth frame rates — though you may need to switch away from the highest detail settings in barnburners like Battlefield 3 or Crysis: Warhead.
If you can spare a couple of bucks more, the Radeon HD 7870 offers a fairly decent jump in frame rates over the aforementioned models for around $260, which is comparable to the GTX 570 released by Nvidia last year — both in price and performance.
If, on the other hand, you have a little less money to spend, the $140 Radeon HD 7770 can also play modern games, but you’ll need to stick to modest graphics settings and steer clear of antialiasing options, which put a big drain on frame rates. The older, more power-hungry Radeon HD 6850 and Radeon HD 6870 fall into the same boat.
You should only really consider straying into graphics cards costing $300 or more if you’re running a multi-monitor setup, or if you demand the best possible experience and silky-smooth frame rates on a high-res display. In that price range, the Nvidia GTX 680 is the best of the bunch.