A Noob’s Shopping Guide: Video Cards

May 15, 2013

If you ever dabble in the world of PC gaming, there is one thing for sure: at some point, you will need to upgrade your video card.

It used to be that when you upgraded your Voodoo2, you simply bought a Voodoo3 (or maybe an Nvidia RIVA TNT2 if you wanted full 32-bit color).  Nowadays, though, looking at the feature list of a video card is like reading some strange sort of code.  But never fear, dear reader!  I will walk through the features of a video card on Newegg (the EVGA GTX 650 Ti 2 GB, if you are curious), and I will try to explain them so even the most basic enthusiast can understand what they are looking at.

[Image: the GTX 650 Ti's spec listing on Newegg]

Huh?  Which thing tells me how fast it is?  I want it fast.

The brand and model number don’t really matter.  I don’t know of any disreputable manufacturers of mid- to high-end video cards, so you’re fine buying from just about anyone.  The interface, on the other hand, does matter.  Unless your computer is an antique, you will be using a PCI Express (PCIe) interface.  This replaced AGP (which was slower), which in turn replaced PCI (which was slower still).  The x16 at the end refers to the number of “lanes” (roughly, how big the slot is and how much data can move through it at once).  This will almost always be x16, so just go with that and don’t worry about it.  The revision number (1.0, 2.0, and 3.0; 4.0 is slated to be finalized in 2014 or 2015) refers to how fast the video card can communicate with the rest of the system, and it matters in a big way.  Let’s say you get a GTX 650.  That uses revision 3.0, which is the fastest.  But maybe you don’t have a motherboard or CPU that supports PCIe 3.0 (for Intel users, Sandy Bridge and earlier CPUs do not support PCIe 3.0).  The result is that you might be paying for power that you cannot use!  PCIe is backward compatible, so a newer-revision card will still work just fine in an older slot, of course.  Just know that you are missing out on performance that you paid for.
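If you want to put a rough number on what that mismatch costs, here is a minimal back-of-the-envelope sketch.  The per-lane throughput figures are the commonly quoted approximations, not anything pulled off the card's spec sheet:

```python
# Rough sketch of PCIe bandwidth: per-lane figures are approximate,
# commonly quoted values, not spec-sheet numbers.
PER_LANE_GBPS = {   # approximate usable throughput per lane, in GB/s
    "1.0": 0.25,
    "2.0": 0.5,
    "3.0": 0.985,
}

def pcie_bandwidth(revision: str, lanes: int = 16) -> float:
    """Approximate total bandwidth in GB/s for a given revision and lane count."""
    return PER_LANE_GBPS[revision] * lanes

# A PCIe 3.0 card in a PCIe 2.0 slot runs at the slot's revision:
card_rev, slot_rev = "3.0", "2.0"
effective = min(card_rev, slot_rev)   # string comparison works for these labels
print(f"x16 at PCIe {effective}: ~{pcie_bandwidth(effective):.1f} GB/s "
      f"(the card could use ~{pcie_bandwidth(card_rev):.1f} GB/s)")
```

In other words, an x16 slot at PCIe 2.0 gives you roughly half the raw bandwidth the same slot would at 3.0.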

[Image: PCIe x16 slots on a motherboard]

Those big long slots are PCIe x16.  Don’t worry about the other ones; they don’t matter for this project.  Just be sure to use the PCIe slot closest to your CPU!

On to the chipset!  The chipset manufacturer actually does matter.  There are two major players in the world of video cards: AMD and Nvidia.  These two companies have gone back and forth as far as features and performance go, but at the moment, Nvidia is in the lead.  I recommend either a Fermi-based (GeForce 400/500 series) or Kepler-based (GeForce 600 series up through the Titan) card, since these support DirectX 11 / Shader Model 5.0.  If you choose to go AMD, the comparable cards are the HD 7000 series.

[Image: AMD vs. Nvidia]

Sorry AMD, but right now Nvidia has the upper hand!

Don’t worry about clock speeds; this will not hold you up.  What will hold you up, however, is the number of CUDA cores / stream processors (they’re essentially the same thing; Nvidia and AMD just use different names).  These are the programmable shaders that I’ve waxed poetic about in previous updates, and the more you have, the better.

[Image: Kepler die shot]

In this die shot, the boxes marked SMX are where your cores are (this is a high end Kepler chip, so they are CUDA cores).  There are 192 CUDA cores in each of the 15 SMX units, for a whopping total of 2880 CUDA cores.  Wow!
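To see why that count is the number to watch, here is a quick throughput sketch.  The clock speeds below are rough ballpark figures I am assuming for illustration, not exact spec-sheet values:

```python
# Peak single-precision throughput is roughly cores * clock * 2, since each
# core can do one fused multiply-add (two floating-point ops) per cycle.
# Clock speeds here are rough ballpark assumptions, not exact specs.
def peak_gflops(cores: int, clock_mhz: float) -> float:
    return cores * clock_mhz * 2 / 1000

print(peak_gflops(cores=768,  clock_mhz=925))   # ~1400 GFLOPS (GTX 650 Ti class)
print(peak_gflops(cores=2880, clock_mhz=875))   # ~5000 GFLOPS (the big Kepler die above)
```

Clock speed matters a little, but the core count is doing most of the work in that multiplication.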

On to memory.  Don’t pay too much attention to memory clock speeds; the manufacturers manipulate these numbers to look as big as possible without telling you anything.  Unless you are into heavy overclocking, your factory default memory speed will be fine (if you are an overclocker, you probably don’t need this guide).  Regarding the quantity of RAM, more is always better.  Consider: the PS4 will have 8 GB of unified memory.  Assuming non-graphics-related tasks use 2 GB of that memory, it still has a sizeable 6 GB of video memory.  I can’t overstate this: if you don’t want to have to upgrade within a year, go big on video RAM.  I recommend at least 2 GB.
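For a sense of where that memory goes, here is a minimal sketch.  The resolutions are just illustrative, and real games keep multiple buffers plus piles of textures resident at once:

```python
# A very rough sketch of where video RAM goes: one 32-bit color buffer.
def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 1024**2

# Games keep several of these around (back buffers, depth/stencil, assorted
# render targets), and high-resolution textures dwarf all of it.
for width, height in [(1280, 720), (1920, 1080), (2560, 1440)]:
    print(f"{width}x{height}: {framebuffer_mb(width, height):.1f} MB per buffer")
```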

[Image: a chimpanzee wearing people clothes]

I could have a picture of RAM here, but that’s boring.  So here’s a chimpanzee wearing people clothes.

The memory interface and memory type are other thorny memory-related issues.  The number for your memory interface (128 bit on the GTX 650 Ti) is the width of the memory bus in bits.  The higher this number, the faster information can go to and from your video RAM.  Newer does not always mean better here: on the GTX 650 Ti, Nvidia actually reduced the width of the memory interface compared with the GTX 550 Ti (which is 192 bit).  I’m not sure why this was done, but as far as memory access goes, the newer card is actually slower (of course, a GTX 680 is quicker than both, but it’s also ~$500!).  Do not buy a card with anything less than GDDR5; the memory type determines how fast the memory itself runs, and GDDR5 is the current industry standard.
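To make that concrete, here is a rough bandwidth sketch.  The effective memory data rates are ballpark figures I am assuming for cards of this era, not exact spec-sheet numbers:

```python
# Rough sketch of how interface width and memory type combine into bandwidth.
# Effective data rates are ballpark assumptions for cards of this era.
def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Approximate memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(mem_bandwidth_gbs(192, 4100))  # ~98 GB/s  (GTX 550 Ti class)
print(mem_bandwidth_gbs(128, 5400))  # ~86 GB/s  (GTX 650 Ti class)
print(mem_bandwidth_gbs(256, 6000))  # ~192 GB/s (GTX 680 class)
```

Notice that even with faster GDDR5 on the newer card, the narrower 128-bit bus leaves it behind the older 192-bit card on raw memory bandwidth.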

[Image: GTX 680]

Oh GTX 680, why are you so desirable, yet so far away from me?

Under APIs, you want to make sure your card supports the latest 3D APIs (otherwise you are buying something that is already obsolete).  The current APIs are DirectX 11 (technically 11.1, but no hardware revisions are required for 11.1 support) and OpenGL 4.3.

Final notes: Under ports, make sure you have the right port for your monitor (a DVI output can be converted to an older VGA one with a simple adapter, but an HDMI output cannot).  If you’re planning on CrossFire/SLI (connecting two video cards together for faster performance), just make sure that CrossFire/SLI is supported.  Also, be sure that you get two identical cards; this is required for CrossFire and SLI.  Finally, look at the power requirement (this will be measured in watts) and the power connector (probably 6-pin).  You want to make sure your power supply (PSU) can run the video card you are about to buy!  Otherwise, the rest of the stuff is marketing bull: all current Nvidia cards should support PhysX and 3D Vision, and all current AMD cards should support Eyefinity (you might need to turn down in-game graphics for this if you get a lower end card).  Almost all video cards come with fans; the fact that they call that out on the website confuses me!
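As a quick sanity check on the power question, here is a minimal sketch.  The wattage numbers are made up for illustration, so plug in your actual card's requirement and your PSU's rating:

```python
# A minimal sketch of the PSU sanity check. Wattages below are illustrative
# placeholders, not real figures for any particular card or system.
def psu_ok(psu_watts: int, gpu_watts: int, rest_of_system_watts: int,
           headroom: float = 0.2) -> bool:
    """True if the PSU covers the estimated draw plus ~20% safety headroom."""
    return psu_watts >= (gpu_watts + rest_of_system_watts) * (1 + headroom)

print(psu_ok(psu_watts=450, gpu_watts=110, rest_of_system_watts=200))  # True
print(psu_ok(psu_watts=350, gpu_watts=250, rest_of_system_watts=200))  # False
```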

[Image: an old, fanless video card]

This is a video card without a fan.  It is 9 years old.

Finally, before buying a video card, check out the benchmarks.  I’ve seen tech that looked great on paper but performed like crap in real life.  Why be a test guinea pig when someone else has done that for you?  Learn from the testing that others have done, and make a great purchase.  Happy shopping!

