Geek 101: a graphics card primer
04 September, 2009 00:05
Drivers, Drivers, Drivers
No matter what graphics processor you have, you need the latest drivers: get them from Nvidia's, ATI's, or Intel's driver download pages, depending on your card. If you have a notebook, you may need to go to your notebook manufacturer's Web site to get the latest drivers instead.
Microsoft's marketing department is doing its best to brand DirectX 11 as a Windows 7 thing, but the truth is that it's coming to Vista as well. This new version of the API brings with it several new features. It's too much to go into here, but the short list is:
- Better use of multi-core CPUs
- Tessellation - This is the fancy word for breaking up an object made of a small number of triangles (and thus blocky-looking) into a very large number of triangles, which can then be manipulated to make the object look smoother or more detailed.
- DirectCompute (aka "Compute Shaders") - Like OpenCL, this is a standardized way to make any GPU with DirectX 11 drivers do general computational work.
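To make the tessellation bullet concrete, here's a rough CPU-side sketch of the idea: split each triangle at its edge midpoints, so one blocky triangle becomes four smaller ones that can then be displaced for extra detail. Real DirectX 11 tessellation runs in dedicated hardware stages; this Python version (function names are my own, not part of any API) just illustrates the geometry.

```python
# Illustrative midpoint subdivision: each pass turns every triangle
# into four smaller ones. This is NOT the DirectX 11 pipeline, just
# a sketch of what "more triangles from fewer" means.

def midpoint(a, b):
    """Average two 2D points component-wise."""
    return tuple((p + q) / 2 for p, q in zip(a, b))

def tessellate(triangles):
    """One subdivision pass: every triangle becomes four."""
    out = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        # Three corner triangles plus the center triangle.
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

mesh = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]  # start with one triangle
for _ in range(3):
    mesh = tessellate(mesh)
print(len(mesh))  # 1 -> 4 -> 16 -> 64 triangles after three passes
```

On the GPU, a displacement step would then push the new vertices around to add surface detail; here the payoff is simply the 4x triangle count per pass.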
CUDA and ATI Stream
For the past several years, both Nvidia and ATI have been working on using the GPU for general computing tasks, but it's hard to launch a new software industry when each company has its own proprietary means of programming its graphics products. Nvidia's is called CUDA; ATI's is called ATI Stream. CUDA is more popular, but it's still mostly stuck in the "big iron" high performance computing and academic fields, with only a handful of real consumer apps.
New programming models, such as using the GPU for general computing tasks, tend to take off when standards emerge, so the real action will probably be in OpenCL and DirectX 11 Compute Shaders. Don't let CUDA or ATI Stream influence your buying decisions too much.
SLI and Crossfire
These are terms for Nvidia (SLI) and ATI (Crossfire) technologies to use more than one GPU at a time for higher performance. Should you get it? Generally speaking, this is one of those "if you have to ask, the answer is no" sort of technologies. You can expect a second GPU to add maybe 50-80% performance over the first, and from there the performance gains are minimal. The third GPU only gets you maybe 30% more, and the fourth (yes, you can do a four-GPU system!) barely improves things over the third at all.
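The diminishing returns above are easy to put in arithmetic form. The numbers below are just the article's rough figures (a 50-80% gain for the second card, about 30% for the third, next to nothing for the fourth), expressed as fractions of single-GPU performance:

```python
# Back-of-the-envelope multi-GPU scaling using the article's rough
# numbers. These gains are illustrative assumptions, not benchmarks.

def total_speedup(per_gpu_gains):
    """Cumulative speedup vs. one GPU, where each extra card's gain
    is expressed as a fraction of single-GPU performance."""
    return 1.0 + sum(per_gpu_gains)

two_way = total_speedup([0.65])                # midpoint of 50-80%
three_way = total_speedup([0.65, 0.30])        # third card adds ~30%
four_way = total_speedup([0.65, 0.30, 0.05])   # fourth adds almost nothing
print(round(two_way, 2), round(three_way, 2), round(four_way, 2))
# 1.65 1.95 2.0
```

In other words, doubling your card count never comes close to doubling your frame rate, which is why multi-GPU only makes sense at the very high end.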
Enthusiast gamers with very big, high-resolution monitors are the target market for multi-GPU solutions. If this is you, you might want to consider SLI or Crossfire. You'll need a motherboard with two graphics slots that supports SLI/Crossfire, but these are not uncommon. Odds are, most of you reading a "Geek 101" article probably aren't the target market for this.
Future Hardware: Nvidia, ATI, and Intel's Larrabee
Both ATI and Nvidia are getting their new DirectX 11 class graphics products ready to roll. ATI appears to be a few months ahead of Nvidia on its rollout. If the rumors are to be believed, the company should have a top-to-bottom lineup in the next month or two. Nvidia may only have high-end chips at first, and then only at the end of the year or possibly early next year.
Unfortunately, we can't tell you which one is the better buy because we don't really know about their price, performance, power consumption, or any of that other stuff. But if you don't desperately need a new graphics card right now, you might want to wait a few months and see how this new generation of products looks.
Meanwhile, Intel is preparing a novel product with the code-name Larrabee. This will be a GPU first appearing in a high-end discrete graphics card rather than the typical integrated graphics we see from Intel. It doesn't follow the traditional graphics chip architecture; rather, it's a chip full of lots of very compact x86 CPUs (like the Atom chip for netbooks) that have very wide vector processing units and a specialized set of programming instructions.
This makes the chip very flexible, and it should be great for GPU compute type applications, but will it be a fast graphics chip? Nobody knows. What we do know is that Intel is a year ahead of everyone else on chip manufacturing technology and should never be underestimated.
Next Page: GPU shopping tips...