Written By:
Date Posted: October 1, 2001

The nVidia GeForce 3 is among us, bringing with it its promised speed and image quality improvements. Not since the 3dfx Voodoo 1 and the nVidia TNT has there been as much excitement in the video card market as there is now. The Voodoo was the first real 3D accelerator, and the nVidia TNT was the first to make fast 32-bit gaming a reality. Newer products since then have mostly been speed bumps. Yes, the Matrox G400 had dual monitor support and Environmental Bump Mapping, the original nVidia GeForce had the Transform and Lighting (T&L) engine, and ATI's Radeon had the impressive Charisma engine, but none of them impacted the gaming industry the way the Voodoo and TNT did. Today, all of the above (well, maybe not Matrox, and for sure never again from 3dfx) produce ultra-fast cards, and both ATI and nVidia feature integrated T&L in their chipsets. As fast as these cards are, they don't do much to convince us we're living an experience instead of just playing in it.
Like previous nVidia based video cards, there is a wealth of different manufacturers for you to choose from. Typically, you'll want to buy from a well-known distributor, since driver and technical support is always more readily available than from a fly-by-night ACME Corporation. You pay more for a brand name, but you get a better supported product. This has changed somewhat since nVidia began releasing its unified reference drivers: now it doesn't really matter who you buy from, since nVidia's drivers (minus the vendor's own driver support) work on any card with their GPU on it. With the sheer number of nVidia users and online forums, tech support is easy to find. I still try to buy brand name hardware though, since the overall build quality tends to be better than generic, with things like bigger heatsinks or extras like TV-Out, but you will pay more for it.
The subject for torture testing today is the MSI StarForce 822 GeForce 3. I'm sure many of you are familiar with MSI as a motherboard manufacturer, but probably not as much as a video card maker. Surprisingly enough, they've been making video cards for a few years now, starting with nVidia's TNT series of chipsets. They've always made quality motherboards (the MSI-6119 BX is my all-time favorite), and they're pretty solid with their VGA offerings. Earlier builds of the StarForce 822 had some issues, such as using an older revision of the GeForce 3 GPU (GF3). While most manufacturers used the A5 revision, which is more stable and complete, MSI used the A3 revision in its rush to be among the first to market. Their earlier cards also lacked heatsinks. In their defense, it seems most of these older cards were only in the hands of testers and reviewers. A few may have slipped to market, but the one we're going to look at is the one you'll likely find, and thankfully, all previous issues are resolved. Let's start with the review.
Specifications
nVidia® GeForce 3 Chipset
64MB DDR SDRAM
4X AGP
256-bit Graphic Architecture GPU
57 Million Transistors
800 Billion Operations Per Second
7.4GB/sec Memory Bandwidth
200MHz Core Clock
460MHz Memory Clock
Supports Windows® 9x, ME, NT, 2000
TV-Out, Video-In & DVI
True, reflective bump mapping
High-performance 2D rendering engine
High-quality HDTV/DVD playback
API support: OpenGL 1.2 and lower, DirectX 8.0 Version and lower
GeForce 3 Specific
nfiniteFX engine for full programmability
Lightspeed Memory Architecture for unmatched performance
Surface engine for high-order surfaces and patches
Programmable Vertex Shader
Programmable Pixel Shader
HRAA: high-resolution antialiasing
Integrated hardware transform engine
DirectX® and S3TC texture compression
Dual cube environment mapping capability
Hardware accelerated real-time shadows

So, what does the GeForce 3 promise? Like I said, speed and image quality. I know that many online tech sites, as well as testing here, report that the GeForce 3 isn't terribly faster than the GeForce 2. In fact, in some cases, it's slower. You'll notice that although the memory speed is equal to that of the GeForce 2 Ultra, the core speed is actually 50MHz slower. What this means is that for many of today's games (DirectX 7 games, and most pre-August 2001 games), the GF2 Ultra will be faster at resolutions of 1024x768 and lower. There is much more to the story though, and we'll get to that later on.
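As a side note, the 7.4GB/sec memory bandwidth figure in the specs follows straight from that 460MHz effective memory clock, assuming the GeForce 3's 128-bit (16 bytes per transfer) DDR memory bus, a detail not listed on the spec sheet above:

460 MHz effective x 16 bytes = 7,360 MB/sec, or roughly 7.4 GB/sec

In other words, the raw bandwidth matches the GF2 Ultra; the GF3's advantage has to come from how it uses that bandwidth, not from more of it.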
Although the only APIs listed are OpenGL 1.2 and lower, and DirectX 8.0 and lower, newer DirectX versions will still run on the GF3. Believe me, I get this question a lot. The video card will work, but any new features just won't be supported in hardware.

I opted for the TV-Out version, which seems to be an option from what I've seen. The vanilla version is about $30 less. I'll admit, I don't really know why I bothered since I hardly use it. In fact, I only tested it just to see how it worked. It doesn't help that the reference drivers disable this feature, and TV resolution sucks anyhow. The picture was adequate, though a little dark in my opinion. I didn't test it on a high-end TV, so that may have affected the picture. Anyhoo, I don't think many people play PC games on their TV, but I figure it'd come in handy one day if I demote the StarForce to an entertainment system where I want to pipe DVD movies to my TV. Or, I coulda put that $30 savings towards a set-top DVD player. Doh!