Written By:
Date Posted: October 1, 2001
The nVidia GeForce 3 is among us, bringing its promised speed and image quality improvements with it. Not since the 3dfx Voodoo 1 and the nVidia TNT has there been as much excitement in the video card market as there is now. The Voodoo was the first real 3D accelerator, and the TNT was the first to make fast 32-bit gaming a reality. Newer products since then have mostly been speed bumps. Yes, the Matrox G400 had dual monitor support and Environmental Bump Mapping, the original nVidia GeForce had the Transform and Lighting (T&L) engine, and ATI's Radeon had the impressive Charisma engine, but none of them has impacted the gaming industry the way the Voodoo and TNT did. Today, all of the above (well, maybe not Matrox, and for sure never again from 3dfx) produce ultra fast cards, and both ATI and nVidia feature integrated T&L in their chipsets. As fast as these cards are, they don't do much to convince us we're living an experience rather than just playing in it.
Like previous nVidia based video cards, there is a wealth of different manufacturers for you to choose from. Typically, you'll want to buy from a well known distributor; driver and technical support is always more readily available than it is from the fly-by-night ACME Corporations. You pay more for a brand name, but you get a better supported product. This has changed somewhat since nVidia began releasing its unified reference drivers. Now it doesn't matter who you bought from, since the drivers from nVidia (minus the vendor's driver support) work for any card with their GPU on it. With the sheer number of nVidia users and online forums, tech support is easy to find. I still try to buy brand name hardware though, since the overall build quality tends to be better than generic, with bigger heatsinks or extras like TV-Out, but you will pay more for this.
The subject for torture testing today is the MSI StarForce 822 GeForce 3. I'm sure many of you are familiar with MSI as a motherboard manufacturer, but probably not as much as a video card maker. Surprisingly enough, they've been making video cards for a few years now, starting with nVidia's TNT series of chipsets. They've always made quality motherboards, the MSI-6119 BX is my all-time favorite, and they're pretty solid with their VGA offerings. Earlier builds of the StarForce 822 had some issues, like older revisions of the GeForce 3 GPU (GF3). While most manufacturers used the A5 revision, which is more stable and complete, MSI used the A3 revision in its rush to be among the first to market. Their earlier cards also lacked heatsinks. In their defense, it seems most of these older cards were mostly in the hands of testers and reviewers. A few may have slipped to market, but the one we're going to look at is the one you'll likely find, and thankfully, all previous issues are resolved. Let's start with the review.
Specifications
nVidia® GeForce 3 Chipset
64MB DDR SDRAM
4X AGP
256-bit Graphic Architecture GPU
57 Million Transistors
800 Billion Operations Per Second
7.4GB/sec Memory Bandwidth
200MHz Core Clock
460MHz Memory Clock
Supports Windows® 9x, ME, NT, 2000
TV-Out, Video-In & DVI
True, reflective bump mapping
High-performance 2D rendering engine
High-quality HDTV/DVD playback
API support: OpenGL 1.2 and lower, DirectX 8.0 and lower
GeForce 3 Specific
nfiniteFX" engine for full programmability
Lightspeed Memory Architecture for unmatched performance
Surface engine for high-order surfaces and patches
Programmable Vertex Shader
Programmable Pixel Shader
HRAA: high-resolution antialiasing
Integrated hardware transform engine
DirectX® and S3TC texture compression
Dual cube environment mapping capability
Hardware accelerated real-time shadows
So, what does the GeForce 3 promise? Like I said, speed and image quality. I know that many online tech sites, as well as testing here, report that the GeForce 3 isn't terribly faster than the GeForce 2. In fact, in some cases, it's slower. You'll notice that although the memory speed is equal to that of the GeForce 2 Ultra, the core speed is actually 50MHz slower. What this means is that for many of today's games (DirectX 7 games, and most pre-August 2001 games), the GF2 Ultra will be faster at resolutions of 1024x768 and lower. There is much more to the story though, and we'll get to that later on.
Although the only APIs listed are OpenGL 1.2 and lower, and DirectX 8.0 and lower, newer DirectX versions will still run on the GF3. Believe me, I get this question a lot. The video card will work, but any new features just won't be supported in hardware.
I opted for the TV-Out version, which seems to be an option from what I've seen. The vanilla version is about $30 less. I'll admit, I don't really know why I bothered since I hardly use it. In fact, I only tested it to see how it worked. It doesn't help that the reference drivers disable this feature, and TV resolution sucks anyhow. The picture was adequate, though a little dark in my opinion. I didn't test it on a high end TV, so that may have affected my picture. Anyhoo, I don't think many people play PC games on their TV, but I figure it'd come in handy one day if I demote the StarForce to an entertainment system where I want to pipe DVD movies to my TV. Or, I could have put that $30 savings towards a set top DVD player. Doh!
The nfiniteFX Engine
"With the nfiniteFX engine's programmability, games and other graphics-intensive applications can offer more exciting and stylized visual effects. Vertex and Pixel Shaders are two patented architectural advancements that allow for a multitude of effects."
What the nfiniteFX Engine means is that developers can now code various special effects however they want, since the new GPU is fully programmable. They're no longer limited by a set of coding rules imposed by the GPU, and this opens the door to photorealistic games, especially those coded within the DirectX 8 specification, with which the GeForce 3 is fully compliant. There are two main components that make up the nfiniteFX Engine: vertex and pixel shaders. According to nVidia's description, a vertex is the corner where two edges of a triangle meet. Because polygons are made up of triangles, being able to change their values creates a more realistic scene. nVidia's demos are a great example of vertex shaders in action. Pixel shaders are used to create textures and surfaces. Rather than placing a bitmap on an object, pixel shaders can modify values on a per pixel basis. This makes for, again, more realistic images, and the demos show this feature off as well.
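To give a more concrete feel for what "programmable" means, here's a minimal sketch of a DirectX 8 vs.1.1 vertex shader: it transforms each vertex by a matrix stored in constant registers c0 through c3 and passes the vertex colour straight through. This isn't anything from nVidia's or MSI's bundle; the function and variable names are my own, and it assumes you have the DirectX 8 SDK headers (d3d8.h, d3dx8.h) on hand.

```cpp
// Minimal sketch of a DX8 vs.1.1 vertex shader and its creation.
// Hypothetical names; assumes the DirectX 8 SDK is installed.
#include <d3d8.h>
#include <d3dx8.h>

static const char g_VertexShaderAsm[] =
    "vs.1.1              \n"
    "dp4 oPos.x, v0, c0  \n"   // transform position by the matrix in c0..c3
    "dp4 oPos.y, v0, c1  \n"
    "dp4 oPos.z, v0, c2  \n"
    "dp4 oPos.w, v0, c3  \n"
    "mov oD0, v5         \n";  // pass the vertex colour straight through

HRESULT CreateSimpleShader(IDirect3DDevice8* device, DWORD* handle)
{
    // Vertex layout: position in v0, diffuse colour in v5.
    DWORD decl[] = {
        D3DVSD_STREAM(0),
        D3DVSD_REG(D3DVSDE_POSITION, D3DVSDT_FLOAT3),
        D3DVSD_REG(D3DVSDE_DIFFUSE,  D3DVSDT_D3DCOLOR),
        D3DVSD_END()
    };

    LPD3DXBUFFER code = NULL;
    HRESULT hr = D3DXAssembleShader(g_VertexShaderAsm,
                                    sizeof(g_VertexShaderAsm) - 1,
                                    0, NULL, &code, NULL);
    if (FAILED(hr))
        return hr;

    hr = device->CreateVertexShader(decl,
                                    (const DWORD*)code->GetBufferPointer(),
                                    handle, 0);
    code->Release();
    return hr;
}
```

The shader itself is only a handful of instructions, but the point is that the developer writes it; swap in different math and you get skinning, procedural deformation, custom lighting, and so on, without waiting for nVidia to hardwire the effect.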
Lightspeed Memory Architecture
"The Lightspeed Memory Architecture brings power to the GeForce3. That's why the NVIDIA GeForce3 is the platform of choice for the Microsoft® DirectX® 8 application program interface (API), and the technology foundation for the Microsoft next generation game console, Xbox"."
Memory performance has always been an issue with previous GeForce video cards, as well as with most other manufacturers' cards. If the GPU is pumping out more information than the video RAM can take in, a bottleneck occurs. The Lightspeed Memory Architecture is designed to address this issue. According to nVidia, by means of a crossbar-based memory controller, the GeForce 3 avoids bombarding the AGP bus with texture information and makes much more efficient use of memory, effectively doubling the usable bandwidth. Previous GeForce cards wasted bandwidth because every access went over the full-width bus, whether or not the request needed it. Think of it like FAT16 and FAT32, where FAT32 is more efficient. OK, that was a poor comparison, but that's the general idea.
Earlier in the review, I mentioned that at resolutions of 1024x768 and lower, the GF2 Ultra tends to be faster than the GF3. The reason is that these low resolutions don't tax the memory of the GeForce cards, so the GF2 Ultra's raw core speed is pumping out the data, and it is 50MHz faster than a stock GF3. At higher resolutions though, the memory starts choking on the amount of data flooding in. This is where you'll see the immediate benefits of the Lightspeed Memory Architecture, as it handles the extra data with aplomb.
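To put some rough numbers on that (my own back-of-the-envelope math, not anything from nVidia or MSI), here's a tiny sketch comparing the card's roughly 7.4GB/sec peak (460MHz effective DDR on a 128-bit bus, per the spec list above) against what the colour and depth buffers alone demand at different resolutions. Textures and real overdraw patterns aren't counted, so the absolute figures are illustrative only; the point is how quickly the demand scales with resolution.

```cpp
// Illustrative only: how frame buffer bandwidth demand scales with resolution.
#include <cstdio>

int main()
{
    // 128-bit (16-byte) DDR bus at an effective 460 MHz, per the spec sheet.
    const double peakBandwidth = 460e6 * 16;   // ~7.4 GB/s

    const int widths[]  = { 640, 1024, 1280, 1600 };
    const int heights[] = { 480,  768, 1024, 1200 };

    for (int i = 0; i < 4; ++i) {
        double pixels = double(widths[i]) * heights[i];
        // Assume a 32-bit colour write plus a 32-bit Z access per pixel,
        // about 3x overdraw, at 60 fps. Textures are ignored entirely.
        double bytesPerFrame = pixels * (4 + 4) * 3;
        double demand = bytesPerFrame * 60;
        printf("%4dx%-4d needs ~%5.2f GB/s of %4.2f GB/s peak\n",
               widths[i], heights[i], demand / 1e9, peakBandwidth / 1e9);
    }
    return 0;
}
```

Once you add texture fetches and AA on top of this, the high resolutions eat the available bandwidth quickly, which is exactly where a smarter memory controller pays off.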
High-resolution Antialiasing (HRAA)
NVIDIA's patented high-resolution antialiasing (HRAA) generates high-performance samples at nearly four times the rate of GeForce2 Ultra, while delivering the industry's best visual quality.
3dfx pimped antialiasing (AA) with their last Voodoo cards, and nVidia has taken it a step further with the GeForce 3. AA gaming is becoming more of a reality now, since previous AA implementations were slow. Image quality wise, I don't notice nVidia's HRAA being any better than their standard 2x FSAA. Speed was close too, as the difference between the two was hardly noticeable.
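For what it's worth, Quincunx itself is only exposed as a driver panel setting, but 2x and 4x multisampling are something a DirectX 8 game can ask for directly. The snippet below is just a hypothetical sketch of that check-then-request pattern using the stock DX8 calls; none of it comes from MSI's bundle, and the function name is my own.

```cpp
// Hypothetical sketch: see whether the device supports 2x multisample AA and,
// if so, request it when creating the Direct3D 8 device.
#include <d3d8.h>

bool SetupWithAA(IDirect3D8* d3d, HWND hwnd, IDirect3DDevice8** deviceOut)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed               = FALSE;
    pp.hDeviceWindow          = hwnd;
    pp.BackBufferWidth        = 1024;
    pp.BackBufferHeight       = 768;
    pp.BackBufferFormat       = D3DFMT_X8R8G8B8;       // 32-bit colour
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD; // required for multisampling
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;

    // Only ask for 2x AA if the hardware/driver says it can do it.
    if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT,
                                                  D3DDEVTYPE_HAL,
                                                  pp.BackBufferFormat,
                                                  pp.Windowed,
                                                  D3DMULTISAMPLE_2_SAMPLES)))
        pp.MultiSampleType = D3DMULTISAMPLE_2_SAMPLES;

    return SUCCEEDED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                       D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                       &pp, deviceOut));
}
```

For the Quake 3 testing in this review, though, AA was simply forced on through the driver control panel, the same way most people will use it.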
Unlike with previous GF2 cards, I feel that FSAA is very playable now, but I still wouldn't use it at 1280x1024 resolution and up. Sure, the benchmarks look OK, but in reality, there will be too many peaks and valleys in framerates when playing an intense action game. Although I didn't benchmark it, I was surprised that Max Payne ran great at 1600x1200 with HRAA on. Don't ask me why I tried that, I just wanted to see. It was completely playable, even in the big gun fights, and looked fantastic. I did run into a problem though, but I'll explain at the end of the review.
Personally, I prefer playing at a higher resolution with any anti-aliasing turned off. Max Payne is a slower game (movement wise), so it wasn't an issue with AA on, but since my gaming preferences revolve around Quake engines, framerates are more important than pretty, blurry pixels.
If you want more information on these topics, I fully recommend reading the GeForce 3 technology guides available online. They explain it a lot better than I can.
The Card
Like I said earlier, this version of the StarForce 822 GeForce 3 has TV-Out and memory heatsinks. Unlike most video cards, the heatsink is actually one big unit with a fan in the middle, providing cooling for both the GPU and the RAM. It's debatable whether heat from the core gets transferred to the memory with this shared heatsink design, decreasing overclocking potential. I don't know; maybe someone else can try, but overclocking is always a mixed bag. Other than that, there's not much to say. I thought it looked cool because it was silver, but it doesn't look as pretty as the Hercules 3D Prophet with its pin/orb design.
On the right, we have the TV and S-Video out. MSI includes the S-Video cable, but you need to buy your own TV-Out cable. Why they didn't include it is beyond me, but since I'm not going to make much use of it, who cares.
Software
MSI includes quite a substantial software bundle. No games are included, but version 3 of PowerDVD is. I've always preferred PowerDVD as a software based player, and it's nice that MSI packages the latest version. Image quality is decent, at least as good as the previous version of PowerDVD. I do find image quality better with WinDVD, but I hate its interface.
They also include their 3D! Turbo 2001 software, a small app that contains common display properties. Nothing to write home about, and I didn't bother reinstalling it when I updated my video drivers. Speaking of which, the drivers included with my video card were v12.00. MSI offers the Detonator XP drivers on their site, so definitely don't use the old ones.
Finally, they include Ulead Video Studio for video work. I'm not sure what it's like since I've never done any video editing myself, but for those into these things, enjoy. I didn't bother installing it.
Drivers
The MSI driver CD is really out of date. At the time of purchase, approximately June of this year, the included drivers were version 12.00. I've been running v12.41 since day one, and the newer Detonator XPs since their release. For those who don't know, the newest Detonators average a 4% to as much as 15% increase in OpenGL speeds at resolutions of 1024 and above, and Direct3D benchmarks seem to gain the most in my tests. Your results may vary. Like I said earlier, MSI now offers those drivers on the web. I had already installed the reference drivers, so I didn't bother with MSI's Detonator XPs. I figure you'll need them if you want TV-Out and for your display windows to say StarForce 822 rather than GeForce 3.
Update: It seems the drivers available on MSI's site are in fact the nVidia reference drivers, not repackaged versions as I had previously thought.
Overclocking
nVidia doesn't enable overclocking tabs by default. You'll need to hunt down and run a registry hack to enable the sliders. According to their website, the 3D!Turbo tool is capable of overclocking as well, but either the shipping version is missing it, the website is wrong, or I'm just plain blind. Either way, so long as you aren't using the WHQL drivers, the registry hack will work.
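The review doesn't name the hack, but the one commonly passed around for Detonator drivers of this era is the "CoolBits" registry value; I'm assuming that's what's meant here. A minimal sketch of applying it from code (you could just as easily merge a .reg file) might look like this:

```cpp
// Hypothetical sketch, assuming the widely circulated "CoolBits" tweak is the
// registry hack in question: writing CoolBits=3 under nVidia's NVTweak key
// exposes the hidden clock-frequency (overclocking) tab in the driver panel.
#include <windows.h>
#include <stdio.h>

int main()
{
    HKEY key;
    LONG rc = RegCreateKeyExA(HKEY_LOCAL_MACHINE,
                              "SOFTWARE\\NVIDIA Corporation\\Global\\NVTweak",
                              0, NULL, REG_OPTION_NON_VOLATILE,
                              KEY_SET_VALUE, NULL, &key, NULL);
    if (rc != ERROR_SUCCESS) {
        printf("Could not open the NVTweak key (error %ld)\n", rc);
        return 1;
    }

    DWORD coolBits = 3;  // value reported to unlock the clock sliders
    rc = RegSetValueExA(key, "CoolBits", 0, REG_DWORD,
                        (const BYTE*)&coolBits, sizeof(coolBits));
    RegCloseKey(key);

    if (rc != ERROR_SUCCESS) {
        printf("Failed to write CoolBits (error %ld)\n", rc);
        return 1;
    }
    printf("CoolBits set; reopen Display Properties to find the new tab.\n");
    return 0;
}
```

You'll need administrator rights on Windows 2000 for the HKLM write, and as with anything that pokes at the registry, back it up first.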
Overclocking the MSI StarForce 822 was a mixed affair for me. Given the impressive looking cooling on the video card, I expected some decent overclocking results. With the system temperature averaging about 31C, and CPU idle temps about 38C, I managed an overclock of 230 core and 515 memory, up from the stock 200/460. I did manage 235/530, but the video card would start giving me weird texturing and tearing after a half hour or so. I actually got up to 240/530, and it was able to boot into Windows, but then it would just reboot, forcing me into safe mode. I thought this was a driver issue, but updating to the Detonator XPs didn't resolve it. Still, 230/515 is a nice bump. I was able to run my 3D Mark 2001 batch test, which goes for about 8 hours, and all was well.
Update: Well, I'm getting odd lockups with the 230/515 overclock now. I still managed to finish the benchmarks and write up the review, but it seems that even 230/515 is too high. Let this be a lesson about overclocking: it doesn't matter what you see on the 'net, everyone's mileage will vary. For the record, I know 220/500 is rock solid for my video card. Good luck in your GeForce 3 Ultra dreams.
Before I get into the benchmarks though, I do want to say that like CPUs, video cards can eventually degrade over time if you overclock them. It's never happened to me, but I've seen plenty of images and read stories of image corruption after a year of overclocking, with the corruption persisting even after clocking the card back. This happens to CPUs as well, but usually not as quickly. Performance gains are negligible in real world apps, and I don't think it's going to make much of a difference whether you overclock or not. But hey, it's your $350 GeForce 3, so it's up to you.
Benchmarking
Here are the test machine specs:
AMD Thunderbird "AYHJA" Core, 1.4GHz (10.5x133)
ABIT KG7-RAID
512MB Kingston Value DDR RAM
2 x 60GB Maxtor Diamondmax, RAID-0
MSI StarForce 822 GeForce 3, Det. XP v21.81
Creative Annihilator 2 Ultra, Det. XP v21.81
Windows 2000, Service Pack 2
VIA 4 in 1 v4.33
AMD Driver Pack v1.20
Quake 3, v1.30 Final
3D Mark 2001
I included the GeForce 2 Ultra here for comparison purposes. It's only going to appear in a few benchmarks, but I will be including the overclocked scores of the StarForce 822. To save some time, I decided to run scripts for 3D Mark 2001, so no more fancy screenshots; you'll have to settle for bar charts, I'm afraid. If I get enough support, I'll include pink coloured bars to make up for the lack of 3D Mark screenshots. I've decided to skip benchmarking with the shipping drivers, as I'm sure most of you will choose the latest drivers from nVidia. If you want comparisons between the Detonator XP drivers and the previous official reference drivers, I'd suggest reading my quick Detonator Comparison article from September 2001. Those scores will vary a bit from today's, due to a recent reinstall of my OS, but the comparison between the old and new Detonators is pretty accurate.
Fear not, I chose not to bore you with 15 different benchmarks; I will present two, Quake 3 and 3D Mark 2001. I decided to skip Unreal Tournament because the game tends to be more CPU intensive than video card intensive. Make no mistake though, a 2MB ATI Rage won't cut it for UT.
Benchmarks
Quake 3 Arena, Demo four, High Quality, Sound On, No AA, 32 Bit Colour
As revealed in the specifications earlier in the review, the GeForce 2 Ultra takes a brute force approach in the benchmarks. Because the video card isn't taxed all that much at low resolutions, it manages to keep pace with the GeForce 3. As resolution increases, we can see the GF3's Lightspeed memory architecture pulling away, particularly when overclocked...
Quake 3 Arena, Demo four, High Quality, Sound On, 2 x AA, 32 Bit Colour
With FSAA enabled, things get a little rough on the GF2 Ultra. There's a lot more information being processed, which increases the demand on the memory pipeline. At the lower resolutions, it's still playable on the Ultra, but this is where we see the GF3 showing its muscle. I didn't include overclocked scores because, although this is the second graph I'm showing, it's actually the last benchmark I ran. I mentioned on the previous page that my video card was crapping out on me at this point. Oh, woe is me...
Quake 3 Arena, Demo four, High Quality, Sound On, 4 x AA, 32 Bit Colour
At 4x FSAA, I don't feel it's worth mentioning the GF2 Ultra anymore. At 640x480, it benchmarks in the mid 50fps range. Sure, that's great when you're walking around a square room, but once the fighting starts, good luck maintaining that speed. Here, we can see the overclocking paying off at all resolutions on the StarForce 822. The GF3 is handling the increased bandwidth requirements quite well, but don't bother with 4x FSAA at the higher resolutions. Don't ask me why the 1280 performance is lower than the 1600 performance. I ran the test three times and got the same results. Even more odd was the fact that it happened again when overclocking the video card. I don't expect this to be the case for others, so I'm just going to chalk it up as a lab error.
Quake 3 Arena, Demo four, High Quality, Sound On, Quincunx AA, 32 Bit Colour
The whole deal with Quincunx antialiasing is that you get near 4x AA quality at 2x AA speeds. As you can see in the benchmarks, it is close indeed. I'll have some screenshots later on, but I have to say, I don't like Quincunx AA. I find the picture noticeably blurrier, and although the benchmarks are similar to 2x AA, actual gameplay felt choppier.
Benchmarks
The news around the Detonator XP drivers is that DirectX performance gets a big kick in the butt. Um, I mean that in a good way. OpenGL shows gains, but it's D3D that gets the biggest boost. For 3D Mark 2001, I stuck with antialiasing off in all tests. With AA on, you get the usual loss in performance but fewer jaggies. This is a synthetic benchmark anyhow, so general performance should be enough of an indicator. I'll let the benchmarks speak for themselves.
3D Mark 2001, 32 Bit Colour, No AA, Z-Buffer 24 Bit, DXTC, D3D Hardware
So, like the Quake 3 scores, the StarForce 822 runs rampant over the GeForce 2 Ultra. More to the point, the Detonator XP drivers do seem to reveal the GeForce 3's true power. Overclocking nets even better results.
Image Quality
I displayed a couple of screenshots of the nVidia demos, but I'm sure you're wondering how today's games will look. I'm sorry to report that games don't really look much better than they would on a GeForce 2, sort of. Games will look better if you increase the resolution, turn antialiasing on, or both. Some games, such as Giants, have patches or options to increase the polygon count and take advantage of GeForce 3 features. You'll still need to wait a little while for cutting edge DirectX 8 games that really need the GeForce 3's features to dominate the market. Anyhow, below and on the next page are screenshots of Quake 3, demonstrating the antialiasing quality of the card. Note the blurriness of Quincunx AA that I mentioned earlier.
I really like the 4x AA, but it was fairly choppy while playing. Sure, standing still looks nice and steady, but doing that long enough will leave you staring at the ceiling, hehe, erm... you know, you stand there? Get shot? Die? And end up looking at the ceiling? Never mind...
One game that looks really good though is Max Payne. That game has jaw-dropping graphics, and even at 1600x1200, the game was as smooth as butter. You can take a look at the screenshots in my Max Payne review.
Final Words
There is no doubt, the GeForce 3 is one heck of a great video card. Thankfully, the price has dropped considerably since it was originally announced; street prices range from $300 to $350 depending on the brand and extra features. FSAA performance is finally playable in action games, to an extent. MSI includes a decent software bundle and a sweet looking card.
A couple of things do concern me though. There aren't many big DirectX 8 games out there, so a lot of the GeForce 3's new features go to waste. Also, with the announcement and reviews of the GeForce 3 Titanium cards, people may skip this card and go for the newer ones. The Detonator XP drivers may keep the new ATI video cards at bay for a while, unless ATI releases some new drivers of their own. Overclocking was also a bit of a letdown, but it may just be my particular card that is at fault.
Still, even with the apparent lack of games and the newer Titanium cards being released, there really isn't any reason not to get the StarForce 822 GeForce 3. It's a solid card, has optional TV-Out, a comprehensive software bundle, and no doubt a soon to be lower price. Even with today's games, you can now run at higher resolutions or turn on FSAA without slowing your games down as much as before. With features for the games of tomorrow, your GeForce 3 investment will last a long time. At least until nVidia's next refresh cycle...
MSI:
90%
Pros: Very fast, especially at resolutions 1024 and over. Decent FSAA performance, DVD software included, as well as TV-Out. Decent pricing now.
Cons: Questionable value until more DX8 titles hit the market. Faster video cards due to ship soon.
Update: Well, wouldn't you know it? No sooner do I post the review online than I find out why my benchmark scores were lower than others you see online. Granted, not everyone has the same test platform, but if you're running an AMD/Windows 2000 system, pretty much regardless of motherboard chipset, be sure to run the AMD patch linked below. After doing this, I got a big boost in framerates. I only had time to redo the AA off benchmarks, but I expect similar boosts with the varying levels of AA on.
Quake 3 Arena, Demo four, High Quality, Sound On, No AA, 32 Bit Colour
Moral of the story? Make sure you have all the drivers and patches for your software installed. I couldn't believe I forgot to do this, and the extra performance was shocking. Well, not really, but I noticed I hadn't used the word "shock" in the review yet.
Grab the patch from AMD here: