How to Choose a Video Card for Gaming
 

Written By:
Date Posted: September 29, 2001

What video card should I buy? The generic answer is: it depends on what you want to do with your computer. Someone who does nothing but web surfing and office work can get by with integrated video, or an ultra-cheap AGP or PCI add-on video card. A casual gamer will want a good 32MB video card, while a hardcore gamer will go all out on a 64MB top-end gaming card. Professional designers and multimedia authors need video cards that likely cost more than your PC. I'm not going to BS you with 2MB video cards, since nobody I know makes them anymore. A 16MB AGP video card is the minimum, unless the PC already has integrated video (yuck). I'll talk briefly about the choices, but it'll be up to you to research the technology some more.
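To give a rough sense of why 16MB is the floor, here's a quick back-of-the-envelope sketch in Python (the 1024x768, 32-bit, double-buffered-plus-z-buffer setup is my own illustrative assumption, not a spec):

```python
# Rough estimate of the video memory a basic 3D frame setup eats before
# any textures are loaded. All numbers here are illustrative assumptions.
width, height = 1024, 768            # common gaming resolution
bytes_per_pixel = 4                  # 32-bit colour
colour_buffers = 2                   # front + back buffer (double buffering)
z_buffer = width * height * 4        # 32-bit depth buffer

frame_buffers = width * height * bytes_per_pixel * colour_buffers
total_mb = (frame_buffers + z_buffer) / (1024 * 1024)
print(f"Frame + z-buffers alone: ~{total_mb:.1f} MB")  # ~9.0 MB

# On a 16MB card that leaves only about 7MB for textures, which is why
# 32MB and 64MB cards have an easier time with newer games.
```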

Your momma!

Unless you have your own computer, chances are you're using one that belongs to someone else. Judging from the maturity level of some people I meet online, the computer is probably owned by their parents or a relative. It's likely they don't play the latest Quake 3 engine-powered games, and more likely they play rather intense games of Solitaire. Given the purpose of these computers (read: Internet surfing), they don't need anything fancy. If your family is shopping for a new PC, now would be the time to ask Mom or Dad to get a decent video card. Parents will only care about one thing: price. In that case, here are the best candidates whose price tags won't freak them out.

ATI Radeon
nVidia GeForce 2 MX
Hercules 3D Prophet 4000XT

All these video cards should be available for under $100 US. In my opinion, the Radeon would be the best choice. It's fast, has great image quality, and nice extra features. It's also good that ATI has announced it will release driver updates more frequently, something it has been lacking in the past. Unlike most video card manufacturers, ATI develops its chipsets and builds its own cards. Well, they have someone else put them together, but the point is, any ATI video card you see has an ATI video chipset. This will make more sense when I explain nVidia's situation.

The GeForce 2 MX will likely be the fastest video card of the three. nVidia develops its chipsets, but licenses other manufacturers to make the video cards, such as Hercules, MSI, ASUS, CardExpert, Visiontek, and so forth. Every card has the same underlying technology, and differs only in the warranty offered and/or the extra hardware and software bundled with it. Should you go brand name or generic? It doesn't really matter, since nVidia offers a unified driver for all of its chipsets. If you're nervous about the warranty from an ACME company, go with a brand name, though you should expect to pay $10-20 more on average.

The Prophet 4000XT is the dark horse here. Hercules uses the Kyro chipset, which is a fairly decent performer. The only issue is that there are some known image quality problems with the chipset, and unlike the other two cards I mentioned, the Kyro lacks hardware Transform and Lighting acceleration. This isn't too big a deal right now, but it may be a problem with newer games that make use of this feature.

Update: Originally, I included the 3D Prophet 4500. There are a couple of versions, one with TV-Out and one without. Both include 64MB of RAM and provide great performance. They approach the mid-range level of cards in terms of pricing, but are still low enough to be considered entry level. At other sites, I've seen its performance match, if not exceed, the GeForce 3 in some benchmarks. A lot of factors come into play though, like the performance being better at certain resolutions. Another concern of mine is compatibility and support. Sure, it'll run Windows fine, so your parents won't care, but I've heard of weird effects in some games. Your best bet is to do some research, but it looks like if budget gaming is your thing, you can't really go too wrong with the 3D Prophet 4500. The same can be said about the GeForce 2 GTS cards and the Radeon 64MB cards. They're becoming rarer, but still widely available. They're both in the price range of the 3D Prophet 4500, so I think it's fair to mention these cards as well. Personally, I'd go with the Radeon cards if I were to spend $150, give or take. ATI has traditionally had better image quality and decent speed. Driver support has been shady in the past, but they seem to be improving.

I make my own money and can pay for my own sh*t, but I still need to pay rent...

Here is where the choices expand a lot more for you. Casual gamers need look no further than the following:


Either choice would be a good one. As usual, the GeForce cards tend to be faster for action games, but ATI's look better and still offer plenty of speed. Both are reasonably priced and will handle today's games, as well as the upcoming games due this year. Neither supports the new DirectX 8 features in hardware, but they shouldn't have any problems with them.

...but screw rent, I got 400 large burning a hole in my pocket!

In which case, there is only one choice: the GeForce 3. These are simply the fastest, and most expensive, video cards you can buy. They are DirectX 8 compliant, and judging from the tech demos I've seen, games are going to look good.

Anyhow, there are dozens upon dozens of reviews out there to help you pick the right card. Personally, I prefer the reviews done at a couple of the bigger sites. They're pretty good at giving thorough reviews, but more importantly, they cover a wide range of games and, in the case of FiringSquad, a wide variety of CPU setups. These sites will also devote a few pages to explaining card "x" technology, though I usually skim through that and get to the benchmarks :)

Ok, that's nice and everything, but what matters most?

If you're a gamer, I don't care what anyone else says: framerates are king. Image quality is important too; there's no point getting insane framerates if the image looks like crap. At the minimum, you'll want a video card that will play your favorite games at 32-bit colour, 1024x768 resolution, and 75 frames per second.
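To put that target into raw numbers, here's a rough sketch of the pixel throughput it implies (overdraw, texturing and geometry work are ignored, so the real demands on the card are quite a bit higher):

```python
# Pixels the card must colour every second just to hit the stated target.
width, height, fps = 1024, 768, 75
bytes_per_pixel = 4  # 32-bit colour

pixels_per_second = width * height * fps
framebuffer_writes = pixels_per_second * bytes_per_pixel / (1024 * 1024)
print(f"{pixels_per_second / 1e6:.1f} megapixels per second")    # ~59.0
print(f"~{framebuffer_writes:.0f} MB/s of framebuffer writes")   # ~225
```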

Whoa there... The human eye can't distinguish anything higher than 30fps...

That's BS, and I'll tell you why. The 30 frames per second most people refer to is full motion video with "real" footage, like a movie. Real people aren't made of polygons like a computer character, and there are subtle effects like motion blur that fool the eye into seeing smooth motion. Motion blur was something 3dfx tried to pimp to us, but they're gone now. I'm sure motion blur will make its mark eventually, but not for a while, I'd imagine.

Computer games work differently. No matter how realistic the characters look, they still have to move about. Characters, despite buzzwords like mesh and skeletal animation, are still made of polygons. The number of polygons increases as game engines improve, but they're still polygons. To effectively fool the human eye into thinking they're moving realistically, you more or less need to double the standard 30fps. That's 60fps for those who aren't good at math. Trust me, you'll be able to spot the difference between 30 and 60fps in a video game.
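Put another way, doubling the framerate halves how long each frame sits on screen, which is where the extra smoothness comes from. A quick sketch of that arithmetic:

```python
# Frame time is simply 1000ms divided by the framerate, so doubling the
# framerate halves how long each frame stays on screen.
for fps in (30, 60, 75):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
# 75 fps -> 13.3 ms per frame
```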

You can read more about this topic at . It's based on 3dfx's claims that frame rates are king, and the article seems to validate those claims.

Update: I didn't want to go too deep into this particular topic, since it seems to be quite debatable. However, I've had an interesting email discussion with a reader, so I'll share some of our viewpoints. Yes, I've edited for length, but not the content.

Reader
"It does not matter what type of object is moving, whether it be TV, games, or real life. Your eye simply cannot tell the difference at anything above 30FPS. Sure, someone with above average eyesight may be able to detect stutter around 40-45FPS, but beyond that, it's just a matter of overkill.

You want further proof? Can you really tell the difference between 80fps and 160fps? Your monitor is probably set to a refresh rate of 85Hz. That means the monitor will update the image on the screen 85 times a second. So any framerate above 85 is not displayed, since the monitor can only update the screen 85 times per second, not 160."

Me
"When I upgraded from a TNT 2 Ultra to a GeForce 3 DDR a few years ago, I went from an average of 35-40fps (benchmarked) to about 70-80fps (benchmarked). Now, I will agree somewhat about it being hard to tell, because although things didn't move faster (it looked slower, in fact), the animation was definitely smoother. Having a GeForce 3 now, and playing at 130fps (benchmarked), and 110fps - 180fps in real gameplay (you can tell this in Q3 by the on-screen fps counter, in case you didn't know), things don't seem faster than they were at 80fps. However, I've been playing at this speed since June, and when I played Q3 with the same settings I'm accustomed to on a friend's Voodoo 5500 PC, it dropped to about 60fps - 70fps in real gameplay, and oh yeah, there is definitely a drop in speed, or smoothness rather.

As it happens, my monitor's max refresh rate is 120Hz at 1280 resolution, and I play my games with vsync disabled, so no, I am not capped at 85Hz. Does disabling vsync allow 160fps? I doubt it, but the point is, it doesn't cap at 85, and it's likely within the 120Hz range. BTW: My friend's monitor is 100Hz at 1280 resolution.

I want to point out what I said in the article. I mentioned that the reason there is such a demand for 200fps (those crazy nuts) is that a video card that does 200fps in a game today may not do that in a new game bought tomorrow. It's all about headroom."

Ok, I may as well point out that although we both brought up interesting points, we're sticking to our guns. My arguments for needing excessive framerates are made clear by the next question...
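For the curious, the refresh rate half of that exchange boils down to something like this (a rough sketch; the framerates and refresh rates are example numbers, not benchmarks):

```python
# With vsync on, the card waits for the monitor refresh, so the framerate
# you actually get is capped at the refresh rate. With vsync off, the card
# renders as fast as it can, though frames beyond the refresh rate only
# ever show up partially (tearing).
def effective_fps(rendered_fps, refresh_hz, vsync):
    return min(rendered_fps, refresh_hz) if vsync else rendered_fps

print(effective_fps(160, 85, vsync=True))    # 85  -> 85Hz monitor, vsync on
print(effective_fps(160, 120, vsync=False))  # 160 -> vsync off, no cap
```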

If so, why do people strive for more?

Headroom. When you see a Quake 3 benchmark score of 120fps, even though it's a real-world score, it's still not the whole picture. The 120fps is an average over the benchmark run. At some points it likely dropped to the mid 80s, and at others it peaked in the 150fps range. When you play, you'll see this fluctuation, and a video card with a higher sustained framerate will play the game better.
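A quick sketch of why the average hides the dips (the per-second samples below are invented for illustration, not real benchmark data):

```python
# Invented per-second framerate samples from a hypothetical benchmark run.
samples = [150, 148, 132, 120, 118, 95, 88, 85, 110, 154]

print(f"average: {sum(samples) / len(samples):.0f} fps")  # 120 fps, the headline score
print(f"minimum: {min(samples)} fps")                     # 85 fps, what you feel in a firefight
print(f"maximum: {max(samples)} fps")                     # 154 fps
```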

Also, if a video card benches in at 120fps, it's a good indication of its possible lifespan for performance gaming. Extra features aside, any video card that does 100fps in Quake 3 will be a viable solution for newer games as they come out. Quake 3 is getting old now, but newer games, like Return to Castle Wolfenstein, are built on today's popular game engines. No doubt there will be enhancements to those engines, but like I said, a 100fps Quake 3 video card should run RTCW at around 80fps (benchmarked).

Bah! I don't play those games. Too violent :P

The choices get easier and cheaper if action games aren't your bag. If strategy or puzzle games (*gasp*) are your preference, a video card with good image quality and high resolution support will matter more. Granted, most video cards already fall into this category, but the ones of note are the offerings from ATI and Matrox. Historically, their cards have always provided a sharper, more vibrant picture, and they will likely be cheaper if you opt for their "consumer" cards instead of the gamer cards. Well, I wouldn't say Matrox has the latter; office productivity is what rocks their boat.

What's the future?

Who can say? Obviously, as games strive towards photorealism, a video card will need excessive fill rate and support for whatever the hot API of the moment is. Right now, OpenGL is the popular one for ultimate gaming, but I foresee DirectX overtaking it within the next 8-12 months. New video cards will have loads of RAM and crazy clock speeds.

The argument of waiting for something better to come out isn't a valid one. There WILL always be something better; the trick is to buy the video card you need for the time you need it. Not everyone can buy a new video card every 6 months, so take a look at your game inventory and ask yourself what it is you play, and whether your hardware is hurting your game performance. If not, keep what you've got. If it is, well, I've laid out the best solutions available at the time of this writing. With the GeForce 3 Ultra and the Radeon 8500 just weeks away, there hasn't been a better time to consider a video card than now.
