 
























 
 
GeForce 4 MX Profile
 
 
Date: April 10, 2002
Category: Articles
Manufacturer:
Written By:
 

I'm sure by now all of you have at least heard about the GeForce 4. It was announced officially back in February, and you've probably seen dozens of GeForce 4 reviews since. No doubt you've noticed the MX and Ti tagged cards, and although they share a common name, they are not the same card. There are plenty of guides and information on GeForce 4 technology in general, but we're presenting some info specific to the MX series. Why focus on the MX? It'll become clearer as we move through the article. We'll be covering the GeForce 4 Ti technology in an upcoming review.

What is the GeForce 4 MX?

Unlike the Titanium cards, which are based on the NV25 core, the MX is based on NV17 technology. Considering that the GeForce 3 is based on the NV20 core and the GeForce 2 on NV15, you can see where this is going.

With the GeForce 2, the MX was merely a slower part. Performance was more or less equal to the "vanilla" GeForce 2 (GTS), as well as the faster GeForce 2s (Pro and Ultra), at low resolutions, but at high resolutions the MX started to lag behind. Granted, the MX was intended more for casual gamers and business folk. It sold very well because it was cheap, and it came out at a time when the majority of users owned 17" monitors, making 1024x768 the most common resolution.

When the GeForce 3 arrived, a lot of people expected MX and Ultra versions to follow soon after. Instead, nVidia dropped the MX and Ultra tags and went with Titanium branding. There were two flavours of Titanium: the Ti200 and the Ti500. The Ti500 was the high-end part, taking the Ultra's place, while the Ti200 stepped in for the MX as the budget alternative. Despite being relatively cheap, a lot of users were able to attain excellent overclocking numbers with the Ti200, sometimes matching or even surpassing Ti500 speeds.

Both the GeForce 2 MX and the GeForce 3 Ti200 shared the same feature set as the rest of their respective families while costing a lot less. The MX is back now, but things have changed a lot for the GeForce 4 MX. To begin with, unlike the rest of the GeForce 4 family, the GF4 MX lacks the nfiniteFX II Engine. What this means for the user is that the fancy programmable pixel shaders, and the advanced DirectX 8 features in general, are not supported in hardware. The MX still has Transform and Lighting capabilities, along with limited vertex shader support.

The GF4 MX does share some features with the full-fledged GeForce 4, such as the updated Lightspeed Memory Architecture (LMA II), nView, and the new Accuview AntiAliasing. We will be covering each feature in more detail a little later, but there are a few differences between the MX implementation and the Titanium implementation. For LMA II, rather than four independent memory controllers, we have two. Before you totally freak out, though, keep in mind that the Titanium uses 4 x 32-bit memory controllers, while the MX uses 2 x 64-bit. The total width of the path is therefore the same, but with only two controllers it is possible there may be a bottleneck at certain settings; I can't say for sure. Using web servers as an example, it's often better to have several small servers in a cluster than one or two big ones; the former is cheaper, and it can also be more efficient. I don't know if this holds true for a crossbar memory architecture, but it's something to think about.
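To see why the two layouts come out even on paper, here's a quick back-of-the-envelope sketch. The controller counts and widths are the ones described above; the 200 MHz DDR memory clock is an assumed figure purely for illustration, not a quoted spec.

```python
# Back-of-the-envelope comparison of the two LMA II controller layouts.
# The controller/width figures match the article; the memory clock below
# is an assumed example, not an official specification.

def aggregate_bus_bits(controllers, width_bits):
    """Total memory path width across all controllers, in bits."""
    return controllers * width_bits

def peak_bandwidth_gbs(bus_bits, mem_clock_mhz, ddr=True):
    """Peak theoretical bandwidth in GB/s; DDR transfers twice per clock."""
    transfers_per_sec = mem_clock_mhz * 1e6 * (2 if ddr else 1)
    return bus_bits / 8 * transfers_per_sec / 1e9

ti_layout = aggregate_bus_bits(4, 32)  # GeForce 4 Ti: 4 x 32-bit controllers
mx_layout = aggregate_bus_bits(2, 64)  # GeForce 4 MX: 2 x 64-bit controllers

print(ti_layout, mx_layout)  # both 128 bits wide in total

# At an assumed 200 MHz DDR memory clock, peak bandwidth is identical:
print(peak_bandwidth_gbs(128, 200))  # 6.4 GB/s either way
```

The raw numbers are the same; what the MX gives up is granularity, since four narrow controllers can service more independent requests at once than two wide ones.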

nView is something that existed before, but it's been greatly improved this time around. Graphics professionals now have access to a cheap gaming card that can power dual monitors at the same time. Other cards do this, but probably not at such a low price point. We covered nView previously, so feel free to peruse that article.

For those of you who absolutely must have AntiAliasing (removing the jaggies), the MX has that as well. It's been greatly improved since the GeForce 3, and is much clearer and faster than before. We're testing the MX 440's AA capabilities now, so we can continue this topic then.

That being said, what else do you need to know? Well, the GF4 MX is going to be cheap, and it will come in three flavours. The fastest (insert sarcasm here) MX will be the MX 460. The MX 440 follows, and the slowest of the group will be the MX 420. nVidia is officially supporting 64MB cards, and while the 460 and 440 will support DDR, the 420 will be saddled with SDRAM. Pricing should range from $180 for the 460 down to as low as $100 for the 420. While this may seem low, consider that the GeForce 4 Ti 4200 is going to be in the $200 range. For hardcore gamers on a budget, you'll certainly want to wait for that card.

The fact is, the MX is intended for the mainstream market. If you're wondering who that is, it's about 80% of the entire PC market, and of that 80%, probably another 80% are corporate users. Their IT guys aren't going to want to dish out $400 apiece for over 800 video cards.

For today's games, not counting all three Direct3D 8 games (insert more sarcasm), the MX should perform very well. We have an MX 440 review coming up (likely tomorrow), so be sure to check that out, but early indications show that for a low-cost card, it's actually not bad. Newer titles, well, that remains to be seen. I don't think anyone with a GeForce 3 or Radeon 8500 should dump those just yet.


 
     
 
 

Copyright 2001-2002 Viper Lair. All Rights Reserved. Site Design by