In November 2006, NVIDIA launched its flagship enthusiast video card, the 8800 GTX. The 8800 series (G80) marked a major step forward in graphics technology, including full support for DirectX 10 and Shader Model 4. Since then, there have been several flavours of the G80 catering to everyone from enthusiasts to mainstream users. While the $200 to $250 price point did have options, there really wasn't much to get excited about if you were a serious gamer.
Fast forward a year, and just last week NVIDIA unleashed the 8800 GT. Coming in a hair under the mainstream price point on average, NVIDIA's latest offers the potential for enthusiast-level performance.
| | GeForce 8800 GT |
|---|---|
| Core Clock (MHz) | 600 (660 for MSI) |
| Shader Clock (MHz) | 1500 (1650 for MSI) |
| Memory Clock (MHz) | 1800 (1900 for MSI) |
| Memory Bandwidth (GB/sec) | 57.6 |
| Texture Fill Rate (billion/sec) | — |
The chart above outlines NVIDIA's reference specs. Of note, the 8800 GTX has 128 stream processors and the 8800 GTS currently has 96; the latter will change to match the 8800 GT's 112, which is an interesting statistic considering the price. The 8800 GT's core and memory speeds are also higher than the GTS's, though its memory interface is narrower than that of both the GTS and GTX.
MSI NX8800GT-T2D512E-OC Video Card
Today we are going to be looking at a final, full retail version of MSI's NX8800GT video card. The MSI product is based directly on NVIDIA's reference design with MSI's own custom artwork on the cooler, so for now anyway, don't expect anything out of the ordinary visually in the physical package.
The box art features the same CGI female as the one silk-screened on the heatsink. The NX8800GT was wrapped in an antistatic bag and placed in its own foam compartment, with all the extra cables and accessories tucked away in the compartment below.
The MSI NX8800GT had no problems fitting in our Cooler Master Stacker and Antec 900. Should your case have fan attachments or hard drive clamps near where the rear of the card sits, you may have to do a bit of housekeeping to clear the area. The card measures 9" in length, and the PCI-E power connection faces towards the front of the ATX case, so account for about an extra half inch for the power connector.
As the product name, MSI NX8800GT-T2D512E-OC, implies, the card is overclocked out of the box. The GPU core is clocked at 660 MHz, 60 MHz above reference. The shader clock is also overclocked, running at 1650 MHz, 150 MHz faster than the reference design. It is a PCI Express 2.0 design, which is forward-looking for upcoming motherboards, but it is also backwards compatible with current boards.
The MSI NX8800GT is equipped with 512MB of GDDR3 memory clocked at 1900 MHz and has a 256-bit memory interface capable of delivering up to 57.6 GB per second of memory bandwidth at reference clocks. While the 256-bit memory interface is a bottleneck when compared to the wider pipes of the GTS and GTX, MSI clocks the memory 100 MHz above NVIDIA's spec.
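The bandwidth figure falls straight out of the clock and bus width. A quick sketch of the arithmetic (the quoted GDDR3 clocks are already the effective, double-pumped data rates):

```python
def bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    # effective data rate (transfers/sec) x bytes per transfer, in GB/s
    return effective_clock_mhz * 1_000_000 * (bus_width_bits // 8) / 1e9

print(bandwidth_gb_s(1800, 256))  # reference 8800 GT: 57.6 GB/s
print(bandwidth_gb_s(1900, 256))  # MSI's 1900 MHz overclock: 60.8 GB/s
```

By the same formula, MSI's 100 MHz bump nudges peak bandwidth to 60.8 GB/s, still well short of the GTX's wider 384-bit bus.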
The GPU itself is based on the 65nm fab process, a shrink from the 90nm of the G80 series. The hardware features a fully unified shader core which dynamically allocates processing power to geometry, vertex, physics, or pixel shading operations. High Dynamic Range (HDR) lighting capability is present and will support 128-bit precision (32-bit floating point values per component). This will obviously improve image quality and allow for more true-to-life lighting and shadows. Dark objects can appear very dark, and bright objects can be very bright, with visible details present at both extremes, in addition to rendering completely smooth gradients in between.
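To see why floating-point precision matters here, consider a toy sketch (not the actual rendering pipeline, and the luminance numbers are made up): with an 8-bit-per-channel buffer, every value is clamped to 0–255, so two very different bright sources look identical, while a float buffer preserves their real ratio until tone mapping.

```python
# Clamp a scene luminance into an 8-bit LDR channel value.
ldr = lambda v: min(255, round(v * 255))

window, lamp = 40.0, 2.5            # hypothetical scene luminances
print(ldr(window), ldr(lamp))       # both clamp to 255 in 8-bit LDR
print(window / lamp)                # 16x brightness ratio kept in float
```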
Another new item, originally introduced with the G80, is Quantum Effects GPU-based Physics. Just as the name implies, physics calculations will be handled by the GPU creating a more realistic game environment. It will also free up the CPU to handle other items such as game AI.
The MSI NX8800GT uses the same single-slot cooler as NVIDIA's reference design. The aluminum heatsink makes full contact with both the core and the memory, which makes the cooler more efficient in the sense that it is doing more than just keeping the GPU cool. The cooler does get very warm, though not warm enough to cause instability in our test setup. Despite the rather smallish fan, there is a high-pitched whine on initial power-up that lasts about two seconds before the fan throttles back to a much quieter mode.
According to NVIDIA's specifications, power draw is much lower with the new 8800 GT, at about 110W under load. Obviously, running SLI will use more power, which is why we suggest a quality PSU in the 500W range and up. The MSI NX8800GT requires a PCIe power connection, which most modern PSUs have, but MSI does include an adapter dongle in the box.
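Our 500W suggestion can be sanity-checked with some rough napkin math. The system allowance and headroom factor below are our own assumed figures, not NVIDIA's official power budget:

```python
def psu_suggestion_w(card_watts, num_cards, system_watts=250, headroom=1.25):
    # system_watts: assumed draw for CPU, board, drives, and fans
    # headroom keeps the PSU well under full load for stability
    return (card_watts * num_cards + system_watts) * headroom

print(round(psu_suggestion_w(110, 1)))  # single card: 450 W
print(round(psu_suggestion_w(110, 2)))  # SLI: 588 W, in line with "500 W and up"
```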
Other than the card, MSI tosses in the required cables, as well as one DVI-to-VGA adapter. There are also the manual and driver CD, but that is it. No game bundle is included, and unless an A-list title were on offer, we have no problem with this decision if it keeps the cost to the consumer down.
Those of you on the HD bandwagon should know that the MSI NX8800GT is HDCP compliant. Above, we can see the outputs for dual-link DVI and VIVO. HD DVD and Blu-ray media are also supported via NVIDIA's PureVideo HD technology: the card supports hardware decode for playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies, and High-Bandwidth Digital Content Protection (HDCP) support in the hardware allows playback of the aforementioned discs on supported screens.
AMD X2 5000+
MSI K9N Diamond SLI
2x1024MB Corsair XMS2 8000
Windows Vista Business 32-Bit
NVIDIA Driver Version: ForceWare 169.04
We'll be pitting the MSI NX8800GT directly against MSI's own NX8800GTX. We do not have ATI's latest to compare against, but the NX8800GTX should make for an interesting comparison as it is considerably more expensive than the 8800 GT.
The games to be used for benchmarking are as follows:
Crysis SP Demo
Unreal Tournament 3 Demo
Call of Duty 4 Demo
The driver settings were manually configured via the video driver's control panel: antialiasing and anisotropic filtering were disabled, and texture filtering was set to "Quality". All games were set to their highest allowable default settings, with the exception of Crysis, which we will explain shortly in the benchmark tests. All games were tested using FRAPS, with audio enabled, at a set resolution of 1680x1050.