The high end of the video card market is ruled by a mere few, but in the low to mid-range segment the choices are plentiful. With fewer dollars for consumers to play with, the wrong choice results in a card poorly suited to their needs. Cards get very specialized at the sub-$200 price point, and while they will not be gaming kings like $400+ hardware, the expectation is that they should support modern software and provide flexibility in whatever application the buyer decides upon.
This is Micro-Star International's vision of the ATI Radeon HD 2600 XT. The artwork on the box features a Borg Queen inspired avatar, complemented by a variety of marketing points meant to draw the attention of the potential buyer. The key graphic is the "OC Edition" badge, which in MSI's Diamond series of video cards means the card is factory overclocked. Granted, overclocking is pretty easy with the variety of tools available online, but this out-of-the-box overclock is fully covered by MSI's warranty.
The MSI red, the product model and the female avatar are all present on the card's cooling shroud, while the PCB itself is black. AMD's reference card, built on the 65nm fab process, runs at an 800MHz core clock. The MSI RX2600XT Diamond (unless otherwise stated, for the rest of the review we'll simply refer to the card as the MSI RX2600XT or some derivative of it) overclocks the core to 850MHz. The MSI RX2600XT supports DirectX 10 with Shader Model 4.0.
The MSI RX2600XT is not an overly large card, measuring right at 9.5" from back to front. Including the PCIE connection, the card is 4.5" high. Scattered along the back are the memory chips.
The MSI RX2600XT has 512MB of Samsung GDDR4. Like the GPU core, the memory is also overclocked, from the reference 2.2GHz (effective) to 2.3GHz on MSI's product. The memory chips on the back of the card are neither actively nor passively cooled, so right now we don't have high hopes for overclocking success with the memory.
The MSI RX2600XT features a large two-slot cooler, which is needed given the factory overclock. Despite its size, the MSI RX2600XT is quieter than one might expect.
The shot above shows that the heatsink proper is limited to the GPU itself; the memory chips beneath the cooler sport heatsinks of their own. The GPU heatsink is constructed of aluminum and features heatpipes to assist in cooling. The approach works great for CPUs, so why not for video cards? That said, we feel MSI would have been better served by moving the cooler closer to the video card's PCB and putting additional cooling on the memory on the opposite side of the card.
Included with the card are a user manual, a driver/software CD, one DVI-to-VGA adapter, and video-out cables for S-Video, composite and component. There is also a DVI-to-HDMI adapter, which can output both video and audio.
Above we can see the outputs of the MSI RX2600XT. The two DVI connections both support HDMI via the included adapter, and the connector at the end is for the previously mentioned analog output cables. The grate next to the outputs lets hot air exhaust from the cooler. It would have been better to seal off the heatsink shroud so that the hot air actually exits the case; there is no seal here, and there is noticeable airflow around this area, not all of which escapes out the rear.
Operating System: Windows XP Professional (5.1, Build 2600) Service Pack 2
Processor: AMD Athlon 64 X2 Dual Core Processor, MMX, 3DNow (2 CPUs), ~2.6GHz
Memory: 2048MB RAM
Going up against the MSI RX2600XT will be MSI's own NX 8600 GTS.
Test Software is as follows:
Bioshock
Enemy Territory: Quake Wars
Unreal Tournament 3
Call of Duty 4
World in Conflict
We selected the highest resolutions at which the games remained playable without sacrificing image quality. 2x anti-aliasing was enabled, and each card's control panel was set to Quality.
Bioshock was tested at 1680x1050 for both cards, but for the rest of the tests the MSI RX2600XT was run at 1440x900. The MSI 8600GTS couldn't handle that resolution, so we dropped it back one step to 1360x768. This didn't give us a true apples-to-apples comparison, but any higher resolution made the games difficult to play on the 8600GTS.