nVidia GeForce FX Preview: After several delays, it appears that nVidia's next-gen product is on the horizon. We take a peek at what is dubbed the GeForce FX.
nVidia is in the unusual position of playing catchup. For the past few years, starting with the TNT2, they have arguably led their competitors in image quality and performance. Their last product, the GeForce4, simply powered past everyone else, and provided 3D features many gamers had been waiting for, such as improved AA and, of course, faster framerates. With the release of the ATi Radeon 9700, though, the GeForce4 looks like yesterday's news. The Radeon 9700 gives people many things the GF4 lacks, such as DX9 support, improved image quality and, most importantly for gamers, speed.

Enter the GeForce FX, aka the NV30. nVidia promises this to be a no-excuses card, though of course we'll need to look at final silicon before supporting that claim. What I can tell you is that it aims to match, and exceed, the features and performance of the Radeon 9700. How will it attempt to do that? Here are some of the features. Before getting into the new stuff, we'll first go over some of the things that are familiar to most of us.

The Fab Process, AGP 8X and Memory

Those of you who follow the industry will know there was some delay in the move to TSMC's 0.13 micron process. Fitting so many transistors (125 million of them) onto a chip is no easy task, but with the smaller fabrication process, the GeForce FX can run both faster and cooler. Although AGP 8X is nothing new for nVidia, this marks the first time they have built a new card with native AGP 8X support. In theory, AGP 8X doubles the available bandwidth of the interface, but how and when that bandwidth gets used will depend a lot on the applications. DDR2 is an upgrade to current DDR technology and provides quite an improvement in memory bandwidth; together with Lightspeed Memory Architecture III and an optimized 128-bit memory bus, memory performance should be much better than before. In case you're wondering why this is important, high resolutions, millions of colours and anti-aliasing all rely on these improvements.

DirectX 9 and Cg

The GeForce FX (GFFX) will support the DirectX 9 specification and, in fact, do more than that. DirectX 8 introduced Pixel and Vertex Shaders, and DirectX 9 upgrades those specifications... by a lot. The problem with the original specifications was that they were more customizable than programmable. Developers had to rewrite shaders for each piece of hardware, which meant a lot of work, and is probably one reason we don't see a whole lot of DX8 games taking advantage of our cards' features. With the introduction of Cg and the new DX9 specifications, Pixel and Vertex Shaders, now at version 2.0+, are far more programmable and unified than before. When we look at CineFX, you'll see why.

CineFX Architecture

The CineFX architecture is a new concept for nVidia. In a nutshell, it simplifies the creation of shaders for developers by putting shader execution support in hardware. In conjunction with the Cg graphics language, it should make it a lot easier to bring PC graphics to Toy Story (or better!) levels. None of this would be possible without the engine's ability to render true 128-bit colour, and to do so quickly; the more precision you have in the red, green, blue and alpha values, the smoother colours and gradients appear. The CineFX engine also supports 1024 instructions in a single rendering pass. Objects such as fur and grass used to require multiple passes to render, but on the GFFX, only one.
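For the curious, the arithmetic behind "128-bit" is simple, with today's standard 32-bit colour as the comparison point:

    32-bit colour:  4 channels (R, G, B, alpha) x  8-bit integer = 256 shades per channel
    128-bit colour: 4 channels (R, G, B, alpha) x 32-bit float   = 128 bits per pixel

With floating-point channels, long shader programs can pass intermediate results around without rounding them off into visible banding.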
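Since Cg is central to all of this, here is a minimal sketch of what a Cg vertex program looks like. To be clear, this is our own illustration of the language, not code from nVidia's toolkit, and the names (VertexOutput, modelViewProj) are ours:

    // Minimal Cg vertex program: transform the vertex, pass the colour through.
    struct VertexOutput {
        float4 position : POSITION;
        float4 color    : COLOR0;
    };

    VertexOutput main(float4 position : POSITION,
                      float4 color    : COLOR0,
                      uniform float4x4 modelViewProj)
    {
        VertexOutput OUT;
        // One readable matrix multiply, instead of hand-tuned,
        // per-vendor vertex assembly.
        OUT.position = mul(modelViewProj, position);
        OUT.color    = color;
        return OUT;
    }

The appeal is that the same source compiles down to whatever the underlying hardware supports, so developers write an effect once rather than once per card.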
Intellisample Technology

Anti-aliasing has come a long way these past several years, but there are always improvements to be made. Anti-aliasing is the key to smoothing out the rough edges, or "jaggies," that often appear along the edges of 3D geometry; the problem with a lot of AA techniques is that the image tends to blur or look unnatural. Intellisample's intelligent anti-aliasing captures a higher-resolution version of each image, then resizes and resamples it for output to your screen, smoothing away the imperfections. nVidia improves on its existing technique with a new 6XS mode: where 4XS took scattered sample points within each pixel to produce a more natural result, 6XS, as the name implies, calculates 50% more samples, and should deliver smoother visuals than the current 4X and 4XS modes. The feature is available through the control panel, so even current games can take advantage of it. Other improvements are lossless Z-buffer (depth) and colour compression. nVidia claims improved image quality, through AA, with no performance hit; as you know, high resolutions and high levels of AA are bandwidth killers, so it'll be interesting to see how this turns out.
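To put rough numbers on "bandwidth killers", here is the simple arithmetic, with the caveat that the 500MHz DDR2 clock is the widely reported figure for the launch part, not something nVidia has confirmed to us:

    AGP 8X: 66MHz base clock x 8 transfers/clock x 4 bytes  = ~2.1GB/s (double AGP 4X's ~1.06GB/s)
    Memory: 500MHz DDR2 (1000MT/s) x 16 bytes (128-bit bus) = 16GB/s

Even 16GB/s disappears quickly once you render, blend and resample a high-resolution scene several times over per frame, which is exactly what the compression tricks above are meant to offset.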
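As for the resample step itself, the arithmetic is just a blend of the extra samples. Purely as an illustration (Intellisample does this in dedicated hardware, not in a shader, and the sample offsets here are made up), the idea expressed in Cg looks like this:

    // Illustration only: average six samples of a higher-resolution
    // scene for each output pixel, as in 6X-style anti-aliasing.
    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D highResScene,
                uniform float2 offsets[6]) : COLOR
    {
        float4 sum = float4(0, 0, 0, 0);
        for (int i = 0; i < 6; i++)
            sum += tex2D(highResScene, uv + offsets[i]);
        // Equal weights here; the real sample pattern and
        // weighting are nVidia's own.
        return sum / 6.0;
    }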
Final Words

We've only provided a general overview of what I felt would matter to most consumers; there are other features, such as nView and Digital Vibrance Control, which we didn't really cover. Admittedly, I feel the GeForce FX will be something a lot of people are going to want for their "Doom" box. The specifications look impressive, easily better looking and faster than the GeForce4, and if it really is at least double the Ti4600's speed, it should be 20-30% faster than ATi's best. That is only speculation, though, and we can only support nVidia's claims once we test the GFFX in house. The questions that remain are: when, and how much? We're looking at a January 2003 release and an estimated street price of about $399, which is standard for new top-end cards. Should ATi sweat right now? Probably not, as their cards are available now and provide plenty of speed for gamers. They had better not rest on their laurels too long, though, as the GeForce FX is looking to be a Radeon 9700 killer.