Computer Games
[Image: the refresh is clearly visible in this picture.]
Like the TV, your computer monitor (if it's a Cathode Ray Tube) refreshes by drawing the screen line by line horizontally, but unlike the TV, a monitor and video card don't add extra frames. If your screen draws at 30 fps, you will GET 30 fps. Since these are 30 perfectly rendered frames (i.e. no motion blur), your eyes are not fooled in the slightest. Now, we have already covered why we need motion blur: to maintain the illusion of smooth moving imagery. So why don't we add motion blur then? If we were to add motion blur in some fashion here, the imagery would be counter-productive to say the least, especially in games such as Quake III Arena or Unreal Tournament 2003, not to mention the simple logistics of creating motion blur for an object that doesn't travel a predetermined path.
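To give you a feel for what 'adding motion blur' would actually involve, here's a minimal sketch of one classic way it can be faked: render the scene several times within a single frame interval and average the results. Everything in it (the tiny 1-D 'renderer', the names, the numbers) is made up purely for illustration, but it makes the cost obvious: every blurred frame is several complete renders.

```cpp
#include <cstdio>
#include <vector>

constexpr int kWidth = 16;              // a tiny 1-D "screen" of 16 pixels

// Toy renderer: a single bright object moving left to right at 60 pixels/second.
std::vector<float> renderSceneAt(double t)
{
    std::vector<float> pixels(kWidth, 0.0f);
    int pos = static_cast<int>(t * 60.0) % kWidth;
    pixels[pos] = 1.0f;
    return pixels;
}

// Blend several renders spread across one frame interval into one blurred frame.
std::vector<float> renderMotionBlurred(double frameStart, double frameLength, int subFrames)
{
    std::vector<float> blurred(kWidth, 0.0f);
    for (int i = 0; i < subFrames; ++i) {
        double t = frameStart + frameLength * i / subFrames;
        std::vector<float> sub = renderSceneAt(t);      // another full render of the scene
        for (int p = 0; p < kWidth; ++p)
            blurred[p] += sub[p];
    }
    for (float& px : blurred)
        px /= subFrames;                                // the average is the blur trail
    return blurred;
}

int main()
{
    // One 30 fps frame (1/30th of a second), built from 4 sub-frame renders: the
    // object ends up smeared across the pixels it passed through during that frame.
    std::vector<float> frame = renderMotionBlurred(0.0, 1.0 / 30.0, 4);
    for (float px : frame)
        std::printf("%.2f ", px);
    std::printf("\n");
    return 0;
}
```

Run it and the moving object shows up smeared across every pixel it passed through, which is exactly the 'more than one place at one point in time' problem described next.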
Picture it: you're playing your favourite First Person Shooter and about to go for that head shot. You're running at 30 fps and the game has added motion blur, or if you like, a trail of images that blend from one to the next. With the imagery moving so fast and the motion blur added, there's no way to determine the exact position the head we're aiming at is actually in. To the eye, it would look to be in more than one place at one point in time. Now try and shoot the correct one.
Conversely, same game, same frame rate, no motion blur. This time, we have a very strange effect, with our enemy appearing to 'warp' or 'flicker' directly from one position to the next. Just because they look to be in one position doesn't mean they actually are, however; they could be somewhere in between those two points. Again, go ahead and make your shot.
What we need to do here is compensate for the lack of motion blur, which is done by increasing the frame rate beyond that discernible to the human eye. But just how high do we need to go? This varies from person to person, and there's a nice easy way to determine your own threshold. Your computer monitor refreshes the screen at so many frames per second. If you set the screen to, say, 60Hz and then look at the screen out of the corner of your eye (i.e. don't look directly at it but to one side of it, peripherally), you should be able to see the scan lines and refreshes as a flickering of the screen. Imagine looking at that constantly; the word 'headache' springs to mind.
Now increase the refresh rate until you can no longer see those scan lines. Chances are that for Joe Average it will be 72Hz or higher. Now remember, this means that the screen is refreshing 72 times every second, or if you like, at 72 fps. So it's safe to say that Joe Average needs a MINIMUM of 72 fps of perfect pictures streamed to the eye to maintain the illusion of smooth moving imagery. This is the minimum, not the average or the highest, but the minimum. This is why your games have all sorts of options to tweak the graphic details to suit. This is why everyone is out there furiously tweaking and buying new components to increase their frame rates in games. This is why people jump onto the latest drivers for their components, all in an attempt to squeeze out those extra few frames.
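To put that figure into perspective, here's the simple arithmetic as a quick sketch: the frame budget is just 1000 ms divided by the frame rate, so at 72 fps the entire game (input, physics, rendering) has roughly 13.9 ms to produce each and every frame.

```cpp
#include <cstdio>

int main()
{
    // Frame-time budget at a few common rates: 1000 ms divided by frames per second.
    const double rates[] = {30.0, 60.0, 72.0, 100.0};
    for (double fps : rates) {
        double budgetMs = 1000.0 / fps;   // time available to produce one complete frame
        std::printf("%5.0f fps -> %4.1f ms per frame\n", fps, budgetMs);
    }
    return 0;
}
```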
As technology and games progress, the need for more powerful, faster components grows, and as such websites like this one are testing these components to see just how much of an impact they make on average frame rates in various games. Anti-aliasing, 32-bit colour, anisotropic filtering, screen resolution, not to mention the simple fact that everyone is unique and will want to display their game to their own preferences, will all take a toll on your computer's ability to maintain a smooth, high frame rate. That frame rate for Joe Average we know to be 72 fps or higher.
Conclusion
Now, just because we can benchmark Unreal Tournament 2003 at 100 fps average on certain maps, this doesn't mean that we are going to constantly get 100 fps. It just means that for that benchmark we had an average of 100 fps, with the actual figure fluctuating higher and lower than that. Higher doesn't matter (at least not to the eye); it's the lowest frame rate we're concerned with here. If during the action sequences our frame rate drops all the way to, say, 40, we will perceive our enemies popping quickly in and out of existence, from one position to the next. For the most part, it's really only fast-paced action games, with a lot of high speed objects, that make people want to get the high frame rates. But even the slower paced games can benefit from higher frame rates, with perhaps transitions between screens looking smoother.
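To illustrate with numbers (invented ones, not from any real benchmark run), here's a quick sketch that takes a list of per-frame times and reports both the average and the minimum frame rate; the average looks healthy while the worst frames tell the real story.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

int main()
{
    // Hypothetical per-frame render times (in milliseconds) over a short stretch of
    // play: mostly quick frames, with a burst of slow ones during heavy action.
    std::vector<double> frameTimesMs = {8, 9, 8, 10, 9, 25, 24, 26, 9, 8, 9, 10};

    double totalMs = 0.0;
    double worstMs = 0.0;
    for (double ms : frameTimesMs) {
        totalMs += ms;
        worstMs = std::max(worstMs, ms);
    }

    double averageFps = 1000.0 * frameTimesMs.size() / totalMs;  // what the benchmark reports
    double minimumFps = 1000.0 / worstMs;                        // what the eye notices

    std::printf("average: %.0f fps, minimum: %.0f fps\n", averageFps, minimumFps);
    return 0;
}
```

With those made-up frame times the run averages around 77 fps, yet the slowest frames dip below 40 fps, and that dip is the part the eye actually notices.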
Everything I've mentioned here isn't really that in depth, but it should hopefully give you some understanding as to how things work and why. I've not even touched on the subject of 100Hz 'Flicker Free' TVs, or the effects of different colours as perceived by the eye.
I'm sure some of you out there are thinking: well, I'm happy playing games at 30 fps, who are you to turn around and say different? Frankly, I'm not. What I am saying is that there are people out there who are not happy with 30 fps, and now hopefully you can understand why. CRT monitors are considered 'Flicker Free' at about 72Hz for a reason, and simply put, it's to compensate for the lack of motion blur, afterimages and other trickery we live with every day in TV and films. The Human Eye is a marvellous, complex and very clever thing indeed, but even that needs a little help now and then. At the end of the day, it's all down to end-user preference, but for me personally, I prefer a flicker-free display and flicker-free gaming.
If you have any comments, be sure to hit us up in our forums.