
FPS: The Quest for More: Ever wondered why exactly everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it? After all, we watch TV at 30fps and that's plenty.

Date: September 10, 2003



Anyone who even remotely follows PC hardware websites or magazines will have noticed that a lot of high-end computer components are tested for their ability to improve frame rates in games. You've probably seen in forums, or heard from your friends, about the latest drivers for their system and how many extra frames per second (or FPS) they got from them. About how the latest video card supports such and such, and gave them an extra 30 fps over their old card. Maybe you've just accepted all this as nothing more than a pissing contest, but is there a real reason? Graphics cards and CPUs that can run at 100 fps or more in UT2K3, 150 fps or more in Jedi Knight II, and not to forget Quake III Arena pushing upwards of 300 fps. So why do we do it? Apart from the obvious answer (because we can!), what is the actual point of it all? I mean, let's face it; we all watch TV at around 30 fps or less, right?

Quick Biology Lesson

I'm not a doctor, but I'm going to attempt (big emphasis on attempt) to explain a little about the human eye, as it directly relates to what we're talking about here: frames per second. The Human Eye is made up of various components, each with a different function.

The Human Eye

The Cornea is the curved surface at the front of the eye where refraction occurs.

The Lens of the eye is for corrections in focus at different distances.

The Pupil is the hole through which light enters, with the Iris being the coloured part that adjusts in size to accommodate the amount of light entering the eye.

The Retina is situated at the back of the eye on the inside surface. It's constructed with an arrangement of light sensitive receptors called Rods and Cones.

Rods interpret the position and intensity of light, and are essentially colour blind. Rods are fast and efficient. Cones are the part that determines colour: red, green and blue. Cones are more complex than Rods, and as such are not as fast. Cones make up a large proportion of the centre of the Retina.

At the very back of the eye is the Optic Nerve, which is the part that transmits the information perceived by the eye to the brain.

Light enters the eye, with each of the above parts playing their part in filtering what we see, and that information is processed into electrical signals which are passed on to the Optic Nerve. All of the information entering through the eye and transmitted along the Optic Nerve is streamed continuously to the Visual Cortex. Now, if we think of the brain as a really big hard drive, then just like a hard drive, the brain has only so much room to process the information received from the eye. Because of this, the Visual Cortex has a few tricks up its sleeve to let us receive the most information in the smallest and most efficient manner. The main one that's relevant to us is Motion Blur.

Motion Blur

If we look at a brick wall, it's not moving and will look the same to us no matter how many frames per second we view it at. We can see all the details available to us because it's a stationary object and the various parts of the eye don't have to work too hard. Now then, same brick wall, but this time we're going to jump onto a bike and ride past it. The faster we go, the less detail we can see and the more blurred the wall looks. This is the Visual Cortex adding motion blur to the perceived imagery, so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery flows smoothly from one point to the next, with no jumps or flickering to be seen. If the Visual Cortex didn't add this motion blur, we would still get to see all of the details, but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out at different points. It's pretty simple to test this.

Need a hand?

Take your hand and hold it in front of your face, palm towards you, and your fingers together. On the palm of your hand you can see all the lines and creases, the subtle differences in skin-tone; you may even be able to see a few veins at the joints of your fingers. Now move your hand slowly back and forth in front of you. You can still see the lines and creases, but the subtle shades in skin tone are less perceptible, and the veins have disappeared from perception completely.

You can also still see the separations between your fingers. Now move it back and forth fast. Gone are the lines and creases, the veins and the skin tones, replaced with a blurred image of your hand's shape, filled with the overall colouring of your skin. You also see trails following your hand. Move your hand fast enough and you can perceive your hand going back and forth, merging with those trails. But it's moving smoothly, with no stops and starts like a snapshot or one of those picture books you can flick through.

What's happening is that you simply don't have the room to process the information fast enough, so to make sure the world we perceive around us is smooth and flowing, motion blur is added whilst details are dropped. Without the motion blur, the world around us would be a very different environment, with fast-moving objects popping in and out of existence at high detail (damn lag...), making it very difficult for us to determine direction.

Let's go to the movies!

Ahh, the cinema, the big screen, the flicks (interesting slang word, the flicks...). Isn't it great? All that moving-imagery goodness on a huge screen, your eyes naturally drawn to it since the rest of the theatre is pitch black. Ever thought about why it's so dark in there?

I've seen the light!

Movies run at 24 fps and they look perfectly smooth, so surely 24 fps is enough for the eye to perceive moving imagery, right?

Ever had someone shine a bright light into your eyes? When they take the light away, you can still see an afterimage of that light for a while. The darker your surroundings, the stronger the impression that afterimage makes on your retina. The same effect happens in the cinema: you perceive an afterimage of the previous frame, which your mind blends with the next frame.

On the big screen, the image is projected in its entirety, one complete frame at a time, which in turn gives us an afterimage effect.

Film frames also contain motion blur, which, much like the blur added by the Visual Cortex, helps maintain the illusion of smooth moving imagery.

The faster the object, the more motion blur can be seen

Using a combination of the above, we are fooled into believing that the image we see is a smooth, flowing picture. But let's face it; we can't all sit in the dark at home to watch TV or play computer games.


Let's talk TV. PAL runs at about 25 frames per second, whereas NTSC runs at about 30 frames per second. Now, regardless of which format we choose here, neither of these is actually a high enough frame rate on its own to give us the perception of smooth moving imagery. What's that you say? Your TV looks fine? Of course it does, because the moving imagery you are looking at is also displayed at a higher refresh rate. Unlike the big screen, TVs don't display one complete image after another, but draw the image line by line horizontally, which works out to 60 redraws or refreshes every second. For NTSC, you have 30 fps of content but 60 refreshes of the screen per second. This amounts to each frame being drawn twice (strictly speaking, as two interlaced fields) and therefore a higher effective refresh rate.
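The frame-versus-refresh arithmetic above can be sketched in a few lines. This is a toy illustration only (the function name is mine, and the exact broadcast timings — NTSC's 29.97 fps and interlaced fields — are deliberately ignored):

```python
def refreshes_per_frame(refresh_hz: float, content_fps: float) -> float:
    """How many times the display redraws each frame of content."""
    return refresh_hz / content_fps

# NTSC-style: 30 fps content on a screen refreshing 60 times a second
print(refreshes_per_frame(60, 30))  # 2.0 -> each frame drawn twice

# PAL-style: 25 fps content on a 50 Hz screen gives the same doubling
print(refreshes_per_frame(50, 25))  # 2.0
```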

Again, like the big screen, Motion Blur makes its presence known. Want to see this? Go get an action DVD, anything with fast moving objects. Now pause it whilst that object is moving. Looks blurred doesn't it, yet the DVD has frozen that point of the film on one singular frame.

Captain's log...

Using a succession of moving images, the two refreshes per frame fool us into believing there are two frames for every one frame, and with the motion blur, the eye believes we are watching a smoothly flowing picture.


Computer Games

You can clearly see the refresh in this picture

Like the TV, your computer monitor (if it's a Cathode Ray Tube) refreshes by drawing the screen line by line horizontally, but unlike the TV, a monitor and video card don't add extra frames. If your game renders at 30 fps, you will GET 30 fps. Since these are 30 perfectly rendered frames, i.e. with no motion blur, your eyes are not fooled in the slightest. Now, we have already covered why we need motion blur: to maintain the illusion of smooth moving imagery. So why don't we add motion blur, then? If we were to add motion blur in some fashion here, the imagery would be counterproductive to say the least, especially in games such as Quake III Arena or Unreal Tournament 2003, not to mention the simple logistics of creating motion blur for an object that doesn't travel a predetermined path.

Picture it: you're playing your favourite First Person Shooter and about to go for that head shot. You're running at 30 fps and the game has added motion blur, or if you like, a trail of images that blend from one to the next. With the imagery moving so fast and the motion blur added, there's no way to determine exactly where the head we're aiming at actually is. To the eye, it would look to be in more than one place at one point in time. Now try and shoot the correct one.

Conversely, same game, same frame rate, no motion blur. This time we get a very strange effect, with our enemy appearing to 'warp' or 'flicker' directly from one position to the next. Just because they look to be in a given position doesn't mean they actually are, however; they could be somewhere in between those two points. Again, go ahead and make your shot.

What we need to do here is compensate for the lack of motion blur, which is done by increasing the frame rate beyond that discernible to the human eye. But just how high do we need to go? This varies from person to person, and there's a nice easy way to determine your own threshold. Your computer monitor refreshes the screen a set number of times per second. If you set the screen to, say, 60Hz and then look at it out of the corner of your eye (i.e. don't look directly at it, but to one side of it peripherally), you should be able to see the scan lines and refreshes as a flickering of the screen. Imagine looking at that constantly; the word headache springs to mind.

Now increase the refresh rate until you can no longer see those scan lines. Chances are that for Joe Average it will be 72Hz or higher. Remember, this means the screen is refreshing 72 times every second, or if you like, at 72 fps. So it's safe to say that Joe Average needs a MINIMUM of 72 fps of perfect pictures streamed to the eye to maintain the illusion of smooth moving imagery. This is the minimum, not the average or the highest, but the minimum. This is why your games have all sorts of options to tweak the graphics detail to suit. This is why everyone is out there furiously tweaking and buying new components to increase their frame rates in games. This is why people jump onto the latest drivers for their components, all in an attempt to squeeze out those extra few frames.
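To put those refresh rates in perspective, it helps to look at how long each frame actually stays on screen. A quick sketch (the function name is my own, for illustration):

```python
def frame_interval_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# Cinema, TV, and the refresh rates discussed above
for fps in (24, 30, 60, 72, 100):
    print(f"{fps:>3} fps -> {frame_interval_ms(fps):.1f} ms per frame")
```

At 72Hz each frame lingers for only about 14 milliseconds, which is why the flicker finally drops below what most people can notice.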

As technology and games progress, the need for more powerful, faster components grows, and websites like this one test those components to see just how much of an impact they make on average frame rates in various games. Anti-aliasing, 32-bit colour, anisotropic filtering, screen resolution, not to mention the simple fact that everyone is unique and will want to display their game to their own preferences, all take a toll on your computer's ability to maintain a smooth, high frame rate. And that frame rate, for Joe Average, we know to be 72fps or higher.


Now, just because we can benchmark Unreal Tournament 2003 at 100 fps average on certain maps doesn't mean we are going to constantly get 100 fps. It just means that for that benchmark run we had an average of 100 fps, with the actual figure fluctuating higher and lower than that. Higher doesn't matter (at least not to the eye); it's the lowest frame rate we're concerned with here. If during the action sequences our frame rate drops all the way down to, say, 40, we will perceive our enemies popping quickly in and out of existence from one position to the next. For the most part, it's really only fast-paced action games, with a lot of high-speed objects, that make people want high frame rates. But even slower-paced games can benefit from higher frame rates, with perhaps transitions between screens looking smoother.
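That gap between a benchmark's average and its worst moment is easy to demonstrate with numbers. The frame times below are made-up sample data, not a real benchmark run, and the helper is my own:

```python
def fps_stats(frame_times_ms):
    """Average fps over the whole run, and the worst instantaneous fps."""
    instantaneous = [1000.0 / t for t in frame_times_ms]
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    return avg_fps, min(instantaneous)

# Mostly fast 8 ms frames (125 fps), with a few 25 ms spikes (40 fps)
# during a heavy action sequence
times = [8.0] * 90 + [25.0] * 10
avg_fps, min_fps = fps_stats(times)
print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
# average: 103 fps, minimum: 40 fps
```

A run can average over 100 fps and still spend its busiest moments at 40 fps, well under the ~72 fps threshold, which is exactly when you'd notice the stutter.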

Everything I've mentioned here isn't really that in-depth, but it should hopefully give you some understanding as to how things work and why. I've not even touched on the subject of 100Hz 'Flicker Free' TVs, or the effects of different colours as perceived by the eye.

I'm sure some of you out there are thinking, well, I'm happy playing games at 30 fps, who are you to turn around and say different? Frankly, I'm not saying different. What I am saying is that there are people out there who are not happy with 30 fps, and now hopefully you can understand why. CRT monitors are considered 'Flicker Free' at about 72Hz for a reason, and simply put, it's to compensate for the lack of the motion blur, afterimages and other trickery we live with every day in TV and films. The Human Eye is a marvellous, complex and very clever thing indeed, but even it needs a little help now and then. At the end of the day, it's all down to end-user preference, but personally I prefer a flicker-free display and flicker-free gaming.

If you have any comments, be sure to hit us up in our forums.


