FPS: How many frames per second can the human eye see?
Human Eye Frames Per Second
02/21/2001 10:30:00 AM MST Albuquerque, NM
By Dustin D. Brand; Owner AMO
How many frames per second can our wonderful eyes see?
This article is dedicated to a friend of mine, Mike.
There is a common misconception that our eyes can only interpret 30 frames per second. This misconception dates back to the earliest motion photography: Eadweard Muybridge's famous sequence of a galloping horse, which proved that at certain points all four hooves leave the ground at once. Films evolved from these early experiments to run at 24 frames per second, which has been the standard for close to a century.
A movie theater film running at 24 FPS (frames per second) has an explanation. A theater uses a projector, so each frame is shown on the large screen all at once. Because the camera captures motion blur within each frame, and because each frame is drawn all at once, those few frames still add up to a lifelike, perceptually smooth picture. I'll explain the human eye and how it works in detail later in this multi-page article.
Since the first CRT TV was released, televisions have been running at roughly 30 frames per second. TVs in American homes use the standard 60 Hz (hertz) refresh rate, and because the picture is interlaced, that works out to 60/2 = 30 frames per second. A TV works by drawing each horizontal line of resolution piece by piece, using an electron gun to excite the phosphors on the TV screen. And because the frame rate is half the refresh rate, transitions between frames go a lot smoother. Without going into detail and making this a 30-page article on the physics, I think you'll understand those points.
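To make that arithmetic concrete, here is a tiny Python sketch of the interlaced math just described; the numbers are illustrative, nothing more:

```python
# Effective frame rate of an interlaced display: each full frame takes
# two passes (odd lines, then even lines), so the frame rate is half
# the refresh rate.

def interlaced_frame_rate(refresh_hz):
    """Frames per second shown by an interlaced display at refresh_hz."""
    return refresh_hz / 2.0

print(interlaced_frame_rate(60))  # NTSC-style TV: 60 Hz -> 30.0 FPS
print(interlaced_frame_rate(50))  # PAL-style TV:  50 Hz -> 25.0 FPS
```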
Moving on with frame rate: motion blur is, again, a very important part of making video look seamless. With motion blur, the two refreshes per frame give the impression of two frames to our eyes, which makes a well-encoded DVD look absolutely incredible. Another factor to consider is that neither movies nor video dip in frame rate during complex scenes. With no frame rate drops, the action stays seamless.
Computer Games and Their Industry-Driving Use of Frames Per Second
It's easy to understand TV and movies and the technology behind them. Computers are much more complex, and the most complex piece of all is the actual physiology/neuroethology of the visual system. Computer monitors cost much more than a TV CRT (Cathode Ray Tube) of comparable size because their phosphors and dot pitch are much smaller and much closer together, making much greater detail and much higher resolutions possible. Your computer monitor also refreshes much more rapidly; if you look at your monitor through your peripheral vision, you can actually watch the lines being drawn on the screen. You can also observe this technology difference when a TV program shows a computer monitor in the background.
A frame or scene on a computer is first set up by your video card in a frame buffer. The frame/image is then sent to the RAMDAC (Random Access Memory Digital-to-Analog Converter) for final display on your display device. Liquid Crystal Displays and FPD plasma displays use a strictly digital representation, so the transfer of a scene is much quicker. After the scene has been sent to the monitor, it is rendered and displayed in full. One catch: the faster you do this, and the more frames you plan on sending to the screen per second, the better your hardware needs to be. Computer programmers and game developers working strictly with computers can't reproduce motion blur in these scenes, so even though 30 frames are displayed per second, the scenes don't look as smooth as on a TV. Well, that is, until we get to more than 30 FPS.
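As a rough sketch of that pipeline, here is a minimal double-buffered render loop in Python. Everything in it (render_scene, scan_out, the list "buffers") is a hypothetical stand-in, not a real graphics API; it only illustrates the frame-buffer-then-output flow described above:

```python
def render_scene(frame):
    """Hypothetical stand-in for the video card filling a frame buffer."""
    return ["pixels for frame %d" % frame]

def scan_out(buffer):
    """Hypothetical stand-in for the RAMDAC / digital output stage."""
    print("displaying:", buffer[0])

def render_loop(frames_to_draw):
    front = ["blank"]   # buffer currently being shown
    back = ["blank"]    # buffer currently being drawn into
    for frame in range(frames_to_draw):
        back = render_scene(frame)   # build the next frame off-screen
        front, back = back, front    # swap: the finished frame goes "front"
        scan_out(front)              # send the front buffer to the display

render_loop(3)
```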
NVIDIA, a computer video card maker that recently purchased fellow video card maker 3dfx, just finished a GPU (Graphics Processing Unit) for Microsoft's Xbox. More rendering capability and memory, more transistors, and more instructions per second equate to more frames per second in a computer video game, or on computer displays in general. There is no motion blur, so at 30 FPS the transition from frame to frame is not as smooth as in movies. For example, NVIDIA/3dfx put out a demo that runs half the screen at 30 FPS and the other half at 60 FPS. The result? There is a definite difference between the two halves, with 60 FPS looking much better and smoother than 30 FPS.
Even if you could put motion blur into games, it would be a waste. The human eye perceives information continuously; we do not perceive the world through frames. You could say we perceive the external visual world through streams, and only lose it when our eyes blink. In games, implemented motion blur would cause the game to behave erratically, because the programming couldn't be as precise. Take a game like Unreal Tournament: with motion blur, there would be problems calculating the exact position of an object (say, another player), so it would be really tough to hit anything with your weapon. With motion blur, the object in question would not really exist in any of the places where the "blur" is positioned; the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel and each object is exactly where it should be in the set space and time.
The overwhelming solution to more realistic gameplay and computer video has been to push past the misconception that the human eye can only perceive 30 FPS. Pushing past 30 FPS to 60 FPS and even 120 FPS is possible; ask the video card manufacturers, an eye doctor, or a physiologist. We humans CAN and DO see more than 60 frames a second.
With computer video cards and computer programming, the actual frame rate can vary. Microsoft came up with a great way to handle this when building one of their games (Flight Simulator): the ability to lock the frame rate.
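Here is a minimal sketch of the idea, assuming the usual approach of sleeping off each frame's unused time budget (this illustrates frame-rate locking in general, not Flight Simulator's actual code):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds allotted to each frame

def run_locked(num_frames):
    for frame in range(num_frames):
        start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # wait out the rest of the budget

run_locked(90)   # roughly three seconds of simulated 30 FPS
```

However fast the hardware is, the loop never finishes a frame early, so the frame rate stays pinned at the target.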
The Human Eye and its real capabilities - ta-da!
This is where the article gets even longer, but read on, please. I will explain how the human eye can perceive much more than the misconceived 30 FPS, well past 60 FPS, even surpassing 200 FPS.
We humans see when light is focused onto the retina of the eye by the lens. Light rays reach our eyes, well, at the speed of light. I must stress again that we live in a world where visual information is continuously streamed to us. Our retinas interpret that light with two types of specialized cells: the rods and the cones. Rods and cones are the cells on the surface of the retina responsible for receiving the focused light, and their degeneration is a leading cause of blindness.
Intensity, color, and position (relative to the cell on the retina) are all forms of information transmitted from our retinas to our optic nerves. The optic nerve in turn sends this data through its pipeline (at nerve impulse speed) on to the visual cortex of the brain, where it is interpreted.
Rods are the simpler of the two cell types; a rod really only interprets light intensity, and works even in dim light. Because rods are intensity-specific cells, they respond remarkably quickly, releasing neurotransmitter in proportion to how much light is stimulating the rod at that precise moment. Microscopic examination of the retina shows a much greater concentration of rods along its outer edges. One simple experiment taught to students studying the eye is to go out at night and look at the stars (preferably the Orion constellation) out of your peripheral vision. Pick out a faint star from your periphery and then look at it directly. The star should disappear, and when you again look at it from the periphery, it will pop back into view.
Cones are the second specialized retinal cell type, and these are much more complex. Cones on our retinas are the reason the RGB model used by computer monitors and graphics works at all. There are three types of cones, each containing a pigment most sensitive to a different band of the wavelength spectrum, roughly corresponding to red, green, and blue. Depending on the intensity of light at each of those wavelengths, each cone releases varying levels of neurotransmitter on through the optic nerve, and for some colors, none at all. Because color vision divides the work across three cone types rather than one, a cone's response is considerably more complex than a rod's.
Our optic nerves are the visual information highway by which the retina's specialized cells transmit visual data on to the brain's visual cortex for interpretation. It all begins when rhodopsin in the retina reacts to light, a photochemical event that takes on the order of a picosecond, and triggers a nerve impulse in the optic nerve. A picosecond is one trillionth of a second, so theoretically we could work from our eyes' "response time" toward a theoretical frames per second (but I won't even go there now). Keep reading.
The optic nerve is only a few centimeters long, so it's a short trip to the visual cortex. Like data on the internet, the data traveling in our optic nerves eventually reaches its destination, in this case the visual cortex: the processor and interpreter.
Unfortunately, neuroscience only goes so far in understanding exactly how our visual cortex, in such a small space, can produce images unlike anything a computer can currently create. We only know so much, but scientists have theorized that the visual cortex acts as a sort of filter and blender, streaming the information into our consciousness. We're bound to learn, many years from now, just how much we've underestimated our own abilities as humans once again. Ontogeny recapitulates phylogeny (history repeats itself).
There are many examples that differentiate how the human visual system operates from, say, an eagle's. One classic example involves a snowflake, but let me create a new one.
You're in an airplane, looking down at all the tiny cars and buildings. You are in a fast-moving object, but distance and speed place you above the objects below. Now let's pretend a plane going 100 times as fast flies quickly below you. It was a blur, wasn't it?
Regardless of its speed, an object always occupies a definite position in space and time. If the plane that just flew by had been going only slightly faster than you, you probably would have been able to see it clearly. Since your incredible auto-focusing eye had been concentrating on the ground before the plane flew below, your visual cortex decided the plane was there, but moving really fast, and not as important. A camera with a really fast shutter would have been able to capture the plane in full detail. This doesn't limit our eyes' ability: we did see the plane, we just didn't isolate the frame. We streamed it relative to the last object we were looking at, the ground, moving slowly below.
Our eyes, technically, are the most advanced auto-focus system around; they make cameras look weak. Using the same scenario with an eagle in the passenger seat: the eagle, with eyes built quite differently from ours, wouldn't have seen as much blur in the plane. However, from what we understand of the visual cortex and of rods and cones, even eagles see dizzying, blurry objects at times.
What is often called motion blur is really how our unique vision handles motion: in a stream, not frame by frame. If our eyes only saw frames (i.e., 30 images a second), like a single-lens reflex camera, we'd see images pop in and out of existence, which would be really annoying, and far less advantageous to us in our three-dimensional space.
So how can you test how many Frames Per Second we as Humans can see?
My favorite test to mention to people is simply to look around their environment, then back at their TV or monitor. How much more detail do you see in the room than on the screen? You see depth, shading, a wider array of colors, and it's all streamed to you. Sure, we're smart enough to piece a 24-frame movie together into motion, and sure we can make sense of video footage filmed in NTSC or PAL, but can you imagine the devices of the future?
You can also do the more technical and less imaginative tests above, including the star gazing, and this TV/monitor test: a TV camera capturing 30 FPS with a computer monitor in the background will show the monitor's 60 Hz refresh sweeping across its screen, because the two devices refresh at different rates. Lower refresh rates are also what cause eyestrain with computer monitors; higher rates do not.
Don't underestimate your own eyes, Buddy...
We humans have a very advanced visual system. Please understand that a computer, with all its processor strength, still doesn't match our own brain, or the complexity of a single Deoxyribonucleic Acid strand. While some animals have sharper vision than we do, something is usually given up for it: for eagles it is color; for owls, the ability to move the eye in its socket. With our outstanding visual system, we can see billions of colors (and testing suggests women may distinguish as much as 30% more colors than men). Our eyes can perceive well over 200 frames per second from a simple little display device; the number runs so low mainly because of current hardware, not our own limits. Our eyes are also highly movable, able to focus as close as a few inches or as far as infinity, and can change focus faster than the most complex and expensive high-speed auto-focus cameras. Our visual system receives data constantly and decodes it nearly instantaneously. With a field of view of roughly 170 degrees and fine focus covering nearly 30 degrees, our eyes remain more advanced than even the most advanced visual technology in existence today.
So what is the answer? How many frames per second should we be looking for? If current science is a clue, it's somewhere in sync with full saturation of our visual cortex, just like in real life. That number, my friend, is way up there with what we know about our eyes and brains.
It used to be that anything over 30 FPS was considered too much. (Is that why you're here, by chance?) :) Then, for a while, anything over 60 was sufficient. After even more new video cards, it became 72 FPS. Now, new monitors and new display types like organic LEDs (OLEDs) and FPDs offer to raise the bar even higher, and LCD response times keep falling, which equates to even more FPS.
If this old United States Air Force study is any clue, we've only scratched the surface, not only in knowing our FPS limits but in coming up with hardware that can match, or even approach, them.
The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In the experiment, a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage, and to identify the aircraft. This simple, specific result demonstrates not only the ability to perceive one image within 1/220th of a second, but the ability to interpret far higher FPS.
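If you want to play with the idea yourself, here is a hedged sketch of a similar flash test using the pygame library. Note the caveat in the comments: an ordinary 60 Hz monitor can't show anything shorter than about 16.7 ms, so approaching the study's 1/220th of a second (about 4.5 ms) requires a much faster display. Treat this as a demonstration of the concept, not a replication of the study:

```python
import pygame

# Flash a simple "aircraft" silhouette briefly on a dark screen.
# One refresh at 60 Hz lasts ~16.7 ms; a true 1/220 s (~4.5 ms) flash
# needs a display refreshing at 220 Hz or better.

pygame.init()
screen = pygame.display.set_mode((640, 480))

screen.fill((0, 0, 0))                        # dark room
pygame.display.flip()
pygame.time.wait(2000)                        # let the eye adapt

pygame.draw.polygon(screen, (255, 255, 255),  # crude aircraft shape
                    [(320, 200), (280, 280), (360, 280)])
pygame.display.flip()
pygame.time.wait(5)                           # hold for ~one fast refresh

screen.fill((0, 0, 0))                        # blank again
pygame.display.flip()
pygame.time.wait(2000)
pygame.quit()
```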
_______________________________________
So, just how many frames per second can our human eye see? Past 100?
In my previous article (Human Eye Frames Per Second), I mentioned I'd write another to settle once and for all just how many frames per second our human eye is capable of seeing, so here we are.
If you haven't read my first article, do so now. It's quite lengthy, but worth your time, and it's enough foundation to start here.
Motion Blur is so important in movies and TV programming
In my first article, I mentioned how important motion blur is to frames per second. On computers, it is essentially nonexistent. Movies, which run at 24 frames per second, are designed for the big-screen projector, which blasts each frame to the screen in its entirety, one frame at a time, in the widescreen format. Because of how each frame is filmed, it contains motion blur; the frames are not perfectly sharp.
The blur in today's movies will eventually give way to completely digital movies (on very expensive screens; I should know, I worked with the technology at age 16), and with the advent of computer animation in movies, replacing the blur on film is becoming more and more inevitable.
Computers don't work this way (with blur, that is), and essentially neither does anything digital. A digital frame is exact: either perfectly sharp, or containing exactly the blur that was recorded, as in movies. In the transition from film to TV or DVD, an extra 6 frames are added each second by a method called 3:2 pulldown (telecine), to match the device the film is being displayed on, your TV. NTSC (American) and PAL (European) are different TV formats with different refresh rates and resolutions: roughly 640x480 visible for NTSC and 720x576 for PAL. With HDTV everything is digital, essentially 60 frames now, but most of these broadcasts still use pulled-down film material, and until 2006 you won't need to trash your regular TV, though it may be a good idea now.
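The pulldown cadence is simple enough to show in a few lines of Python. In 3:2 pulldown, film frames alternately contribute two and three interlaced fields, so every 4 film frames become 10 fields, which is 5 video frames; over a second, 24 film frames become 30 video frames:

```python
def pulldown_32(film_frames):
    """Expand film frames into NTSC fields using the 2:3 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3   # alternate 2 fields, 3 fields
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]              # 4 film frames...
fields = pulldown_32(film)               # ...become 10 fields
print(fields)        # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields) // 2, "video frames")  # 5 interlaced video frames
```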
As many of you know, pause a DVD of a filmed movie during movement (or pause live TV with your VCR) and you'll see the blur, unless the image was static to begin with. Pause an animated DVD or a cartoon on TV and you won't see blur. Why? Filmed movies and filmed TV shows work by blurring their subjects: actors, actresses, whatever. Film does not take a PERFECT snapshot of the subject; each image blurs into the next, giving the impression that everything moves seamlessly (if nothing in the scene is moving, you see a static image). In an animation or a cartoon, each of the 24/30 frames per second is perfect; there is no blur in the image, EVER.
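In digital terms, film blur is roughly an average of the scene over the exposure window. Here is a toy Python illustration, a bright dot moving across a one-row "image": averaging the exact frames smears the dot, exactly the kind of smear a film exposure records:

```python
def blur_frames(frames):
    """Average several exact frames into one motion-blurred frame."""
    width = len(frames[0])
    return [sum(f[x] for f in frames) / len(frames) for x in range(width)]

# Three exact frames of a dot moving right across a 5-pixel row:
exact = [
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0, 0.0],
]
print(blur_frames(exact))   # [0.33, 0.33, 0.33, 0.0, 0.0] - a smear
```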
I touched very briefly on auto-focus cameras, and on how even the best, most expensive cameras don't come close to matching our human eye's focusing ability. The professional cameras you see reporters with can capture EXTREMELY fast-moving objects in perfectly sharp stillness with exposures of 1/4000th of a second and shorter. What does a camera freezing a moment in 1/4000th of a second prove?
Our infinitely seamless world.
Professional cameras can take perfectly still pictures without any blur, and, as with video cameras, pictures with blur. So where is the limit? How quick can we take a picture, and how slow? Slow, time-exposed pictures have been taken; you've probably seen the night shots where all the car tail lights are streaks. You've probably also seen "photo finish" cameras capture the telltale winning moment of a close horse race. What all of this really means is that unless we slow time down or speed it up, there isn't any blur in our world. Unless, of course, you're drunk, the room is spinning, or you're on some LSD trip. OK, besides that.
Images in our world are continuously streamed to us, as I've said before. Living in three dimensions as we do, our eyes enable us to see depth and periphery, to focus in very close and as far as infinity. So is there really a limit to how many frames per second we can see with our eyes?
Our limit, is there one?
Until someone proves me, all the scientists, optometrists, and the like wrong, there is no proven limit to how many frames per second our human eye can see. A theoretical limit, yes; a proven limit, NO.
Think for just a second how pointless it would be to push the limits of video displays if our eyes couldn't tell the difference between an HDTV and a plain old TV, or between a computer monitor and a plasma display. OK, in that second, how many times do you think your eye "framed" this screen? The number of times the screen refreshed? Nope. The number of times your eye streamed this page to you, a number that is potentially infinite, or at least unknown until we understand the complexity of our own mind. Just know that this number is much, much higher than what your monitor is currently capable of displaying to you.
Our brain is smart enough to turn 24 exact frames into motion; isn't it ignorant to say we can't distinguish 400, or even 4000, in motion? Heh, the sky's the limit, oh wait, then space... oh wait. Give us more; we notice the difference from 30 to 60, and from 60 to 120. It is possible that the closer we get to our limit, if there is one, the harder it becomes to get there; there is a classic idea about this (Zeno's paradox). Someone is across the room. Take one full step toward them. Now take a half step, then half of a half step, and so on, halving every movement you make. Will you ever get there? That, my friend, is open to debate, but in the meantime, will you take one step toward me?
The human eye perceiving 220 frames per second has been demonstrated, and game developers, video card manufacturers, and monitor manufacturers all admit they've only scratched the surface of frames per second. With a high-quality non-interlaced display (like a plasma or a large LCD FPD) and a video card capable of HDTV resolution, you can today see well above 120 FPS with a matching refresh rate. With refresh rates as high as 400 Hz on some non-interlaced displays, such a display alone is capable of 400 FPS. Without the refresh rate in the way, and with hardware capable of fast enough rendering (frame buffer), displays could in principle keep pace with cameras that record 44,000 frames per second. Imagine for a moment a display device strictly governed by the input it receives; computer video cards and displays already work this way to a degree, with adjustable resolutions, color depths, and refresh rates.
Test your limit, you tell me...
Look at your TV, or ANY imaging device, then look at the device itself rather than the image it is displaying: the TV cabinet, the monitor bezel. Tell me the image on the screen is clearer and more precise than your view of the TV or monitor itself. You can't. That's why the more frames per second, the better, and the closer to reality it appears to us. With 3D holograms right around the corner, the FPS subject (maybe 3DFPS) will become even more important.
The real limit is in the viewing device, not our eyes.
The real limits here are evidenced by the viewing device, not our eyes; we can consistently pick up the flicker, which proves the point. In movies, the screen is larger than life and each frame is drawn instantaneously by the projector, but that doesn't mean you can't see the dust or scratches on each frame. With NTSC and PAL/SECAM TVs, each frame is drawn line by line (odd lines, then even) at the refresh rate, so the frame rate is exactly the hertz divided by two: one pass of odd lines, then one pass of even lines, per frame. Do a search for high-speed video cameras and you'll find some capable of 44,000+ frames per second; that should give you a clue.
CRTs, be they PC monitors or TVs, refresh at a given rate, the hertz. Eye fatigue can happen because of the line-scanning effect after prolonged viewing; yes, your eye sees this. Switch to your peripheral vision, as in the example from my first article, and you can see the refresh. 50 Hz and 60 Hz also happen to be the mains power frequencies of the countries that use those TV refresh rates. Because the technology draws each line individually, your frame rate is tied to the refresh rate. If something is running at 60 FPS but your display is 60 Hz interlaced, which TVs are locked to, you're seeing 30 frames per second. However, if you have a nice computer monitor (NON-INTERLACED) set to 120 Hz (72 Hz and up is considered "flicker free"), and your video is running at 120 frames per second, you're seeing exactly 120 frames per second. You may have heard that LCDs, or Liquid Crystal Displays, are "flicker free". LCD displays can show their full refresh rate in frames, much as non-interlaced monitors do; for example, 75 Hz is capable of 75 frames per second. And because each LCD pixel is switched directly rather than scanned by an electron gun across phosphors as in a CRT, the technology virtually eliminates flicker.
Technically speaking: NTSC has 525 scan lines repeated 29.97 times per second = 33.37 ms/frame, or roughly 30 frames per second at 60 Hz, BECAUSE it's INTERLACED.
Technically speaking: PAL has 625 scan lines repeated 25 times per second = 40 ms/frame, or exactly 25 frames per second at 50 Hz, BECAUSE it's INTERLACED.
So how does 60 Hz relate to HDTVs? With progressive scanning (the Xbox supports this with its NVIDIA GPU), each frame is drawn in full on each pass, meaning 60 Hz supports 60 frames per second. But as you've learned, although hertz and FPS are related, the hertz of the display does not necessarily equal the frames per second; frames per second are determined by how the display device draws each frame. Normal TVs don't support progressive scan, and thus redraw half the screen on each pass, first the odd lines (interlaced), then the even: 30 frames per second maximum.
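All of these relationships fit in one small function. This is just the arithmetic of the last few paragraphs restated in Python: an interlaced display needs two passes per frame, a progressive one needs a single pass, and no display shows more frames than it refreshes:

```python
def displayed_fps(refresh_hz, interlaced, source_fps):
    """Frames per second the viewer actually sees."""
    display_limit = refresh_hz / 2 if interlaced else refresh_hz
    return min(source_fps, display_limit)

print(displayed_fps(60, True, 60))      # interlaced NTSC TV: 30.0
print(displayed_fps(50, True, 25))      # interlaced PAL TV: 25.0
print(displayed_fps(60, False, 60))     # progressive-scan HDTV: 60.0
print(displayed_fps(100, False, 200))   # 200 FPS card, 100 Hz monitor: 100.0
```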
As you've seen, it's not our human eyes, it's the display. Consider again the difference between interlaced and non-interlaced monitors. All computer CRT monitors are now made non-interlaced (and have been for quite some time), meaning the entire frame is scanned on every refresh, so the frames per second can equal the refresh rate but can never exceed it; the display simply can't show more. A video card may be pushing 200 frames per second, but if your display is at 100 Hz, it's only refreshing 100 times per second.
Thus, the big misconception that our eyes can only see 30 or 60 frames per second comes purely from the fact that mainstream displays can only show this much, not from our eyes being unable to see more. For the time being, the frames per second of any display device aren't even close to the phrase "more than meets the eye".
Definitions of relevance:
CRT - Cathode Ray Tube: the tube (or flat tube) making up a TV, which uses an electron gun to excite phosphors at the front of the tube to produce varying colors.
NTSC - the analog TV standard originally developed in the United States by the National Television System Committee (525 lines).
PAL - Phase Alternating Line, the European analog TV standard (625 lines).
FPS - Frames Per Second: a frame is an image completely drawn to a viewing device, for example a monitor.
http://amo.net/nt/05-24-01FPS.html