Quote: "To have a *perfect* illusion of everything that can flash, blink and move you shouldn't go below 500 fps....maybe you need as much as 4000fps..."(from link below).
The idea that you can't see more than xx fps, so there's no point in getting more, is an old myth, and a very confused one. The difference really shows up when you need very quick and accurate visual responses - say, competitive online FPS play, or flight simulators. I know that when I was on a national team for a couple of shooters, the difference between 85 fps and 120 fps could be the difference between defeat and victory. The improvement in accuracy was huge! So we'd play at 800x600 with all settings on Lowest, no matter how good our cards were. Photographers can tell you that they can see strobe flashes of 1/1000th of a second.
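To put rough numbers on that difference, here's a quick back-of-envelope sketch of my own (nothing more than frame-time arithmetic):

```python
# Back-of-envelope frame-time arithmetic (my own illustration, not from
# the linked article): how long you wait for each new frame.
for fps in (60, 85, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> a new frame every {frame_time_ms:.1f} ms")

# 85 fps ~ 11.8 ms per frame, 120 fps ~ 8.3 ms per frame: at the higher
# rate the latest game state reaches your eyes roughly 3.4 ms sooner,
# every single frame.
```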
The full answer would be a very long technical dissertation involving many aspects of video and biology.
Short answer: Cinema and computer screens are not the same thing. Even in cinema, there are theoretical gains in viewability up to 48 FPS and maybe beyond, but it's true that most movies look totally fine at 24 fps or even lower. But a modern film projector actually flashes each frame three times, with interleaved blackness to hide the film advance, so the flicker rate is closer to 72 Hz anyway.
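If you want the arithmetic behind that 72 figure, here's my own quick sketch, assuming the common triple-bladed shutter:

```python
# Rough projector flicker arithmetic (just an illustration):
film_frame_rate = 24    # distinct images per second on the film
shutter_flashes = 3     # a triple-bladed shutter flashes each image 3 times
flicker_rate_hz = film_frame_rate * shutter_flashes
print(f"{film_frame_rate} fps x {shutter_flashes} flashes = {flicker_rate_hz} Hz")
# -> 72 Hz of light pulses, even though only 24 unique frames are shown.
```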
But for computers, well, if you ever used an old CRT, you know the difference between a 60 Hz and an 85 Hz refresh rate was astonishing, ergonomically.
From Dan Rutter's Dan's Data site, www.dansdata.com:
"If a CRT monitor screen's painted less often than about 75 times per second (refresh rate is measured in Hertz, or Hz), it'll seem to flicker, because there's not enough persistence in the phosphor to keep the screen illumination even from your point of view. You need a 75Hz or better refresh rate to eliminate the flicker.
TVs have a low frame rate - 25 frames per second for PAL, 30 frames per second for NTSC - but they get away with it because they use interlaced mode, scanning all of the odd numbered lines and then all of the even numbered ones, so a 25 frame per second refresh rate becomes a 50 "field" per second screen-painting rate.
Sure, only half of the screen's painted each time, but it's not the top half and then the bottom half - it's a Venetian-blind interleaved pattern, that means the whole thing looks pretty evenly illuminated. This, combined with the higher persistence phosphor, gives a decently flicker-free display.
And, of course, you usually watch TV from far enough away that the screen takes up less of your field of view than does your closely-viewed computer monitor. Smaller images seem to flicker less.
24 frame per second movies, on the other hand, get away with their low frame rate without looking painfully flickery because the whole frame's illuminated in one go each time. And LCD panels work the same way."
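To make the interlacing bit from that quote concrete, here's a little sketch of my own (the 8-line "frame" and the names are invented purely for illustration):

```python
# A toy sketch of interlaced scanning: one frame is split into two
# alternating fields, so 25 frames/s becomes 50 screen paintings/s.

frame = [f"line {n}" for n in range(1, 9)]   # a tiny 8-line "frame"
odd_field = frame[0::2]                      # lines 1, 3, 5, 7
even_field = frame[1::2]                     # lines 2, 4, 6, 8

print("field 1 paints:", odd_field)
print("field 2 paints:", even_field)

frames_per_second = 25                       # PAL
fields_per_second = frames_per_second * 2
print(f"{frames_per_second} frames/s -> {fields_per_second} fields/s")
```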
LCDs with video games are more complicated: because the data stream from the video card is organised into lines of pixel states rather than whole picture frames, you start having to talk about things like flicker rates and de-interlacing algorithms. But if you have ever played a game at 75 FPS and then tried it on your friend's PC at 15 FPS, well, again, you'd know the difference.
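As an example of the simplest possible de-interlacing approach, here's a toy "weave" sketch of my own (not code from any real driver or video player): two consecutive fields are re-interleaved back into one progressive frame.

```python
# A minimal "weave" de-interlace sketch (illustration only).

def weave(odd_field, even_field):
    """Interleave odd-line and even-line fields back into display order."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

odd = ["line 1", "line 3", "line 5", "line 7"]
even = ["line 2", "line 4", "line 6", "line 8"]
print(weave(odd, even))   # -> lines 1..8 back in order
```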
For a more in-depth (but still introductory) discussion of the matter, the article from which I took the opening quote is good:
http://www.100fps.com/how_many_frames_can_humans_see.htm
Basically, the more frames you can get onto the screen in a second, the better, but YMMV.