The point where you stop seeing a series of images and start seeing motion is fairly low, around 15-20 fps. Movies are shown at 24 fps, although they may be shot at higher frame rates so that the post-processing department can control the amount of motion blur.
Yup, virtually every movie you've seen was at a lower frame rate than the 30 fps that gets so many complaints on these forums. However, in a movie you know the hero won't die just because the frame rate is low. Another key difference is that a movie runs at a fixed 24 fps, while the fps in a game typically depends on how much processing has to be done, which can vary a lot. If the game stalls for half a second and then runs at 60 fps for the rest of that second, you're still averaging 30 fps, but the experience will not feel satisfactory.
So how can people tell 100 fps from 150 fps? Well it's not bullshit. Consider a single-pixel white dot moving from left to right across a dark 1920x1080 screen over one second. At 60 fps, the dot jumps 32 pixels each frame. If your eyes are fixed on the background, only 1 in every 32 pixels along the dot's path ever lights up. If you employ motion blur, you fix that problem, but now the dot appears as a smear at least 32 pixels wide. Go up to 120 fps and the error shrinks to 16 pixels, then 8 pixels at 240 fps, 4 at 480 fps, 2 at 960 fps, and it finally looks correct at 1920 fps. Yes, we need 1920 fps to display that dot properly, and the same argument applies to highlights on hi-res textures.
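If you want to play with the numbers yourself, here's a quick back-of-the-envelope sketch of that argument (plain Python, nothing game-specific assumed; the only inputs are the 1920-pixel width and the one-second crossing time from above):

```python
# How far the dot jumps between frames when it crosses a
# 1920-pixel-wide screen in exactly one second.
SCREEN_WIDTH_PX = 1920
CROSSING_TIME_S = 1.0

for fps in (60, 120, 240, 480, 960, 1920):
    frames_shown = fps * CROSSING_TIME_S
    step_px = SCREEN_WIDTH_PX / frames_shown  # pixels skipped per frame
    print(f"{fps:>5.0f} fps -> dot jumps {step_px:4.0f} px per frame")
```

The step only reaches 1 pixel per frame at 1920 fps, which is the point where the motion finally looks continuous instead of strobed or smeared.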
This is why pro gamers tend to notice: they need to turn and shoot quickly and accurately, which means making sense of their environment as it spins around them, and it still smears or strobes a little even at 300 fps. It's also why low-resolution games can look smooth: Doom was capped at 35 fps, yet it still felt very smooth, because each frame-to-frame jump covered far fewer pixels.
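To put a rough number on the fast-turn case: here's a sketch with made-up but plausible figures (a 720 degrees-per-second flick, a 90-degree horizontal FOV spread linearly across 1920 pixels; real projections aren't linear, so treat it as a ballpark only):

```python
# Rough estimate of how far the whole scene shifts between frames
# during a fast flick, at various frame rates.
TURN_SPEED_DEG_PER_S = 720.0   # assumed: 180-degree flick in 0.25 s
FOV_DEG = 90.0                 # assumed horizontal field of view
SCREEN_WIDTH_PX = 1920
PX_PER_DEG = SCREEN_WIDTH_PX / FOV_DEG  # ~21 px per degree (linear approx.)

for fps in (60, 144, 300, 1000):
    deg_per_frame = TURN_SPEED_DEG_PER_S / fps
    px_per_frame = deg_per_frame * PX_PER_DEG
    print(f"{fps:>4} fps -> scene shifts ~{px_per_frame:.0f} px between frames")
```

Even at 300 fps the scene shifts by roughly 50 pixels per frame during a flick like that, which is why fast turns still look noticeably steppy or blurry to people who stare at them for a living.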