Oh come on, you know how this thread is going to end.
Fine, I'll help start it off; you've already got "On Fire" anyway.
The human eye doesn't see in FPS, so it's impossible to calculate "how much" it can see, and it varies from person to person. Personally, I can easily spot the difference between 30 FPS and 60 FPS, and I can somewhat spot the difference between 60 FPS and 120 FPS. Anything from 15-25 FPS mostly looks the same to me, and with proper motion blur a steady 25 FPS looks much the same as 30 FPS. I can't really notice the difference between 45 FPS and 60 FPS, though, unless the game's responsiveness is impacted by the lower framerate; a frame at 45 FPS takes about 22 ms versus 17 ms at 60 FPS, so what gives it away is input lag rather than anything I actually see.
What is "ideal" is also a personal basis. I can live with 25 FPS, but my "ideal" would be 45+ as long as it's steady and not jumping up and down constantly. I believe that anything more than 60 FPS is a bit excessive, though, because there is a point where the smoothness of motion passes a point of 'realistic' expectations and appears too smooth; Partially because game animations are still in the uncanny valley.
If you couldn't tell from the above, I don't particularly care what the framerate of a game is. If it's constantly dropping, like in Assassin's Creed III, I'm going to get angry. But while it might annoy me slightly, I can play something just fine at 20-30 FPS so long as it's steady. I played World of Warcraft for three years getting only 3 FPS in most dungeons and raids.