Because it's always a trade-off.
I'm one of those people who does both, but because PCs are so flexible, I often experiment with settings.
What framerate I will put up with really depends on the game.
Some games are really, really bad at low framerates.
Even console games. (One that comes to mind is Bit.Trip Beat. Even on the Wii, where the only 'framerate' choice is PAL 50 Hz mode versus 60 Hz mode, 60 Hz is better for gameplay.)
Thing is, in a lot of games 60 fps just doesn't get you anything.
And the price you pay is reduced graphical quality.
Less detailed meshes, lower-resolution textures (and lower rendering resolution in general), lower-quality effects, and so on.
As an example: on a mid-range PC playing, say, the original Crysis, going from 30 fps to 60 is the difference between playing on one of the highest graphical presets and one of the lowest.
The improvement in framerate comes at such a huge cost to visual quality that it's only something you'd do if you were absolutely desperate, or if the low framerate was having a very large and very obvious negative effect on your ability to actually play the game.
I, for one, don't like making that trade-off, so I often opt for a lower framerate and higher visual quality.
(Except for AA. AA can suck it, mostly. Yes, it usually makes things look better, but in my experience it costs at least a 15 fps drop, and often a lot more. Besides, if you have the choice between AA and simply running the game at a higher resolution, the higher resolution wins every time. Why play a game at 1280x720 with 4x AA when you can play it at 2560x1600 with AA turned off?)
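Just to put rough numbers on that last comparison, here's a quick back-of-envelope sketch in Python (purely illustrative; the `samples` helper is my own, and MSAA coverage samples aren't fully shaded pixels, so treat this as a crude proxy for rendering work, not an exact quality comparison):

```python
# Rough back-of-envelope: compare the number of samples the GPU has to
# resolve at 1280x720 with 4x MSAA versus native 2560x1600 with no AA.
# MSAA samples are mostly extra coverage/depth samples, not fully shaded
# pixels, so this is only a crude illustration of scale.

def samples(width, height, msaa=1):
    """Total coverage samples for a given resolution and MSAA factor."""
    return width * height * msaa

low_res_aa = samples(1280, 720, msaa=4)   # 1280x720 with 4x AA
high_res = samples(2560, 1600)            # 2560x1600, AA off

print(f"1280x720 @ 4x AA : {low_res_aa:,} samples")  # 3,686,400
print(f"2560x1600, no AA : {high_res:,} pixels")     # 4,096,000
```

By that crude measure, native 2560x1600 pushes slightly more samples than 720p with 4x AA, and every one of them carries full texture and shading detail rather than just smoother edges.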