Meh. As long as the lo-fi graphics, or lack thereof (e.g. Dwarf Fortress), don't make the game unreadable, it's fine. Still, it's always amusing to see bleeding-edge enthusiasts blow thousands of bucks on overpriced hardware all so they can go back to playing Minecraft.
"Oh, but I'm different!" they'll say. "I patched my game to support high-resolution textures and I'm using a fan-made shader!"
Doesn't change the fact that it's Minecraft.
I really don't mind, personally. Some of the more striking games I've had the chance to play in the last few years had incredibly simple or primitive graphics. Stuff like Passage, Canabalt, Revenge of the Titans or The Binding of Isaac.
I think the real question should be "What drives bleeding-edge enthusiasts?" Are they really getting more out of the gameplay itself *because* they can run it at 1080p with every visual option cranked to the max and a custom .INI file?
I've had to run Skyrim on Low since the latest patch, as the ENB series of shader and color-saturation mods has become far too taxing for my system. It has also started to run a bit sluggishly at its default settings, even though the executable has supposedly been patched to address more than two gigabytes of RAM. In some ways this patch has been a severe downgrade for me: before it, I could run the game on High with little to no slowdown, making concessions only on water reflection quality.
Does this mean my enjoyment of the game has diminished? Of course not. Skyrim manages to create an immersive experience even with its visuals set to "Worse than Dog Shit", and that's largely because the level designers are smart enough to build a world that speaks through its aesthetics rather than through how much specular bump-mapping or Full-Scene Anti-Aliasing is being thrown at it.