It'd certainly be different, but if you warped into another dimension where videogames simply hadn't taken off, you might not immediately notice anything obvious. Computer graphics and sound in everyday-use machines reached a fair plateau a HELL of a long time ago. I'm currently typing this on an SXGA, 24-bit colour monitor, with a machine that offers 16-bit stereo (or, if I want it, 4/6-channel) sound at 48kHz or more, a CPU with enough power to produce multivoice MIDI/soundtracker music at a quality that long since made wavetables and hardware synths obsolete, and enough memory/hard drive space that even those are a bit pointless, as you just load your soundtrack up with MP3s. Ten years ago I had all that capability, just not the space for a suitable monitor to go beyond XGA. Twelve, I could have done most of it with a few upgrades, if I hadn't been a broke-ass schoolkid. Fifteen... it would have been a struggle, but enough investment would still have seen me right. You'd have to go back twenty for the relevant tech not to exist yet, and even then the highest-end Amigas would probably have made you look round with a subtle but insistent clearing of the throat and pointed out that, but for some insignificant tweaks of the figures, they were practically there...
A bigger question may be whether computers would have gained quite the early home foothold that they did. Game consoles and games on general-purpose computers weren't the be-all and end-all of people having home computers, and in fact came AFTER the first ones (the Altair etc.). But they were definitely a MAJOR driving force behind their popularity. The industry would have grown more slowly as people realised the productivity benefits, and older machines would have gone out of date far more slowly... which, given that I knew someone still using a Sinclair Spectrum in 1990, that I last "properly" used my Atari ST sometime around '94/'95, and that there's still a dwindling population of BBC Micros and IBM 5150s finding useful employment, makes me wonder if we'd have hit that plateau yet.
However, serious productivity software actually demands at least as much of the machine as games do (desktop publishing, for example, will punish a late-80s 16-bit computer far harder than a round of a typical game of the time in every area except the soundcard and fancy programmer tricks with the video shifter, and will show up its limitations far sooner), so the development of the cutting-edge hardware may not have been so affected. There'd just have been less money available to push forwards so quickly, and there'd have remained a bigger focus on mainframe servers with client terminals in businesses... something we're now slowly coming back round to with virtualisation.
I get far more use out of my computer for "serious" stuff than gaming, and when I do game, I emulate a lot anyway. But I wouldn't much want to go back to 8-bit colour SVGA, which was the monitor setting of choice on our bog-standard 1994 PC (it'd do high colour, but only with flicker and choppiness; XGA, but blurry; and true colour, but only at VGA)... it's dreadfully cramped, and the level of detail you have for editing documents is pretty naff. I've struggled on with a VGA laptop making documents when that was all my emergency fund could stretch to - the XGA one that followed it was far easier.
CPUs would have sped up anyway, as their development is well divorced from gaming - consoles rarely use cutting-edge processors, etc. And everyone always wants a faster chip, to keep up with the more advanced things the bleeding-edge enthusiasts find to do with them. Large spreadsheets or databases require a fair old bit of bit-mangling, as do word-processed documents with vector fonts and the like. Then you get on to the whole business of the CD-ROM encyclopaedia, which brings primitive video... everyone sees how primitive it is and demands better... which requires a better chip. (The first time I saw Encarta videos was on a 386-16... it was intriguing, but awful... our school library got better computers SOON after.)
Or at least an Amiga-style coprocessor could have handled it. But you rarely do anything else at the same time anyway, and the extra general-purpose CPU power is just handy across the board.
Computer-aided design may have remained the preserve of mainframes and minicomputers, but not so computer-aided art. That would have been another driver of faster chips, more memory, better drives and better displays. The revolutionary nature of bringing digital technology to bear on the traditional arts cannot be overstated. But you can't do much in that regard with a C64. You need a miggy at the very least.
And along with that would come music... games would actually have been a big music driver, so anything more than beeps might have been slower to arrive, but once enough people were using MIDI and thinking "what if we brought that sampler onboard?", or trying to make videos and other animations, or just play their music CDs, some kind of decent audio hardware would have come along. Probably not my beloved AY-3-8912 or the Paula, but once you've got even a 4-bit mono output at 8kHz, you can start to mess with the bitstream...
(Remember that the helper chips in the miggy were general-purpose, not game-focussed, and it was initially sold as a media productivity machine because there weren't many games about that properly used the hardware.)
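To make that "4-bit mono at 8kHz" point a bit more concrete, here's a minimal sketch - my own illustrative Python, nothing to do with any real period hardware, and the source rate, test tone and function names are all assumptions for the example - of crunching a 16-bit sample down to something a DAC like that could actually play:

```python
# Minimal sketch: squash a 16-bit mono sample down to 4-bit unsigned at 8kHz.
# The source rate, test tone and function names are illustrative assumptions.
import math

SRC_RATE = 44100   # assumed source sample rate
DST_RATE = 8000    # the hypothetical 4-bit DAC's rate

def make_tone(freq=440.0, seconds=0.5):
    """Generate a 16-bit signed mono test tone as a plain list of ints."""
    n = int(SRC_RATE * seconds)
    return [int(32767 * math.sin(2 * math.pi * freq * t / SRC_RATE))
            for t in range(n)]

def to_4bit_8khz(samples):
    """Nearest-sample resample to 8kHz, then requantise to 4-bit (0..15)."""
    step = SRC_RATE / DST_RATE
    out = []
    pos = 0.0
    while int(pos) < len(samples):
        s = samples[int(pos)]          # crude nearest-sample "resampling"
        out.append((s + 32768) >> 12)  # 16-bit signed -> 4-bit unsigned
        pos += step
    return out

if __name__ == "__main__":
    crunched = to_4bit_8khz(make_tone())
    print(len(crunched), "samples, range", min(crunched), "to", max(crunched))
```

It sounds rough, obviously, but that's the point: even that little resolution is enough raw material to start mixing voices and playing samples in software.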
The one thing we'd likely be missing is serious 3D acceleration. The 3D graphics required in most non-game situations can easily be covered by CPU calculation. Hyper-real rendering as seen in current top-end titles would be the preserve of Silicon Graphics-style workstations and cabinet-sized supercomputers, or of people with enough of an interest and sufficient patience to indulge in raytracing.
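For the raytracing crowd, the underlying point is that it's all just plain arithmetic a CPU can grind through unaided - slowly. A toy example in Python (the scene, the constants and the ASCII output are all my own assumptions, purely to show the shape of the calculation):

```python
# Toy CPU raytracer: one hard-coded sphere, one light, ASCII shading.
# Everything here (scene, names, numbers) is illustrative, not any real renderer.
import math

WIDTH, HEIGHT = 60, 30
SPHERE_CENTRE = (0.0, 0.0, 3.0)
SPHERE_RADIUS = 1.0
LIGHT_DIR = (-0.577, 0.577, -0.577)   # roughly normalised, pointing towards the light
SHADES = " .:-=+*#%@"

def hit_sphere(dx, dy, dz):
    """Distance along a ray from the origin to the sphere, or None if it misses."""
    cx, cy, cz = SPHERE_CENTRE
    # Ray origin is (0,0,0), so the origin-to-centre vector is just -centre.
    b = -2 * (dx * cx + dy * cy + dz * cz)
    c = cx * cx + cy * cy + cz * cz - SPHERE_RADIUS ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

for row in range(HEIGHT):
    line = []
    for col in range(WIDTH):
        # Fire a ray through this "pixel" on a simple image plane at z = 1.
        dx = (col - WIDTH / 2) / WIDTH
        dy = -(row - HEIGHT / 2) / HEIGHT
        dz = 1.0
        norm = math.sqrt(dx * dx + dy * dy + dz * dz)
        dx, dy, dz = dx / norm, dy / norm, dz / norm
        t = hit_sphere(dx, dy, dz)
        if t is None:
            line.append(" ")
            continue
        # Lambertian shading: brightness from the surface normal vs the light direction.
        px, py, pz = dx * t, dy * t, dz * t
        nx = (px - SPHERE_CENTRE[0]) / SPHERE_RADIUS
        ny = (py - SPHERE_CENTRE[1]) / SPHERE_RADIUS
        nz = (pz - SPHERE_CENTRE[2]) / SPHERE_RADIUS
        lum = max(0.0, nx * LIGHT_DIR[0] + ny * LIGHT_DIR[1] + nz * LIGHT_DIR[2])
        line.append(SHADES[int(lum * (len(SHADES) - 1))])
    print("".join(line))
```

Scale that up to millions of pixels, multiple bounces and proper geometry and you can see why it was an overnight-render hobby rather than a real-time one - but none of it needs special silicon.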
The mistake, I think, as above, is assuming that videogames have been at the hardware cutting edge much at all other than in the last fifteen years or so. Certainly PC and home computer games didn't often push the limits after the 8-bit days, except for the most expansive or graphically/musically rich ones, and generally the solid-3D ones. And remember that most 8-bits other than the Apple II weren't actually expandable in any meaningful way. The game designers wrote within the hardware boundaries, rather than forcing them outwards gradually as has been the case recently.
And consoles? Heh. Really. Apart from maybe the NES, SNES, Jaguar and PSX AT THEIR RESPECTIVE LAUNCHES, console hardware has always been a fair bit behind the times. It's made down to a cost that can be justified for a "toy". Other than as a source of SOME titles for porting, the loss of consoles in general probably wouldn't even be felt, beyond butterfly-effect ripples in the pond. That, and the non-existence of Atari and Nintendo (and later Sega), which would have removed a couple of major competitors and motivators from various other companies such as Commodore and IBM.
So yeah... we wouldn't be devoid of computers, the internet, or all the facilities we associate with them. And they wouldn't be stone-age either, unless we somehow ended up in a weird alternate history where we're all using the super-cheap descendants of the Sinclair QL (their image as a "serious" electronics manufacturer having been untarnished by Jet Set Willy, and Clive having died before he could get the C5 past prototype, so they could concentrate better on improving quality and properly securing the telecoms contracts) to dial in over modem lines to a huge central server where all the proper "hard work" is done. The forces that have driven the development of the amazingly powerful audiovisual multiprocessor supertransputer workstations we take for granted would still be in play, if diluted. A person from the late 90s in our timeline would probably feel right at home in the 2011 of that universe.
Who knows - without the distraction, we may have advanced to far greater things.
If you go there and find out, bring me back a robot.