Veylon said:
Video games are probably the primary driver of computer technology. Outside of them, what possible use is there for 4GB of memory, a DSL line, and a quad-core to be owned by an individual?
I am sorry, but this is atrociously ignorant. It may hit me harder than most because my job involves high-performance computing, but...
- Any kind of engineering modeling (the vast majority of which is civilian) is heavily dependent on computational power. Huge clusters (thousands of CPUs) go into most nuclear reactor simulations, for instance.
- Data management, and networking in general, were developed partly by DARPA (for military applications), partly by CERN (for data transfer and processing in their experiments), and then by everyone and their mother, really. Businesses were interested in replacing secretaries (who were becoming increasingly expensive, due to the rising cost of benefits) with networked computers.
- The vast majority of people do NOT own computers for the purpose of gaming; gamers are a pretty small minority. Consoles are a different story, obviously. But most personal computers out there (whether personally owned or company-owned) are now used primarily for networking and document editing (i.e., as typewriters).
Now, what definitely WOULD be different is video cards and sound cards. Those are the two components driven primarily by entertainment (gaming). But CPUs, RAM, chipsets, hard drive speed, networking - none of these were driven by video games. Computational modeling, data management, document editing and typesetting, computer graphics - these are the things that drove the industry forward. Defense funding certainly played a role as well (initially with pretty much all of computing, since the original purpose was engineering modeling, and later primarily with networking), although not as large a role as is sometimes claimed.