PC Gaming: Should there be a freeze on technology and hardware development?

Agiel7

New member
Sep 5, 2008
184
0
0
How many people entering college now can claim to have been a hardcore PC gamer since the mid-'90s? The era brought us classics like "Myst," "Return to Castle Wolfenstein," and "X-Com: UFO Defense." I remember being given my first actual video game outside of educational software, "Star Wars: TIE Fighter," back in '96 (I was seven back then). Since then, I've played the first two "Rainbow Six" games, the old Jane's combat simulations like "Longbow," "Advanced Tactical Fighters," and "F-15," the original "Half-Life," "Operation Flashpoint," "Starcraft," "Diablo 2," the first two "Fallout" games, "Mechwarrior 3," and "Homeworld." In all those years, not once did I get migraines and aneurysms from CTDs and error messages, and not once did I have to upgrade my computer (the machine I used for gaming had a 600 MHz Intel processor, 128 MB of RAM, and a 4 MB video card). Hell, I was too young in those days to even know that any of that mattered.

Fast-forward to 2001 and the release of "Ghost Recon." When I bought it for my PC, my thinking was, "Hey, it's not a PS2 game, it's a computer game, it should work on my PC. Right?" Wrong, so imagine how perplexed I was when I received a "General Protection Fault" error screen after double-clicking the shortcut icon. It took an angry call to technical support to discover that my 4 MB video card wasn't up to the task, so I groveled at the feet of my parents to buy a new video card so I could get back to gaming. Back in those days, you could get a GeForce 2 for about 50 dollars on sale. Since then, I've purchased three video cards (a GeForce 4, a GeForce 5600, and a GeForce 7600GT), three CPU-and-motherboard combos (800 MHz, 1.7 GHz, and 3.2 GHz), and upgraded my RAM four times, until I finally got a gaming laptop as a high-school graduation present last summer.

Recent years have made me realize why console gaming has begun to gain favor over PC gaming. The cost of hardware is becoming nothing short of astronomical (for the cost of a top-of-the-line Falcon gaming PC, about $8000, you could get a second-hand 998cc Yamaha R1 sport bike, or send your kid to a state university for a year). In addition, with increasingly complex game engines comes an increasing number of things that can go wrong with a game, from hardware incompatibility (exacerbated by the insane variety of hardware today) to processing conflicts; as a result, most PC games are buggy messes. Let's not forget the breakneck speed at which the technology progresses; by the time I bought a GeForce 7600GT, the 9 series of Nvidia cards had already been announced. Sometimes engine technology outpaces the actual hardware; "Crysis" is a case in point. For me, that game represents everything that is wrong with PC gaming today: an overly flashy engine with no class or finesse to back it up (with Crytek being a German developer, I suppose Germany's game developers are just like their automobile companies :p), one that laughs at even $5000 gaming desktops and has more bugs than an anthill.

So my question is this: now that PC hardware has left consoles in the dust in terms of technology, should PC developers put the brakes on technology and hardware development for the sake of the areas where console platforms excel (cost and functionality)?
 

NXMT

New member
Jan 29, 2009
138
0
0
You don't have to act on impulse and purchase the newest line of cards as soon as it hits the shelves. Also, you shouldn't confuse advances in video cards with the games being developed alongside them; developers may or may not take full advantage of the new hardware. I spoke earlier, in the PC wheat thread, about the flexibility of PC gaming in general when it comes to tweaking: even if the game itself provides you with more bells and whistles, you don't have to enable them in the first place.

I have a 256MB 9300GE, which is not a gaming card, but I can run modern games like COD4 and Fallout 3 just fine. Sure, I can't get the glorious blinding bloom or shiny water, but are such things a necessity? In fact, it's shoddy ports and programming that you should be worried about, not the hardware.

I find it ironic that people cry over Crysis' over-the-top specs and cry about upgrading costs at the same time. You want the best? Be prepared to fork out the loot.
 

Agiel7

New member
Sep 5, 2008
184
0
0
Richard Groovy Pants said:
for the cost of a top-of-the-line Falcon gaming PC, about $8000,
This made me shoot Sunny Delight (now with more orange!) out of my nose.
A good top-of-the-line rig nowadays costs around $1,250 or so.
Don't believe me? Check this out: http://reviews.cnet.com/desktops/falcon-northwest-mach-v/4505-3118_7-33370265.html

I'm not saying that for my personal gain; I'm saying it because this is one of the main reasons why PC developers have alienated gamers. These days, PC ports are almost afterthoughts compared to their console counterparts. Console games have suddenly grown in favor in the eyes of developers because PC games are too difficult to develop for, given the overwhelming variety of hardware they have to support.
 

Simriel

The Count of Monte Cristo
Dec 22, 2008
2,485
0
0
I agree. I recently bought a brand new gaming PC (it's about 8 months old now) and I have since moved to console gaming, for the simple fact that my 'new' PC doesn't run new games all that well. I mean, honestly, I shouldn't have to change my hardware every 3/4 of a year just to play new games. It's silly. In fact, the only reason I'm going to even USE my PC for gaming in the near future is KOTOR 3, Diablo 3, and DOW2. Everything that's multi-platform will be bought for my new Xbox (which came with three games and was cheaper than my PC... MUCH cheaper, and it'll still be useful for another couple of years!)
 

theultimateend

New member
Nov 1, 2007
3,621
0
0
Richard Groovy Pants said:
for the cost of a top-of-the-line Falcon gaming PC, about $8000,
This made me shoot Sunny Delight (now with more orange!) out of my nose.
A good top-of-the-line rig nowadays costs around $1,250 or so.
My PCs have cost me roughly 800 dollars each. Every time so far they have lasted me something like 5 years, and during those 5 years I 'might' make 200 dollars in upgrades total.

I think 1250 is if you are feeling rather insecure about your epeen :).

But I do appreciate the rest of your post (you being the Sunny Delight guy).

Anyone who thinks gaming hardware development should be halted needs to go back to school. There is no reason to slow the advancement of technology; it does nothing to hamper the creativity of game developers. Two of the best games I own are still games you can play on a 6-7 year old system.

Namely XCom and Master of Orion II.

In fact, when you take cost of development and place it next to quality, I can't name a game from a major developer in the last 5 years (OK, maybe no more than 5 games) that has had a reasonable level of quality compared to its cost of development.

Developers feel (erroneously) that they must make their games use every last inch of the PC; Crytek is a wonderful example of this fallacy. But frankly, people will spend 20 hours playing Bloons Tower Defense 3 on a Flash website made with Paintbrush graphics.

People want quality of content far more than they want quality of visuals. The latter is just the frosting on a well-baked cake. The world's best frosting can't make rancid dog shit acceptable (especially once that frosting melts in your mouth and the shit is still there).

A good example is Spore: once the frosting of visuals melts, most people are left with a very bad taste in their mouth (though I'll admit some people love rancid dog shit).

Simriel said:
I agree. I recently bought a brand new gaming PC (it's about 8 months old now) and I have since moved to console gaming, for the simple fact that my 'new' PC doesn't run new games all that well. I mean, honestly, I shouldn't have to change my hardware every 3/4 of a year just to play new games. It's silly. In fact, the only reason I'm going to even USE my PC for gaming in the near future is KOTOR 3, Diablo 3, and DOW2. Everything that's multi-platform will be bought for my new Xbox (which came with three games and was cheaper than my PC... MUCH cheaper, and it'll still be useful for another couple of years!)
I know this is going to sound childish as a response, but I don't believe you. Could you provide a spec sheet, the price of the PC, and the location of purchase? Because unless you got a 300 dollar eMachine, I can't believe it.

My first PC out the door was 300 dollars (about 8 years ago), and I was able to upgrade it with small purchases over the next half decade so that it played everything that came out (mind you, not at full graphics, but I'm not an epeen whore either).

So I'm having a strong level of difficulty accepting that you got a brand new gaming (keyword: gaming) PC and it's outdated in 8 months.
 

Lobsterkid101

New member
Nov 10, 2008
75
0
0
The reason console games have been gaining graphical ground on PCs is that the hardware is the same across all systems *hard drives don't count :p*. This means developers can optimize for the system without going through multiple APIs *the software layers that let the game talk to different types of hardware without a sort of language barrier, if you catch my drift*.

APIs drastically slow down processing speed; it's like needing a translator to turn English into Chinese for you and then back again... it's tedious. With consoles, there's no need for that translator, as the hardware and the software are essentially speaking the same language.
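
To put that translator analogy into code, here's a rough sketch *completely made-up names here, not any real graphics API or driver, just to show the extra hop*: on a PC, every draw request goes through a generic interface and a driver behind it translates the request for whichever card happens to be installed, while a console can call its one known piece of hardware directly.

```cpp
// Rough illustrative sketch only -- hypothetical names, not a real API.
#include <cstdio>

// PC path: the game talks to a generic interface...
struct IRenderer {
    virtual void drawTriangles(int count) = 0;
    virtual ~IRenderer() = default;
};

// ...and a vendor driver behind it translates every request into
// commands for whichever card is actually installed.
struct VendorADriver : IRenderer {
    void drawTriangles(int count) override {
        std::printf("VendorA driver: translating and submitting %d triangles\n", count);
    }
};

// Console path: one known piece of hardware, so the game can call
// the equivalent of this directly, with no translation step.
void consoleDrawTriangles(int count) {
    std::printf("Console: submitting %d triangles straight to the GPU\n", count);
}

int main() {
    VendorADriver driver;
    IRenderer* pc = &driver;      // PC: extra layer of indirection on every call
    pc->drawTriangles(1000);
    consoleDrawTriangles(1000);   // console: direct call
    return 0;
}
```

It's a huge oversimplification *real drivers do far more than one indirect call*, but that extra layer sitting in the middle of every single request is the "translation" overhead I'm talking about.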

So I suggest, instead of "freezing" the pace of hardware development *which I think would be a step in the wrong direction; I mean, progress IS progress, why the hell would we want to stop that?*, we could STANDARDIZE GPUs, CPUs, etc. to an industry standard, much like the USB port and other similar devices.

Then, since developers would be targeting the same type of architecture, just in more and less powerful versions, they could really optimize the hell out of their games, pushing computer games far into the next NEXT generation...

Take Crysis as an example: it takes a PC liquid-cooled with holy water to run it on very high settings... However, if it were optimized specifically for one standard architecture, it would take around *and I'm estimating here, but it's somewhere in this range* 1/5 to 1/6 the power to run it.
 

RufusMcLaser

New member
Mar 27, 2008
714
0
0
Count me with the crusty, fossilized old bastards who think bling-y graphics are killing gaming in general, and PC gaming in particular. Shamus Young, he of Stolen Pixels (among other things), has this to say: [http://www.shamusyoung.com/twentysidedtale/?p=612]. Go read it.

Done?

I'm with him. If PC gaming got away from pushing the bleeding edge of graphics and focused on things like, say, gameplay and accessibility and compatibility, the world would be a beautiful place, with unicorns and free ice cream and rainbows and shit.
 

MercFox1

New member
Jun 19, 2008
131
0
0
"PC Gaming: Should there be a freeze on technology and hardware development?"

What? Of course not. Hell no. Why would you stall technological development for ANY reason? Ultimately, the game developers determine what level of hardware is required to run a game. You mention Crysis, but I counter with Call of Duty 4.

I've had Call of Duty 4 running on an ATI Radeon 9800 Pro from 2003, at 1152x864, with what I would consider a "Medium" level of detail. A 6-year-old card, with a game that was 2007's Game of the Year. This was without overclocking. It was speedy enough to play online, more than fast enough to complete the campaign without a hitch, and it was also used to play with my younger siblings over LAN.

The "need" to update every 3 months, 6 months, or 9 months (whatever ludicrous duration it changes to next) is fictional. A person with an 8600 GT, enough RAM (4GB of DDR 800 RAM is ridiculously cheap at the moment), and even a low to medium range dual core processor shouldn't have an issue with 'Mirror's Edge', which came out not even 3 weeks ago. As newer games come out, take the hit and lower the detail from high to medium. You want to play it, right? Even Mirror's Edge will look gorgeous on Medium.

Then when you do feel like upgrading (I did so only two weeks ago [from a 7900 GS to a GTX 260, which skipped the 8000 generation and the 9000 generation of nVidia cards]), it feels like a whole new game when you crank everything up!

EDIT: And, to note, small upgrades (like a 30 dollar RAM expansion back in November, when NewEgg had a sale) have kept my nearly-two-year-old PC on the cutting edge.
 

theultimateend

New member
Nov 1, 2007
3,621
0
0
RufusMcLaser said:
Count me with the crusty, fossilized old bastards who think bling-y graphics are killing gaming in general, and PC gaming in particular. Shamus Young, he of Stolen Pixels (among other things), has this to say: [http://www.shamusyoung.com/twentysidedtale/?p=612]. Go read it.

Done?

I'm with him. If PC gaming got away from pushing the bleeding edge of graphics and focused on things like, say, gameplay and accessibility and compatibility, the world would be a beautiful place, with unicorns and free ice cream and rainbows and shit.
2 posts up I said that too. So love me as well! >_> <_< :D.
 

NXMT

New member
Jan 29, 2009
138
0
0
RufusMcLaser said:
I'm with him. If PC gaming got away from pushing the bleeding edge of graphics and focused on things like, say, gameplay and accessibility and compatibility, the world would be a beautiful place, with unicorns and free ice cream and rainbows and shit.
The whole goddamn industry needs a swift boot to the ass. Except for Nintendo, I think. Every time I think about bitching about graphics whores, I remind myself that Nintendo has reeled in a mountain of cash from their tiny machine with a touch screen that is best known for 2D-based games =P
 

Enigmers

New member
Dec 14, 2008
1,745
0
0
My computer from 2004 cost my family maybe $500, and I've recently upgraded it for about $300. AMD Athlon 5000+, 2 gigs of RAM, nVidia GeForce 8600 GT... it runs CS:S at 1280x1024 with pretty much all settings on high, and CoD4 at the same resolution but without AA... it's very playable and cheap.

Apart from Crysis, I'm sure my computer can run anything I throw at it (unless, of course, I throw it so hard it ends up breaking some fragile bits). There's honestly no need to slow down technology when getting THE ABULOOT BEST THING EVAR OMG is totally unnecessary anyway.
 

guardian001

New member
Oct 20, 2008
519
0
0
Agiel7 said:
Fast-forward to 2001 and the release of "Ghost Recon." When I bought it for my PC, my thinking was, "Hey, it's not a PS2 game, it's a computer game, it should work on my PC. Right?"
Your logic seems to have been incredibly backwards way back in 2001... Console games are guaranteed to run the same, because every PS2 is the same as every other PS2. PC games, on the other hand, come with that long list of specifications telling you the minimum requirements.

As for how expensive the technology is, NOBODY will need a computer like that for gaming, at least not anyone within the next couple dozen years. Not even people playing Crysis games. Realistically, you will probably need a computer that costs about $2000 to run most games at max settings, or as low as $1000 for lower settings. Buying that Falcon PC would be like buying a car that came with a jet engine installed. Would it be awesome? Yes. Would you realistically ever need it or be able to use it? No. Do you need 12 gigs of RAM, dual video cards, every peripheral port known to man, and the fastest commercial processor? No, you just want it.

So no, computer hardware development shouldn't come to a screeching halt. If you don't want to upgrade, chances are you don't need to. Even if you do need to upgrade, it will probably be individual components, not the entire PC.