PC Gaming: Should there be a freeze on technology and hardware development?


Asehujiko

New member
Feb 25, 2008
2,119
0
0
Agiel7 said:
How many people entering college now can claim to have been hardcore PC gamers since the mid-90s? The era brought us classics like "Myst," "Return to Castle Wolfenstein," and "X-COM: UFO Defense." I remember being given my first actual video game outside of educational software, "Star Wars: TIE Fighter," back in '96 (I was seven back then). Since then, I've played the first two "Rainbow Six" games, the old Jane's combat simulations like "Longbow," "Advanced Tactical Fighters," and "F-15," the original "Half-Life," "Operation Flashpoint," "Starcraft," "Diablo 2," the first two "Fallout" games, "Mechwarrior 3," and "Homeworld." In all those years, not once did I get migraines and aneurysms from CTDs and error messages, and not once did I have to upgrade my computer (the machine I used for gaming had a 600 MHz Intel processor, 128 MB of RAM, and a 4 MB video card). Hell, I was too young in those days to even know that mattered.

Fast-forward to 2001 and the release of "Ghost Recon." When I bought it for my PC, my thinking was, "Hey, it's not a PS2 game, it's a computer game; it should work on my PC, right?" Wrong. Imagine how perplexed I was when I received a "General Protection Fault" error screen after double-clicking the shortcut icon. It took an angry call to technical support to discover that my 4 MB video card wasn't up to the task, so I grovelled at the feet of my parents to buy a new video card so I could get back to gaming. Back in those days, you could get a GeForce 2 for about 50 dollars on sale. Since then, I've purchased three video cards (a GeForce 4, a GeForce 5600, and a GeForce 7600GT) and three CPU-and-motherboard combos (800 MHz, 1.7 GHz, and 3.2 GHz), and I upgraded my RAM four times before finally getting a gaming laptop as a high-school graduation present last summer.

Recent years have made me realize why console gaming has begun to gain favor over PC gaming. The cost of hardware is becoming nothing short of astronomical (for the cost of a top-of-the-line Falcon gaming PC, about $8,000, you could get a second-hand 998cc Yamaha R1 sport bike, or send your kid to a state university for a year). In addition, increasingly complex game engines bring an increasing number of things that can go wrong with a game, from hardware incompatibility (exacerbated by the insane variety of hardware today) to processing conflicts; as a result, most PC games are buggy messes. Let's not forget the breakneck speed at which the technology progresses: by the time I bought a GeForce 7600GT, the 9 series of Nvidia cards had already been announced. Sometimes engine technology outpaces the actual hardware; take "Crysis." For me, that game represents everything that is wrong with PC gaming today: an overly flashy engine with no class or finesse to back it up (with Crytek being a German developer, I suppose Germany's game developers are just like their automobile companies :p), one that laughs at even $5,000 gaming desktops and has more bugs than an anthill.

So my question is this: When PC hardware has left consoles in the dust in terms of technology, should PC developers stall on technology and hardware development for the sake of the area where console platforms excel (cost and functionality)?
You can get that Falcon for a tenth of the price if you build it yourself instead of buying it premade. I'm currently looking at buying something of similar quality for $950, and I'm waiting for the post-Christmas price drop so I can get it below $800.

Then there's the Crysis fallacy. No matter how much Crytek keeps yapping about how it's supposed to be a benchmark for PCs, IT ISN'T. It's badly optimised in most places and not at all in others, and the higher graphics settings are riddled with memory leaks, which makes it IMPOSSIBLE to run them for extended periods on ANYTHING without community fixes, not even university-grade supercomputers (we tried, and it ground to a halt in less than six hours on the full settings). Once you do get the community fixes, it runs at 35 fps on a 2.8 GHz single core with 1 GB of RAM and a 6800 GTX.

And no, progress should not be stalled simply because you feel the need to run everything on max and buy new parts the moment something slows down a tiny amount. Use medium settings and update your drivers.

Also, look at the system requirements on the back of the box before buying.
 

Bloodbane15

New member
Jan 31, 2009
14
0
0
For years my computer was upgraded from the scrap lying around at my dad's work (he's a photocopier tech) or given to him from his customers' throwaways. It worked, and it played all the games I wanted to play. I ended up with a 2.4 GHz CPU, 512 MB of RAM, and an onboard GPU before I finally upgraded with some store-bought hardware.

It cost me about $450, which is less than an Xbox 360 and way less than a PS3, to get an AMD Phenom 9500 (2.2 GHz quad core) for $199, 2 GB of DDR2 RAM for $60, a 512 MB 8600GT for $90, and an Asus AM3 motherboard for $100.

I can run Crysis on high graphics (DX9) and have had absolutely zero problems with any other game. If that's not good enough for you, then I say go run off and join the countless millions of console fanboys; the PC community doesn't need your type.
 

EzraPound

New member
Jan 26, 2008
1,763
0
0
Generally speaking, hardware that's not in complete flux favours game development (think of the mileage extracted from the SNES, for example, then compare it to all the CD-based systems like the CD-i and 3DO that sold purely on the introduction of disc technology), but it's not as if you can just issue a decree to stop it.
 

DGMavn

New member
Jan 31, 2009
4
0
0
Jandau said:
There already is a freeze on technology.
No there isn't. I'm pretty sure NVidia, Intel and AMD are still trying to make better hardware, and that game writers are still pushing the boundaries of consumer-grade technologies. You're pretty wrong.

Squarewave said:
It's unrealistic to expect a hardware freeze; what I would like to see is an ISO standard for a gaming computer.
First of all, ISO does not and never will give a fuck about gamers.

Second of all, if you're not updating this standard constantly (read: quarterly) to keep up with the grade of technology being produced, then you're doing gaming a disservice by forcing developers to cater to the lowest common denominator in terms of computing power. You're also going to have minimum specs change during the development lifecycle of a game. Say I start developing a game in '09. Since it's my flagship IP, development takes three years. (My company's like Valve, except we don't leak source.) At the end of the development cycle, we have a game that's programmed to run on three-year-old bargain computers. How is that good for anyone, be it the consumer or the manufacturer?

Stop acting like graphics quality is the devil. Graphics quality improves the immersion of a game. And every time a Crysis-level game comes out, people complain about how they need to upgrade their machines to run it and then a year later, a $500 machine can run it on at least passable settings.

This entire thread fails.
 

Jandau

Smug Platypus
Dec 19, 2008
5,034
0
0
DGMavn said:
Jandau said:
There already is a freeze on technology.
No there isn't. I'm pretty sure NVidia, Intel and AMD are still trying to make better hardware, and that game writers are still pushing the boundaries of consumer-grade technologies. You're pretty wrong.
You're acting like a jackarse, you know that?

First, way to quote one sentence out of context.

Second, if you bothered to read the rest of my post, then MAYBE you'd see what I meant by that. Go, read it, I'll wait...

Assuming you read the rest of my post (and not just the first sentence), you would have figured out what I quite plainly said. Consoles slow down the progression of hardware requirements, essentially putting a "freeze" on the hardware NEEDED to play games.

We are all well aware that new cards keep coming out. However, new games that demand those cards are NOT coming out, and hardware over a year old (often much older) can still run games at high settings.

Now, if you managed to wrap your brain around what I actually said (instead of what you imagined), maybe you can reply with something actually relevant to my previous post?
 

Calax

New member
Jan 16, 2009
429
0
0
I might have agreed with the OP... back in the early 90s, when a computer's speed and other benchmarks doubled every, what, six months? Hardware advancement has slowed significantly since then because of technical limitations that hardware developers are fighting against. Things like dual cores and quad cores are nice, but most games aren't exactly designed for them (the new ones work fine, but try to play anything from before '06 and you'll have problems, because those programs don't know how to deal with the extra processors).
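For what it's worth, the classic workaround for those pre-'06 single-threaded games is pinning the process to one core. As a rough sketch (the helper function is mine, not from any particular tool), the affinity mask that facilities like SetProcessAffinityMask on Windows or `taskset` on Linux expect is just a bitmask with one bit set per allowed logical processor:

```python
def single_core_mask(core_index: int) -> int:
    """Affinity bitmask selecting exactly one logical processor.

    Old single-threaded games sometimes misbehave on multicore CPUs;
    pinning them to a single core is a common fix. Bit N of the mask
    corresponds to logical processor N.
    """
    return 1 << core_index

# Core 0 -> mask 0x1, core 2 -> mask 0x4
print(hex(single_core_mask(0)), hex(single_core_mask(2)))
```

On Linux that mask would be used as, e.g., `taskset 0x1 ./old_game` (hypothetical game binary); on Windows, via Task Manager's "Set Affinity" or `start /affinity 1`.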

I think PCs will always have a home in gaming. It's just that consoles are easier to market to families wanting to keep the kids in check. While a computer can cost upwards of $500 for something decent, the highest-priced console is $600, and normally they run at about $250. *Shrugs* I think PC gaming will come back. They always come back for more.
 

Simriel

The Count of Monte Cristo
Dec 22, 2008
2,485
0
0
Codgo said:
Simriel said:
I agree. I recently bought a brand new gaming PC (it's about 8 months old now) and I have since moved to console gaming, for the simple fact that my 'new' PC doesn't run new games all that well. I mean, honestly, I shouldn't have to change my hardware every 3/4 of a year just to play new games. It's silly. In fact, the only reasons I'm gonna even USE my PC for gaming in the next while are KotOR 3, Diablo 3, and DoW2. Everything that's multi-platform will be bought for my new Xbox (which came with three games and was cheaper than my PC... MUCH cheaper, and it'll still be useful for another couple of years!)
You must have got a piece of junk Hell PC if it only lasted 8 months.
I mean it now doesn't run high-end games at full spec.
 

Bob_Bobbington

Senior Member
Oct 27, 2008
645
0
21
I only upgraded to an actual gaming PC about two years ago. Before I did, my old PC, which was nothing special at all, ran all recently released games fairly well, even if only on low or medium. IT LASTED FIVE OR SIX YEARS. The only reason I upgraded was that it was starting to get sluggish and I wanted to get Crysis. My new 'gaming' PC cost about $2,000 AUD.

The idea that development on tech should stop is very simple-minded. If you buy half-decent stuff to begin with, a computer should last four to five years.
 

Sewblon

New member
Nov 5, 2008
3,107
0
0
If they don't keep developing new technology, games won't get any better. Technological advancement is the lifeblood of the industry. And you can save $2,000 if you build the computer yourself; a computer my dad and I are assembling is sitting behind me right now.
 

shadow skill

New member
Oct 12, 2007
2,850
0
0
The best thing they can do is get rid of the stupid dependency on a general-purpose operating system. They should make an OS designed with gaming in mind and make it freely available. System requirements would go down, while what could be done in a game would go up, preventing all of this insane upgrading. Upgrading would still happen, of course, but at a somewhat slower pace, since developers could get much more out of a system with the minimum specs.
 

shadow skill

New member
Oct 12, 2007
2,850
0
0
Eggo said:
But you aren't actually going to gain much by going to a "lighter" OS. Modern Windows OS's are remarkably adept at being flexible enough to provide oodles upon oodles of power to games.
Modern consoles say hi. Honestly, do you think they would be able to do as much as they do if they had to support the kind of OSes people typically use for other, more generalized tasks? The consoles are running on what would be the equivalent of a shoestring and bubble gum if we were talking PC components. I also can't count how many times I have come across a game whose minimum requirements I supposedly met, only to have it run like absolute molasses.

It's not just about a lighter memory footprint; it's about tuning the system to the specific needs that games have. Of course, there are other theoretical side benefits, like ending one particular company's de facto domination of the PC game market. Bullshit stunts like Halo 2 being Vista-only, or Crysis having settings enabled only on Vista that would work perfectly well on XP. People should not have to pay for an OS just to play a game when they already have the hardware needed to play it.

The only time I use Windows on my machines is to play games (with the exception of Quake IV, since that works under Linux natively); the kind of work I like to do on my machine does not require Windows at all. I would love to see the day when all I needed to worry about was having the appropriate hardware to play a game, instead of worrying every five or six years about buying a new OS when the one I use 95% of the time works just fine!
 

kingcom

New member
Jan 14, 2009
867
0
0
I'm stunned; you do realise consoles require hardware? So if we stop updating PC hardware, we stop updating console hardware too. That is mind-bogglingly short-sighted. As technology updates, older components only get cheaper. 4 GB of RAM costs next to nothing nowadays, and very few games will have you running anything that high.
 

Jandau

Smug Platypus
Dec 19, 2008
5,034
0
0
Simriel said:
Codgo said:
Simriel said:
I agree. I recently bought a brand new gaming PC (it's about 8 months old now) and I have since moved to console gaming, for the simple fact that my 'new' PC doesn't run new games all that well. I mean, honestly, I shouldn't have to change my hardware every 3/4 of a year just to play new games. It's silly. In fact, the only reasons I'm gonna even USE my PC for gaming in the next while are KotOR 3, Diablo 3, and DoW2. Everything that's multi-platform will be bought for my new Xbox (which came with three games and was cheaper than my PC... MUCH cheaper, and it'll still be useful for another couple of years!)
You must have got a piece of junk Hell PC if it only lasted 8 months.
I mean it now doesn't run high-end games at full spec.
Neither do the consoles...
 

veloper

New member
Jan 20, 2009
4,597
0
0
There have been many technology slowdowns in the past.

I still remember how the GeForce Ti 4200 remained a decent GPU for years. The Radeon 9700 also had a long life. We've only just recently passed the multithreading barrier, where a fast single-core CPU was all you needed.

For years, and to this day, we have been stuck at the 2 GB virtual address space barrier, because games need to run on Windows XP to sell; a game that tried to use more than 2 GB would crash immediately. At the moment, upgrading a gaming PC is rather cheap. Memory is cheap, Core 2s have become cheap, and even the latest AMD Phenoms and X2s offer good bang for the buck. Mainstream Radeon HD 4xxx cards have become fast and cheap.
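For anyone wondering where that 2 GB figure comes from: it falls straight out of the 32-bit address split. A quick sketch of the arithmetic (the constant names here are mine, just for illustration):

```python
# A 32-bit process can address 2**32 bytes in total. Stock 32-bit
# Windows reserves the upper half for the kernel, leaving the game
# only 2 GiB of user-mode virtual address space. Binaries linked
# with the /LARGEADDRESSAWARE flag can see up to 4 GiB when run on
# a 64-bit OS, but a game that had to run on plain 32-bit XP could
# not count on that.
TOTAL_32BIT = 2 ** 32                  # 4 GiB of addresses
USER_VAS_DEFAULT = TOTAL_32BIT // 2    # 2 GiB for the process
USER_VAS_LARGE_AWARE = TOTAL_32BIT     # up to 4 GiB on an x64 kernel

print(USER_VAS_DEFAULT // 2 ** 20, "MiB")
```

So "more than 2 GB crashes" isn't the game being buggy as such; allocations simply start failing once that user-mode half fills up.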
 

Faeanor

New member
Dec 15, 2007
160
0
0
shadow skill said:
I also cannot count how many times I have come across a game that I supposedly met the minimum requirements for, only to have it run like absolute molasses.
Meeting the minimum requirements is very different from meeting the recommended requirements. You can't expect a computer that just barely clears the minimum to make the game look gorgeous and still get good framerates (if it can make the game look good at all).
 

Jumplion

New member
Mar 10, 2008
7,873
0
0
While the OP clearly hadn't met the likes of Eggo and Richard GRoovy Pants until now, he has a point, albeit a slight one.

Personally, I think the PC upgrades too fast for its own good. There's a G58MegaCore24724GraphixChip out on the market, but then a month later it's the G58MegaCore24724GraphixChip-1, and the old one is hardly supported or developed for. So if you're building a PC, you face a huge, convoluted choice of parts, and you don't know which is the "best" choice or what will "maximize" your gaming performance.

Now, I'm not saying PC hardware should stop upgrading entirely, but maybe it could slow down a bit so people actually have a chance to catch up, or so you can squeeze all the juice out of a product before turning out the next orange.

Then again, you'd get a console ;P
 

meece

New member
Apr 15, 2008
239
0
0
I really fail to see why people hate Crysis so much. The thing about games is this: yes, maybe you need some crazy super-beast of a computer to run them on max settings... but you don't *need* a crazy super-beast just to run them.

My comp is about 3 or 4 years old now, cost about $1.2k new, and came pre-assembled. The thing is, I can play all modern games fine, maybe not on max, but then again I don't play games to spot individual pixels and admire photorealistic effects. I play for the gameplay, have no issues with this, and expect another year or two, maybe more, before I actually cannot play games anymore. $1.2k over 4-5+ years isn't exactly bad, especially when I could probably have built it myself for three-quarters of the price or less.

Oh, and Windows does have some incredible tools for dealing with the vast variety of hardware out there. It's mainly not the devs' job to deal with compatibility; it's mostly Microsoft's. So using it as an excuse for why consoles are popular is mostly wrong.