PC Gaming: Should there be a freeze on technology and hardware development?

Recommended Videos

DGMavn

New member
Jan 31, 2009
4
0
0
Agiel7 said:
So my question is this: When PC hardware has left consoles in the dust in terms of technology, should PC developers stall on technology and hardware development for the sake of the area where console platforms excel (cost and functionality)?
What do you mean by PC developers?

If you mean the people who write games for the PC platform, how do they affect cost and functionality of hardware? They can only affect the software they write, and in any economic system it would be unwise for them to NOT program to the limits of current systems' capabilities.

If you mean the resellers who put together the PCs, like Dell and Falcon, then build your own computer and stop paying premium prices.

If you mean the people who are making the hardware, like NVIDIA, Intel, AMD and such, then you're asking that we stop technological development because "I'm poor and I want better games waaaaah." PCs are not just gaming consoles, and CPUs and GPUs have uses other than playing Crysis. Folding@home, anyone? Not to mention business and scientific applications.

In conclusion,

- Stop being poor
- Stop whining about being poor, and
- Buy a new GeForce card and cure cancer.
 

theultimateend

New member
Nov 1, 2007
3,621
0
0
Credit to the OP.

It's rare that I see this many people in agreement in a thread about a topic on The Escapist.

Albeit it's against the OP's views, but still. Uniting a people ;).
 

ratix2

New member
Feb 6, 2008
453
0
0
hardware drives software development.

Both the PS3 and 360 have APIs; whoever said otherwise, you are an idiot. The 360 uses a DirectX-based API that Microsoft calls Direct Xbox, which sits somewhere in the middle of DX9 and DX10 (closer to DX9, though, as the DX10 parts are mostly to do with unified shaders; it does NOT support the geometry shader that DX10 does, if I'm not mistaken). The PS3, on the other hand, uses a custom proprietary API that Sony developed, based off of OpenGL 2.0. Both systems also reserve resources for the system that can NEVER be used for a game: for the 360 it's 32MB of RAM and one thread; the PS3 reserves 96MB of system memory and one SPE.
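To put rough numbers on that reservation point, here's the back-of-envelope arithmetic in Python. The totals and reservations below are as quoted in the post above, not verified against official docs:

```python
# Back-of-envelope sketch using the figures from the post above;
# the memory totals and OS reservations are as quoted, not verified.

XBOX360_TOTAL_MB = 512      # unified memory pool
XBOX360_RESERVED_MB = 32    # claimed system reservation

PS3_SYSTEM_MB = 256         # XDR main memory (VRAM is a separate pool)
PS3_RESERVED_MB = 96        # claimed system reservation

def usable_mb(total_mb, reserved_mb):
    """Memory left over for the game after the system takes its cut."""
    return total_mb - reserved_mb

print(usable_mb(XBOX360_TOTAL_MB, XBOX360_RESERVED_MB))  # 480
print(usable_mb(PS3_SYSTEM_MB, PS3_RESERVED_MB))         # 160
```

Either way, the point stands: a fixed slice of console memory is never available to the game, whereas on a PC the OS overhead varies from machine to machine.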

Crysis runs WONDERFULLY for me on an 8800GT and an E6600 with everything on Very High; it doesn't require an LN2-cooled supercomputer. And if you actually give it a chance and play it, it's one of the best shooters to come out in a long time. But I'll agree that this push to one-up the last guy with graphics is killing the gaming industry.
 

Aries_Split

New member
May 12, 2008
2,097
0
0
Lobsterkid101 said:
This means that they can optimize the system without going through multiple APIs *the software that allows the game to talk to different types of hardware without there being a sort of language barrier, if you catch my drift*
Did you miss
Eggo said:
Please, oh, please just read this. Over. And Over. Again.

http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
http://en.wikipedia.org/wiki/DirectX
 

Wyatt

New member
Feb 14, 2008
384
0
0
I've been playing PC games for years now; the last console I bought was the PS1, and I didn't own that long before I got my first PC, so I've been around for a while. I can say that PCs are like cars: some people like to tinker with them and get top-of-the-line performance out of them, and some people just want something to 'drive to work' with and don't want to have to screw around with it a bunch. I fall somewhere in between; I don't enjoy screwing with my computer hardware or software, but I know enough to fix most issues. Anyhow, I say that no, hardware shouldn't stand still, but as some have said here, you don't NEED the tippy-top best hardware money can buy, and you damn sure shouldn't ever buy one of those 'gamer rigs' at that absurd price. I know... I did, once. I bought an Alienware computer (two years before the Dell buyout) and what a waste of cash. It cost me about $4K total for everything, and aside from the fact it didn't work right out of the box, some moron forgot to plug the friggen optical drive into the right port at the factory, and another moron must have missed it during one of those "14 quality checks" they bragged about in their sales pitch. After a total of 68 hours on the phone with tech support and 4 months of them sending me random parts that would 'be sure and fix it this time', I finally paid the local Geek Squad to at least find out what was wrong with the fucking thing. Heh, the guy just plugged the drive into the right port and didn't even charge me for it.

That wasn't the real issue, though. Thing is, when I finally got the computer running, it was TOO new. Most games hadn't caught up with the hardware yet, so I was forever having to shut down one of the cores (it was when dual-core was still new and the shit), roll back drivers, and a whole bunch of other things, and when I DID get a game to run... meh... I seldom noticed any real difference in anything I played. I already HAD a computer that was 'optimized' for all the games that were out, and the new top-of-the-line machine was basically wasted for the few years it took the software to catch up to its specs.

Moral of my story? I had just as good a computer from Gateway for my first 3 machines as I ultimately did with the Alienware rig, for half the cash, and it's MUCH better to buy a computer that is a 'generation' or two away from the top of the line. Not only is it much cheaper, it's also gonna work much better with the current state of the art in software. PLUS there is always the problem of getting stuck with hardware that turns out to be ultimately shit. By the time my 'dual-core' Alienware was actually broken in, all the new systems were quad-core. Seems like dual-core was a fluke, kinda like Vista, and I'd have been smarter to wait 6 months or so and just go for the quads. The dual-core 'generation' lasted about 20 minutes total, and I was stuck with something that was neither fish nor fowl.

Never buy the stuff that costs the most, never buy the newest, and for damn sure don't buy a PC if you're not willing to tinker with it; just stick to being a console tard and ride someone else's bus for the rest of your life.
 

mooncalf

<Insert Avatar Here>
Jul 3, 2008
1,164
0
0
Dammit, I haven't drunk every kind of soda there is, but they keep releasing new sodas... They should stop releasing new soda so I can catch up on my drinking. :(

Old soda tastes better anyway. :(
 

KSarty

Senior Member
Aug 5, 2008
995
0
21
People always exaggerate about this stuff, and it annoys me to no end. I recently upgraded my video card, and only my video card, for $200. Everything else in my PC is the same hardware I installed 3 years ago, and I can play Crysis on all high settings. My previous video card was an X1800 and it could still play Crysis on low-medium settings. If your computer can't handle a game on full settings, then lower your settings; it won't kill you to not have a PC that can burn through everything. If you seriously built a PC 8 months ago and it can't handle new games at all, then you bought shit hardware to begin with; that has nothing to do with the speed of improvement. Fact is, building a top-of-the-line rig should last you at least two years, probably longer if you can deal with lower settings for a while.

I relish knowing that when the time comes for me to do a full system upgrade, it will be many times more powerful than my current rig. Why would I want that process to slow down?

EDIT: Also your assumption that this sort of thing started in 2000 or so is wrong. I started playing games on the PC in '91 or '92, and I distinctly remember my father upgrading our system for Wing Commander 2. Also, when Wing Commander 3 came along in '94, we needed another upgrade because the game only came in CD version, and we didn't have a CD-ROM.
 

DirkGently

New member
Oct 22, 2008
966
0
0
Lobsterkid101 said:
The reason why console games have been gaining graphical ground on PCs is because the hardware is the same across all systems *hard drives do not count :p* This means that they can optimize the system without going through multiple APIs *the software that allows the game to talk to different types of hardware without there being a sort of language barrier, if you catch my drift*

APIs drastically slow down the processing speed; it's like needing a translator to translate English into Chinese for you and then back again... it's tedious. With consoles, there's no need for that translator, as the hardware and the software are essentially speaking the same language.

So, I suggest, instead of "freezing" the pace of hardware development *which I think would be a step in the wrong direction, I mean, progress IS progress, why the hell would we want to stop that?* we could instead STANDARDIZE GPUs and CPUs etc. to an industry standard, much like the USB port and other similar devices.

Then, since developers have the same type of architecture, just more and less powerful versions of it, they can really optimize the hell out of their games, pushing computer games far into the next NEXT generation...

Take Crysis as an example: it takes a PC liquid-cooled with holy water to run it on Very High specs... However, if it were optimized specifically for that type of architecture, it would take around *and I'm estimating here, but this is somewhere in the range of* 1/5 to 1/6 the power to run it.
I'm quite irritated. I wrote this extremely wonderful post explaining why standardizing this particular subset of technology wouldn't do very well, as well as your wrong use of the term "API", how extremely unfeasible standardizing the hardware would be, and how an API for the software end would be redundant because of DirectX. Fuck me in half, I hate accidentally hitting the refresh button. And also how it would not take lesser specs to make Crysis run beautifully on lower-end computers, because standardizing wouldn't magically make all graphics cards so much more powerful. I mean, I suppose in theory, considering many worlds, what would be a top-of-the-line graphics card to us would be a standard or even low-end thing to them.

Of course, that's all 'maybes' and 'could be's'.

Uh, forcing technology to stagnate would be a completely idiotic idea. Getting a standard to actually happen would require picking one chipset over the other, or getting another one made up entirely, which would be extremely costly on top of replacing all the other graphics cards. Also, one would need emulation software, or to maintain an older card, to use things designed pre-standard.

The mention of a hardware standard is only because there already is a nice little software bit, and it's called DirectX.
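For what it's worth, here's a minimal sketch of what an API in this sense actually is: one interface the game codes against, with a backend per platform behind it. Every class and method name below is invented for illustration; the real Direct3D and OpenGL interfaces look nothing like this.

```python
# Illustrative toy "graphics API": game code talks to one interface
# and never needs per-hardware rewrites. All names here are made up,
# not real DirectX/OpenGL identifiers.

class RenderBackend:
    """The contract every driver/backend must satisfy."""
    def draw_triangles(self, count: int) -> str:
        raise NotImplementedError

class VendorABackend(RenderBackend):
    def draw_triangles(self, count: int) -> str:
        return f"vendor-a: drew {count} triangles"

class VendorBBackend(RenderBackend):
    def draw_triangles(self, count: int) -> str:
        return f"vendor-b: drew {count} triangles"

def render_frame(api: RenderBackend) -> str:
    # The game only ever calls the interface; the "translation" is a
    # single dispatch, not a wholesale re-encoding of all the work.
    return api.draw_triangles(1000)

print(render_frame(VendorABackend()))
print(render_frame(VendorBBackend()))
```

This is also why the "translator" analogy overstates the cost: the abstraction boundary is a thin dispatch layer, and the heavy lifting still happens in the driver and on the GPU itself.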
 

Draygen

New member
Jan 7, 2009
152
0
0
Hell, I'll take a few shots at this absurd OP claim, since everyone else is doin' it!

Freezing technology is the most ass-backwards and contrived concept ever. The very thought of stopping the development of new and better things simply confounds the human psyche. You'd be the caveman telling folks to quit making spears because you still hadn't gotten the hang of the sharp-rock thing.

On a more specific note, I got an "outdated" video card a couple months ago for about 1/4 of its original price, and only because about two weeks earlier they came out with a better card for far more money. If you can deal with being just one step down from 'top-shelf', you can usually get by pretty cheap.
 

Raven28256

New member
Sep 18, 2008
340
0
0
I agree with many of your points. The start of your story was a lot like my own. Back in the day, I was a happy PC gamer and never had trouble, and I never even knew much about upgrading. Then, eventually, games started kicking my PC's ass, so I started to learn more about upgrades. I must also agree with you about the annoyances of upgrading. I'm in college now, so I don't exactly have a lot of spare cash to throw around. My current PC is four years old and is getting really dated, and I can't exactly afford to start throwing upgrades at it right now.

HOWEVER! I disagree that we should "freeze" technology. Let it keep advancing, we don't really need a lot of it until a few years later anyway. That said, what SHOULD be done is this:

Developers need to start focusing more on making "mid-range" games.

It seems that so many developers only care about pushing the power of the average PC gamer's rig. Too many developers focus too much on making graphics as realistic as possible, and other related features that would make the typical NASA supercomputer piss its panties. It seems that PC developers have this idea that gameplay doesn't matter so long as the graphics are amazing, so they pour 85% of the massive budget into graphics.

I want to see more games that focus less on graphics and more on gameplay. We need more releases like, say, Sins of a Solar Empire. The game had merely a moderate budget. The graphics weren't all that amazing, but the point is that you didn't need a really powerful rig to run it. Likewise, most of the effort was put into the actual gameplay. Graphically, it was far from the best game around at the time, but it was so polished in terms of gameplay.

More developers need to take this approach. Lower the graphics budget, focus more on gameplay. Not every PC game needs to push the most up-to-date technology to its limits. Having merely above-average graphics is FINE if you focus on the gameplay! Besides, think of all the money they would save. It will be much easier to make a profit if the budget reserved for the graphics and physics engine alone isn't enough to finance a small country's military, like it is in so many high-profile releases.

Sadly, too many gamers have the "lol i not gunna by this teh graphxx suk" attitude, making this approach harder. I also imagine that Mr. Overcompensation and his $10,000 rig would be considering suicide over buyer's remorse...
 

Turtleboy1017

Likes Turtles
Nov 16, 2008
865
0
0
Hey, whether you like it or not, tech is going to keep on moving. Stop focusing so much on the crazy price of everything, and just find a balance with what you like. I have an Xbox 360, and a PC with decent RAM and CPU and a 9800 GT. Both of the above play games the way they're supposed to; I have them both hooked up to the same 19-inch monitor, and I play all my games at 1440x900 resolution. You don't HAVE to upgrade to the very best; just let NVIDIA and ATI churn out their cards, and enjoy the stuff you have. Hell, Pac-Man had shit graphics but I still play it because it's FUN. Who cares how good a game looks as long as it does what it's supposed to? Let companies keep on producing their shit, and let the community continue to play on what they want to.
 

DGMavn

New member
Jan 31, 2009
4
0
0
Everyone who agrees with the OP is still poor.

You want gameplay? Grab an emulator and Battletoads and leave me to my Fallout 3 and my Team Fortress 2.

Nostalgia makes people look at the past with rose-tinted lenses. Just because the games were old doesn't mean they were any better gameplay-wise. For every X-Com: UFO Defense there were 20 shovelware games because games were smaller and those games were easier to produce. "Old games are better than new games" is a flawed assumption. So is "freezing hardware is good for the consumer."

I'm sorry, this is just the dumbest piece of tripe I've seen generated about games in a long while - and I read YouTube comments.
 

moley

New member
Jun 21, 2008
5
0
0
One fatal assumption you make is that people only purchase new top-of-the-line computers to play games on.

By freezing hardware advancement, you seriously cripple the capabilities of other areas of computing.
 

Dommyboy

New member
Jul 20, 2008
2,439
0
0
Agiel7 said:
Richard Groovy Pants said:
or the cost of a top-of-the-line Falcon gaming PC, about $8000,
This made me shoot Sunny Delight (now with more orange!) out of my nose.
A good top-of-the-line rig nowadays costs around $1250.
Don't believe me? Check this out: http://reviews.cnet.com/desktops/falcon-northwest-mach-v/4505-3118_7-33370265.html
That computer is just overpriced overkill.

I have a $1300 computer and it runs everything I throw at it just fine. There is no need to buy the top of the line of everything; developers will still make games for older computers, because not everybody has a top-of-the-line machine.
 

jamesworkshop

New member
Sep 3, 2008
2,683
0
0
PC gaming has gotten cheaper and more accessible than ever before, whereas consoles have gotten progressively more complex and expensive (PS3). Good examples: DDR2 RAM is about £15 for a gig, and a £130 Radeon HD4850 can play Far Cry 2 on maximum DX10 settings at 1920x1200 at 30-50FPS. I can't think of any other time when a graphics card costing that little could play a modern game maxed out at a high resolution.
Hardware has become this cheap because of the greater number of manufacturing facilities and cost-reduction methods like die shrinks (45nm CPUs).
Look at the prices of hard drives: £85 can get you 1.5TB. You can build extremely good PCs for very little money nowadays, and that is directly due to the constant evolution of the hardware and a competitive market.
 

Squarewave

New member
Apr 30, 2008
229
0
0
It's unrealistic to expect a hardware freeze. What I would like to see is an ISO standard for a gaming computer: some kind of hardware standard that developers could target, so that if your computer was compliant with the standard, the game would play properly. People with higher-spec computers could get more effects and a faster framerate and all that, but people with the minimum ISO spec would still play the game with a good framerate and presentation.

When I first read about "Games for Windows", I was hoping MS would have some sort of requirement like that, but that never happened.

The current problem with computer hardware is twofold in my mind, and an ISO standard would solve both:

1) Computer manufacturers selling computers with low-end GPUs as gaming computers, like Intel or VIA integrated graphics, or even low-end NVIDIA GPUs like the 7050

2) Developers targeting very high-end systems, expecting quad cores and top-of-the-line NVIDIA cards in SLI just to get 30 FPS
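Squarewave's idea could be sketched as a published baseline that a game installer (or a store page) checks a machine against. Everything below is hypothetical: no such ISO spec exists, and the field names and numbers are invented for illustration.

```python
# Hypothetical "baseline gaming spec" check. The spec values and
# field names are invented for illustration, not a real standard.

BASELINE_SPEC = {
    "ram_gb": 2,        # minimum system memory
    "cpu_cores": 2,     # minimum core count
    "gpu_score": 3000,  # some agreed-upon benchmark score
}

def meets_baseline(machine: dict, spec: dict = BASELINE_SPEC) -> bool:
    """True only if every component meets or beats the baseline."""
    return all(machine.get(key, 0) >= minimum
               for key, minimum in spec.items())

# A cheap but compliant build vs. an office PC sold as a "gaming" PC.
budget_gaming_box = {"ram_gb": 2, "cpu_cores": 2, "gpu_score": 3500}
office_pc = {"ram_gb": 1, "cpu_cores": 2, "gpu_score": 800}

print(meets_baseline(budget_gaming_box))  # True
print(meets_baseline(office_pc))          # False
```

A check like this would hit both problems above: manufacturers couldn't slap a "gaming" label on a non-compliant machine, and developers would have a guaranteed floor to target instead of quad-core SLI monsters.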
 

Turtleboy1017

Likes Turtles
Nov 16, 2008
865
0
0
Games are almost having a hard time keeping up with new hardware, it seems... Except for Crysis. I think.
 

Jandau

Smug Platypus
Dec 19, 2008
5,034
0
0
There already is a freeze on technology. Due to the shift of focus to consoles, very few games today will exceed console hardware. Every time a new console generation comes out, there's a brief growth period when it comes to graphics, but it dies down in a few months, and for all practical purposes things pretty much stagnate until the next gen comes around.

Games for PC today are:
1. Console ports
2. Developed for PC and console simultaneously
3. Developed for PC with plans for a console port
4. Developed for PC, but without budget or desire to push the hardware limits

Very few games fall outside one of these 4 categories and the days when you need to upgrade your PC every year are long gone. You just need to stay slightly ahead of the current console generation.

Example: ME! I bought my PC a year ago. It cost me roughly $550. Specs are:

Intel Core Duo 2.20 GHz
4GB RAM
NVIDIA GeForce 8600GT 256MB

This runs anything I've tried putting on it. It doesn't run at the highest possible settings, but it runs on med-high (worst-case scenario) and everything damn sure looks nicer than on a 360 (or a PS3, for that matter). Fallout 3, Crysis, any console port, etc. Heck, when I installed the DoW2 beta it defaulted to High settings and it purrs like a kitten.

Yeah, I'm sure you can find a "top-of-the-line" PC for $10k, but what in the world would you need it for? My bucket cost me $550 a year ago, it's even cheaper now, and it provides me with better hardware and performance than any console.

As for people buying a PC and having it turn useless in a few months: there's a difference between a crappy cheap-arse PC and a best-buy PC.

So, to summarize a bit:
1. No further "freeze" on tech should be put in place. Consoles are enough of a speedbump.
2. PC gaming is no more expensive than consoles when it comes to hardware.
3. If you're buying a PC, don't be an idiot and do some research. It'll stop you from shooting yourself in the foot.
 

karn3

New member
Jun 11, 2008
114
0
0
Are you insane?! You can't stop technological progress, nor can I think of any reason you would want to. You don't need a top-of-the-range gaming rig to play everything. For about £400 (about $750) you can build a machine that will play every game currently available with the shininess turned on, save Crysis, obviously. Don't forget also that the more progress is made, the more the previous generations drop in price. Games don't often take advantage of the latest top-of-the-range gear because developers know their customers don't have access to it. My gaming rig cost me £600 about three years ago, and I can still play all modern games on the highest graphics settings. I can even play Crysis with most of the settings on high; in case you don't know, there are Ultra High settings for everything. This rig today costs under £400. All thanks to technological advancement pushing the prices of "older" technology down. If you went out to build a rig with the budget I initially had three years ago, you could build a hell of a computer.