On the poor state PC games are in on release


Owyn_Merrilin

New member
May 22, 2010
Pjotr84 said:
Owyn_Merrilin said:
Pjotr84 said:
JohnnyDelRay said:
It's another reason why some developers, as well as publishers, support always-on DRM. The Willits guy from id said so himself: http://www.escapistmagazine.com/news/view/112187-id-Software-Praises-Always-On-in-Diablo-3
You think so? The piracy angle is much more important in these draconian measures, I reckon. Non-forced patching through a launcher is something we can all - not just the selling parties - benefit from.
It's never really been about piracy; it's about maintaining control of the product, and pushing the idea that software is a service, not a product. If you'll notice, piracy is alive and well on the PC, but the used market, on the other hand...
The whole licensing idea behind games is something I hadn't thought of, but piracy prevention is very much a part of keeping control of your product.
It is to an extent, but it's something they can't really do much about. It makes a beautiful smokescreen for attacking their customers' rights as consumers, though; if it weren't for the excuse of "we're cracking down on piracy," they would never have gotten away with killing the used market, nor would they have gotten away with quite a few other things they've pulled.
 

Rawne1980

New member
Jul 29, 2011
Of course it's just an excuse.

If people refused to buy products that were unplayable on release, then I can almost 100% promise you that the next game they released would work near enough perfectly.

Companies aren't to blame for the shit state of console-to-PC ports, we are. We still buy them, so they still make their money.

They release a game that has issues and, like good cattle, we run down and buy it, then complain on forums like it matters when we all know it doesn't. Yes, I'm mocking myself, because I do it as well.

Although now I've taken to waiting a couple of weeks to buy new releases for my PC; that way most of the larger bugs have been squashed.
 

Pjotr84

New member
Oct 22, 2009
Rawne1980 said:
Companies aren't to blame for the shit state of console-to-PC ports, we are. We still buy them, so they still make their money.

They release a game that has issues and, like good cattle, we run down and buy it, then complain on forums like it matters when we all know it doesn't. Yes, I'm mocking myself, because I do it as well.

Although now I've taken to waiting a couple of weeks to buy new releases for my PC; that way most of the larger bugs have been squashed.
It's like I hear myself talking. Although I didn't buy many games at release to begin with, I have now completely stopped doing so as well. It's just not worth the risk, and it saves me quite a lot of money.

@Owyn_Merrilin

I can certainly agree the piracy card is a means to an end.
 

oplinger

New member
Sep 2, 2010
Pjotr84 said:
Lately there have been several PC releases that were in quite a shoddy state at release (Rage, Deus Ex: HR, etc.). According to many reviews' comments and forum posts, this is due to the nigh-infinite number of combinations of components in computers.

Although this sort of reasoning is generally accepted, I beg to differ, for two reasons:

1. Back in the day games were more stable. Sure, they may not have been as complex, but the diversity in systems was likely equal to the present situation.

2. The variety in parts from one PC to the next is not that great. Since the hard drive, RAM and motherboard don't really matter in this respect, the diversity would be among the CPU, graphics card and sound card. Nowadays everyone has either an AMD or Intel processor and an AMD or Nvidia graphics card. Although there are a lot of different processors, the base architecture all boils down to the same thing. The fact that Nvidia and AMD release just one driver for all their current cards must mean the same goes for graphics cards. How the situation is for sound cards nowadays I don't really know, so I'm leaving that in as a possible cause of our problem.

What's your take on this? Is it just an excuse or is it a valid argument?
You're focusing on hardware. You're forgetting software.

They don't release just one driver for everything; they release a driver package, and it installs the driver for the detected card. The driver is sometimes very shoddily done on some of the lower-powered cards, as it's just the same driver as the high-powered cards with parts disabled (for example, AMD video cards have Overdrive on high-end cards, but some low-end cards can't access it). That can cause problems, because it means your driver can handle a certain function but the card cannot (crashes), or the card can but the driver has no idea what's going on (odd textures, screen tearing), and sometimes they have a workaround coded into the low-end cards which causes them to process more (texture pop-in).
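
To make that concrete, here's a minimal sketch of the kind of capability check a game can do so it doesn't assume every card in the product line supports every feature. The OpenGL calls are standard (and this assumes a GL context is already current); the s3tc extension and the fallback paths are just illustrative:

    // Minimal sketch: ask the driver what THIS card actually supports
    // instead of assuming the whole product line behaves the same.
    // Assumes an OpenGL context is current on this thread.
    #include <GL/gl.h>
    #include <cstring>

    // True if the driver advertises the named extension.
    // (strstr can false-positive on prefix names; good enough for a sketch.)
    bool has_extension(const char* name) {
        const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return all != nullptr && std::strstr(all, name) != nullptr;
    }

    void pick_texture_path() {
        if (has_extension("GL_EXT_texture_compression_s3tc")) {
            // high-end path: compressed textures
        } else {
            // low-end fallback: plain uncompressed textures
        }
    }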

That's just drivers. There's also the OS: not everyone has every update for it, so that can cause countless bugs, because every install can be slightly different in some important ways, like the .NET Framework version, and some security fixes change how the OS handles DirectX and other software.
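
Roughly how a game can defend against that, as a sketch: probe for an optional OS or runtime component at startup instead of assuming every machine has the same updates. LoadLibraryW and FreeLibrary are real Win32 calls; the DLL name is just an example of a redistributable component:

    // Minimal sketch (Win32): check for an optional component at startup
    // rather than crashing later when it turns out to be missing.
    #include <windows.h>

    bool component_available(const wchar_t* dll) {
        HMODULE h = LoadLibraryW(dll);  // succeeds only if the DLL is installed
        if (h == nullptr) return false;
        FreeLibrary(h);                 // we only wanted to know it exists
        return true;
    }

    // e.g. warn the user up front instead of dying mid-game:
    // if (!component_available(L"d3dx9_43.dll")) { /* point them at the DirectX installer */ }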

Then we have to account for third-party software in the background: anti-virus, Steam, your driver front end, a media player if you listen to music, automatic updates in Windows, Xfire... anything that runs in the background is a potential conflict or issue.

So while the hardware may be more or less the same these days, and not radically different by, say, missing instruction sets or something insane, the software can be just different enough to cause problems. It doesn't even have to be anything major, and no amount of testing will make that go away. Hardware was much easier to compensate for... software is not.
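
For what it's worth, when the hardware does differ by an instruction set, that part is at least easy to detect up front. A minimal sketch using the GCC/Clang x86 CPU builtins, with SSE2 as an example requirement:

    // Minimal sketch (GCC/Clang on x86): check an instruction set at startup
    // instead of crashing with an illegal-instruction fault mid-game.
    #include <cstdio>

    int main() {
        __builtin_cpu_init();  // required if run before constructors; harmless here
        if (!__builtin_cpu_supports("sse2")) {
            std::puts("This game needs an SSE2-capable CPU.");
            return 1;
        }
        // safe to take the SSE2 code paths from here on
        return 0;
    }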

While I think they could maybe test a little more, finding some of these issues ahead of time is impossible.

I think we should applaud the developers who actually fix their bugs. Some don't. They just throw it out there and hope for the best. When it doesn't work, they just move on and ignore everyone.
 

JohnnyDelRay

New member
Jul 29, 2010
Pjotr84 said:
JohnnyDelRay said:
It's another reason why some developers, as well as publishers, support always-on DRM. The Willits guy from id said so himself: http://www.escapistmagazine.com/news/view/112187-id-Software-Praises-Always-On-in-Diablo-3
You think so? The piracy angle is much more important in these draconian measures, I reckon. Non-forced patching through a launcher is something we can all - not just the selling parties - benefit from.
Yes, I do think so, especially when they're pretty much announcing that they advocate this themselves. Voluntary patching is a much better option, but it doesn't really excuse anything half-baked. A "patch" is something you put on a finished product, to fix it. So what it comes down to is the state of the game when it comes out - you can't expect every little thing to be tried by testers; that's nearly impossible. But the game should at the very least be completable before you call it "finished".

However, I'm not going to make a big deal out of this, because I haven't actually had that many problems with bugs as of late, and due to tinkering with mods and console commands since the 20th century I've usually (taps wood) been able to work my way around bugs one way or another.
 

Pjotr84

New member
Oct 22, 2009
@oplinger

You certainly bring up a valid point, which accounts for a lot of the differences between systems today. My gripe was with the hardware differences argument in particular, but it could well have included the software portion.

However, I do feel developers shouldn't have to compensate for user laziness, i.e. not updating. It certainly adds to the diversity - very much so, in fact - but in my view it should be ignored, because it's easily fixed on the user's end.

Also, third-party programs like Xfire, MSN Messenger, AVG, ZoneAlarm and the like should be in a group that's always tested for compatibility. I frankly can't imagine that wouldn't be the case.

@JohnnyDelRay

Thinking something and saying something can be two very different things, especially in these situations. I do agree that patches are there to fix things. Therefore they might as well be mandatory, were it not for the fact that patches also introduce new bugs from time to time (New Vegas and Ninja Gaiden 2 come to mind in particular).
 

Elsarild

New member
Oct 26, 2009
No, it isn't a valid argument at all.

You forget to take into account the massive leaps and bounds in the technology. To point out a few: PhysX, CUDA, new processor architectures, mechanical HDDs vs. solid-state drives. All of these have massively different impacts on games, and developers have to keep up with this standard, and then some people will be left behind and call the game bad (see Batman: Arkham Asylum, where if you didn't have an Nvidia card, no AA for you and no PhysX).

Also, games "back in the day" had bugs as well. A "back in the day" game for me is Fallout 1 or 2, both of which were riddled with bugs that ended up getting fixed by the community in fan-made patches.

Even further back, the code was simpler and smaller, and games were more restrictive.

And one last argument: back in the day, patches were the stuff of legend. If a company ever rolled one out, it would only reach a very small group of gamers - those who had internet - so a game had to be tested every which way before it was released, because mistakes on the scale some modern-day games ship with would have been grave, and would certainly have led a new company right into an early grave.

TL;DR: games are much more complex and use more technologies, and developers can take advantage of the fact that almost every platform can receive a patch if needed.
 

Pjotr84

New member
Oct 22, 2009
So you're actually saying it is a valid argument? As for the Arkham Asylum scenario, that is up to the developer and implemented by design. CPU technologies are either there or not there on the user's end, and are really the user's responsibility.

Before and in the early days of the internet there were of course games that were riddled with bugs, but it's gotten worse over the years, even though the variety in configurations, as we established, has dropped - at least in hardware.
 

oplinger

New member
Sep 2, 2010
Pjotr84 said:
@oplinger

You certainly bring up a valid point, which accounts for a lot of the differences between systems today. My gripe was with the hardware differences argument in particular, but it could well have included the software portion.

However, I do feel developers shouldn't have to compensate for user laziness, i.e. not updating. It certainly adds to the diversity - very much so, in fact - but in my view it should be ignored, because it's easily fixed on the user's end.

Also, third-party programs like Xfire, MSN Messenger, AVG, ZoneAlarm and the like should be in a group that's always tested for compatibility. I frankly can't imagine that wouldn't be the case.
They don't compensate for user laziness - hence the bugs. They usually tell people to update everything first during tech support. Or in some cases the update isn't out when they thought it would be, so they build a workaround into a patch. Or sometimes the update makes things worse... so they tell you to downgrade your drivers until a patch comes out.

As for third-party programs, they may test for those, but maybe not in every combination they can imagine. The pair I generally have a problem with is Steam and Xfire; together they cause many games to just... not run, crashing on startup (Dead Island did this to me, as did Spellforce). Hell, until recently Xfire was the leading cause of issues in games for me: Morrowind won't run with Xfire running, and FEAR 2 runs at 20 FPS with Xfire up. Both took some tweaking to get working properly. They test for the largest chunk of the community, but the smaller portion of the community is a lot louder >.> so it seems like they have no QA.

Hell, Cryostasis crashed left and right on ATI cards because it was a game relying heavily on PhysX running on the GPU rather than the CPU. ATI hadn't even heard of the game before...

Making games is a mess. >.>
 

Pjotr84

New member
Oct 22, 2009
Bugs caused by people not updating their OS and drivers I can't really call bugs. Of course, no one can prevent bugs caused by those updates except with patches.

As for Xfire, that program is step 1 in troubleshooting gaming problems when it comes to third-party programs. I'm glad I'm not using it. It's a shame the old GfWL forum is no longer up; you should have seen how many times Xfire came up there when fixing problems.
 

peruvianskys

New member
Jun 8, 2011
I guess I don't mind it; it's like someone is building you a guitar, and they say, "You can have it with three strings now and I'll fix it by next week, or you can just wait until two weeks from now." Unless the bugs render the game completely worthless, it's fine by me to have them in there as long as I have some assurance they'll be patched. Take Red Orchestra: I bought it knowing I wouldn't be able to fully enjoy it for about a month while stuff got worked out. As long as I can have some fun with it while they work out the bugs, and as long as I'm fairly sure it'll end up a good game after patching, I'm fine getting a dodgy copy a little earlier.

Developers definitely need to strike a balance, though; I don't want an alpha build a year early any more than I want a perfect one a year behind schedule. It's all about finding that mix of timely and stable, and I think Bethesda/BioWare do that just fine.
 

dantoddd

New member
Sep 18, 2009
Pjotr84 said:
Lately there have been several PC releases that were in quite a shoddy state at release (Rage, Deus Ex: HR, etc.). According to many reviews' comments and forum posts, this is due to the nigh-infinite number of combinations of components in computers.

Although this sort of reasoning is generally accepted, I beg to differ, for two reasons:

1. Back in the day games were more stable. Sure, they may not have been as complex, but the diversity in systems was likely equal to the present situation.

2. The variety in parts from one PC to the next is not that great. Since the hard drive, RAM and motherboard don't really matter in this respect, the diversity would be among the CPU, graphics card and sound card. Nowadays everyone has either an AMD or Intel processor and an AMD or Nvidia graphics card. Although there are a lot of different processors, the base architecture all boils down to the same thing. The fact that Nvidia and AMD release just one driver for all their current cards must mean the same goes for graphics cards. How the situation is for sound cards nowadays I don't really know, so I'm leaving that in as a possible cause of our problem.

What's your take on this? Is it just an excuse or is it a valid argument?
To be fair, all the Rage problems might have something to do with them going for OpenGL over DirectX.
 

Souplex

Souplex Killsplosion Awesomegasm
Jul 29, 2008
There are two main reasons for this:
The internet age: developers can patch games at their leisure, but patching requires man-hours, man-hours require pay, and pay requires a budget. Pushing the game out early means you make your profits earlier, and the players function as beta testers.

The state of PC gaming in general: there stopped being much money in the PC versions of games after '05. Blame it on piracy, PC players shifting to consoles, or some combination of the two, but the PC version of a game is usually not the version the developers put most of their effort into. That, combined with the above, means that PC games are released in a buggy, unfinished state.

The one real reason: Capitalism. It has its goods and its bads. Deal with it.
 

Rblade

New member
Mar 1, 2010
PC developers should take a page out of Blizzard's book. Think what you want about the company and their games, but they're a clear example that if you make sure you really finish your game AND make good/original games to begin with, you can be successful even though you push your deadlines back for months or even years.
 

Elsarild

New member
Oct 26, 2009
Pjotr84 said:
So you're actually saying it is a valid argument? As for the Arkham Asylum scenario, that is up to the developer and implemented by design. CPU technologies are either there or not there on the user's end, and are really the user's responsibility.

Before and in the early days of the internet there were of course games that were riddled with bugs, but it's gotten worse over the years, even though the variety in configurations, as we established, has dropped - at least in hardware.
No, it really hasn't gotten worse.

Fallout 2 shipped with far over 1,000 bugs, and that's even after the first official patch hit; the latest fan-made patch fixed 800+ additional bugs.

The bugs, like the code, were just simpler and less obvious. Some of Fallout 2's bugs were very specific, like leaving an area from a particular exit grid while wearing the wrong armor. And back in the day, when gaming in general wasn't so prominent, these would often go unnoticed for a long time, or simply didn't merit a fix.

Today we have far fewer bugs. Some may be more destructive, but the sheer number of bugs has fallen a lot, partly because of alpha/beta tests, either in-house or when the company hires someone like Combat Testing to test the games for them.

Over the last two decades gaming has jumped from basement dwelling to children's entertainment to spanning every nation on earth and every age group. With that comes more money, and now it's actually worth it for a studio to fix problems rather than just leave them, the way so many games back in the day ended up.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
Lilani said:
I think you're forgetting one HUGE difference between the PC games of the past and the PC games of today: today, games can be patched and updated long after release. And I mean LOOONG after release. Before digital distribution and downloadable updates, PC games were the same as console games in that whatever product was produced was the final thing.
We've had downloadable updates since the '90s, though. I remember having to sit through them on a crappy 40-something-Kbps connection back in the day.

I don't think this is true at all. Back "in the day," I played tons of broken games on both console and PC, before and after patching became possible. The only difference seems to be that a broken game can now become good, not that broken games exist. I'm not even sure they exist in larger numbers; I think the hype and obsession over "AAA" titles has led the gaming community to expect greatness just because there's a lot of money going into a game and a lot of hype orbiting it.
 

JesterRaiin

New member
Apr 14, 2009
Pjotr84 said:
1. Back in the day games were more stable. Sure, they may not have been as complex, but the diversity in systems was likely equal to the present situation.
No. Bugs, glitches and patches are pretty ancient things. They evolved, but they were always there.

Pjotr84 said:
2. The variety in parts from one PC to the next is not that great. Since the hard drive, RAM and motherboard don't really matter in this respect, the diversity would be among the CPU, graphics card and sound card. Nowadays everyone has either an AMD or Intel processor and an AMD or Nvidia graphics card. Although there are a lot of different processors, the base architecture all boils down to the same thing. The fact that Nvidia and AMD release just one driver for all their current cards must mean the same goes for graphics cards. How the situation is for sound cards nowadays I don't really know, so I'm leaving that in as a possible cause of our problem.
Again: no. There are many parts, many models, many operating systems, many drivers, many users, many applications. Honestly, I'm really shocked that everything works as well as it does, since one flawed application or faulty driver is capable of destabilizing the whole operating system.

Pjotr84 said:
What's your take on this? Is it just an excuse or is it a valid argument?
The details you gave don't make a good argument, in my humble opinion. However, this...

Pjotr84 said:
Lately there have been several PC releases that were in quite a shoddy state at release (Rage, Deus Ex: HR, etc.). According to many reviews' comments and forum posts, this is due to the nigh-infinite number of combinations of components in computers.
...is a very good observation indeed. It seems that game developers aren't aware of that fact, but they should be, and they should take every precaution before releasing their products.

For example: the Rage case. Invalid drivers? So what drivers did they use? How did this game pass internal tests prior to release? What machines were the testers using? And how the f*k is it possible for other, similar games to work on those supposedly "faulty" drivers?
 

oplinger

New member
Sep 2, 2010
JesterRaiin said:
For example: the Rage case. Invalid drivers? So what drivers did they use? How did this game pass internal tests prior to release? What machines were the testers using? And how the f*k is it possible for other, similar games to work on those supposedly "faulty" drivers?
Short answer? OpenGL.

Long answer? They were targeting drivers expected in the future (OpenGL 4.2), which in some cases never arrive, because very few games on Windows machines use OpenGL. So the game releases, and AMD and Nvidia are like "...what?" The beta drivers have some support for OpenGL 4.2, but not full support, and the newest drivers with full support aren't here yet.

...or something similar to that.
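
For illustration, a minimal sketch of the startup check that tells you what the installed driver actually delivers before you lean on 4.2-level features. glGetIntegerv and the version enums are standard OpenGL 3.0+ (and this assumes a context is already current); the fallback is hypothetical:

    // Minimal sketch: query the OpenGL version the driver really provides.
    // Assumes a GL context is already current on this thread.
    #include <GL/gl.h>

    #ifndef GL_MAJOR_VERSION           // older gl.h headers may lack these enums
    #define GL_MAJOR_VERSION 0x821B
    #define GL_MINOR_VERSION 0x821C
    #endif

    bool driver_is_at_least(int want_major, int want_minor) {
        GLint major = 0, minor = 0;
        // On pre-3.0 drivers these queries fail and leave major at 0,
        // which correctly reads as "too old".
        glGetIntegerv(GL_MAJOR_VERSION, &major);
        glGetIntegerv(GL_MINOR_VERSION, &minor);
        return major > want_major || (major == want_major && minor >= want_minor);
    }

    // e.g. if (!driver_is_at_least(4, 2)) { /* fall back, or tell the user to update */ }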
 

Jimbo1212

New member
Aug 13, 2009
The only reason is half-baked ports from lazy devs who design the game for old hardware, and whose cheap improvements lead to major bugs.