Blaster395 said:
And by lazy, I mean lazy about optimising the game to run fast. Graphics haven't really improved in the last two years or so, but system requirements keep going up, and it's often because the game isn't optimised, has memory leaks, or has other bugs.
For example, Black Ops runs slower than Modern Warfare 1, even though they have roughly the same graphics.
Are they just getting lazy because they no longer have to bother with optimising, since PCs keep getting more powerful?
I know exactly what you mean, dude. I have a modest laptop, but I can run StarCraft 2 at a high framerate on nearly the lowest settings and it still looks amazing. I was also able to run Modern Warfare 1 and 2 on average settings with a good framerate, and they looked incredible too. However, World at War and Black Ops ran like ass even on the lowest settings.
See also GTA IV: crap on a stick. And Bully: for a game from 2006 (2008 on PC, and the graphics still look more like San Andreas than GTA IV), it lags terribly if I dare turn the shadows on. It's really annoying, but I'm making the best of it.
Graphics aren't that big a deal for me and I can live with low settings, but what really ticks me off is that console ports tend to offer very, very little graphics customization. Would it kill developers to add more settings to their games? I got Mass Effect 2, turned everything I could down to minimum, and it still ran really poorly. A few Google searches and config file edits later, the experience improved greatly. Why exactly couldn't they put those settings in the in-game options menu instead of forcing me to muck around in config files?
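For what it's worth, the edits involved are nothing exotic: most of these ports just read a plain INI file at startup. Here's a rough sketch of how you could flip a few settings with a short Python script instead of doing it by hand. The file path, section name, and key names below are placeholders I made up for illustration; the real ones depend on the game (Mass Effect 2, for instance, keeps its Unreal Engine INI files in its own Config folder), so check a tweak guide for the actual names before trying this.

    import configparser
    from pathlib import Path

    # Placeholder path: the real location depends on the game.
    ini_path = Path.home() / "Documents" / "SomeGame" / "GamerSettings.ini"

    # strict=False because some game INIs repeat keys, which the
    # default strict parser would reject.
    config = configparser.ConfigParser(strict=False)
    config.optionxform = str  # preserve the original key casing
    config.read(ini_path)

    # Hypothetical section and keys, shown only to illustrate the idea.
    section = "SystemSettings"
    if not config.has_section(section):
        config.add_section(section)
    config[section]["DynamicShadows"] = "False"  # shadows are usually the big framerate killer
    config[section]["MotionBlur"] = "False"
    config[section]["MaxAnisotropy"] = "4"

    with open(ini_path, "w") as f:
        # write key=value without spaces, closer to typical game INI style
        config.write(f, space_around_delimiters=False)

That's the whole trick: read a text file, change a few values, write it back. Which is exactly why it's so baffling that the same options don't just get a checkbox in the menu.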