There's an age-old argument over choice of platform that will never subside. A common barrier to entry for PC gaming is the need to upgrade — the need for better hardware.
I consider myself primarily a PC gamer, but I dabble in consoles for the exclusives and a few other reasons. My PC runs fairly old hardware (AMD Athlon X2 6000, ATi HD4890, 4GB DDR2 800MHz RAM), yet it can run pretty much everything. It's a common misconception that you need to spend loads on a computer and upgrade it every year to enjoy PC gaming.
I'm not aiming to start an argument over that point, though. What I'd like to address are the PC games that genuinely do force players to buy the latest and best hardware — the games that are poorly ported and poorly optimized. Is it the fault of PC gaming as a medium that these games won't run on reasonable hardware? Or is it the fault of the developers?
That's the main advantage of consoles: one standardised piece of hardware that will run everything bearing the shared logo. Would PC gaming perhaps benefit from some form of standardisation?