IamLEAM1983 said:
Yeah, because the 360 had the minor selling point of being able to stand on its side. I've never been ballsy enough to try it, but one of my friends wrecked his Halo 2 disc because he forgot this one weakness. I'll admit Microsoft never specifically marketed the 360 as something you could tip onto its side while it was running, but they could have added that the console's disc drive was fragile. That wasn't specified.
...it's a disc drive. You generally don't move any electronic gadget with moving parts while it's turned on. (This is where the "nanny-state" jokes come from, by the way.)
Feel free to disagree, but I consider it the parent company's responsibility to clearly state what working conditions the hardware can and can't tolerate. I'm well aware that this one argument alone doesn't make a case for the overall unreliability of consoles, but consider the others that have been mentioned as well.
You don't move it while it's turned on, and your disc will never be damaged. It's a pretty cut-and-dried solution. I get what you're trying to say, but that's a really weak example. A set-top box is designed to be stationary. You wouldn't attach your little sister's tricycle to your dad's pickup truck, blast down the freeway, and then complain that the wheels on the trike got smashed up and that the company should have warned you not to do that, would you?
Would you be using this same argument if it was your DVD player? Or your TiVo box? No, you wouldn't. So why not drop this ridiculous point?
You say that console gamers never have to worry about switching hardware; that you don't even have to think about it. It's out of the box; it works. Fair enough, that's entirely true. However, there's a catch:
Yes, it does work out of the box, and no, there is no catch.
Once your console's shelf life has ended, you'll go buy another one. You'll completely change your system for the sake of supporting new releases. In what way is this different from making the personal choice to change my graphics card or my CPU as needed after the same span of time - or even earlier?
So, with simple math: the 360 came out in late 2005. It is now 2012. That's getting close to seven years. GTA V and Halo 4 are still slated for this generation of consoles but won't come out until next year, which means those games give the console at least another year of shelf life. So for the sake of argument, let's say the 360 will be finished by 2014. That means a machine with no hardware changes (for the purpose of video game compatibility) over the course of roughly nine years - a machine that was able to play the very first 360 game that ever came out - can still play the very last game released for it, nine years after its launch.
Can your eight- or nine-year-old GPU/CPU handle that test of time for its games? Yes, the graphical quality is different, and yes, you can drop the settings to keep things as smooth as you can. But do you really think that strategy will still work that many years on?
I'm not talking about hardware failure and RRODs. I'm talking about the simple fact that the hypothetical 360 I bought in 2005 will still be able to play Grand Theft Auto V when it comes out in 2013, and I will never need to think about upgrading its internals to do that.
That's all I'm asking you to admit.
See, that's another argument in favor of PC gaming: a form of scalability that isn't limited to five-year increments and all-or-nothing purchases.
The current console generation is well past 5 years, just saying.
The PC platform gives the bleeding-edge enthusiast more room to experience the latest releases at their very best.
I have one word for you:
niche. The majority of gamers are not concerned with cutting-edge graphics or lighting; they are concerned with the "fun factor", which is admittedly a vague term, but it exists, and graphics do not play a large part in it.
If you're more of a casual gamer like myself, this doesn't really matter. I've had the same graphics card for four years and I'm still able to crank out the latest titles at high resolutions. I'm likely to wait until the very end of my current rig's shelf life to change it entirely, which will admittedly cost me more than buying a new console.
Which is fine; my four-year-old 360 Elite (the one before the Slim) still cranks out the latest titles at their best as well (best being console best). The difference is that the thought of whether or not it will keep doing that (cranking out the latest games, that is) will never cross my mind.
However, this is where the argument of ergonomics comes in. I don't think anyone in this thread has mentioned the overall "feel" of a gamepad in their hands, or how long their hands and fingers can stand to grip that control implement. Personally, as I've got a mild case of cerebral palsy, I hate gamepads because my unused fingers tend to cramp around them. Assassin's Creed's notorious "claw" finger position, the one you take when you're free-running? That's left my right hand sore for days, no joke.
I feel bad for you, and I can't imagine what that's like, but I prefer analog control, and always will.
On the other hand, I can use WASD as much as anyone else. I'm really at my happiest with the mouse-and-keyboard combination, and not because I'm supposedly more precise than a console player of similar skill. Far from it. The more I can use all of my fingers, the longer I can game without feeling like it's a form of Chinese torture.
And again, I sympathize with you, but your situation is unique.
Another argument I'm not sure has been brought up is general accessibility, especially when compared to consoles. This will sound odd, I know, but PCs are somewhat more accessible in some respects, such as controller configuration and general moddability. If an element in any given game gives you trouble, it's far easier to mod or cheat your way past it on a computer than it is on a console. If you're worse off than me, physically speaking, and have a more limited range of movement, it's reasonably easy to remap most of a game's commands to an area that an entire hand can reach.
"Accessibility" here seems to mean something different to you than it does to me. I view it as how approachable a video game ecosystem is, and to me, the console ecosystem has always been warmer and more easy to approach than PC gaming.
Just a reminder, though, seeing as neither of us has checked this thread in a while: I'm not championing PC gaming. I'm not championing either side of the debate. I've got a PC and all three consoles, and I have ample experience as an end user with all of them. I'm biased in favor of PCs, however, largely because I've never had any problems that a reformat couldn't solve, whereas the consoles are just starting to develop a history of hardware failures that reminds me quite a bit of dead motherboards or busted hard drives.
Until I see it making daily news and becoming a serious problem, I view such incidents as nothing more than poor quality control and inevitable failures among many successes. I don't see such things making headlines or rattling the blogs (but I do see increasing scrutiny around Xbox Live, which I can imagine is irritating to be a victim of).
In these cases, the only available solution is to ship the thing off and hope for a replacement or refund. If my computer dies on me, I've got the satisfaction of knowing that the cause can and will be isolated and repaired, leaving the rest of my system untouched.
But if your hard drive goes, it's hardly as though "the rest of your system" has been untouched.
If my console craps out on me, I'll get an entirely new setup with its own potential little foibles.
That's a pretty pessimistic way of looking at it, and as I've said before, I'm not convinced that console hardware failures are as bad as you make them out to be.