The answer is not really all that cut and dried.
Most of the problems a PC user faces are self-inflicted and can be prevented or easily overcome with a bit of knowledge. Most of the problems a console user faces require advanced degrees in electrical engineering to overcome (or a call to the manufacturer if you're in a hurry).
Really, the biggest issues with PCs and games are hardware compatibility and cost. There were many times, especially as separate video cards became commonplace, when I would have to struggle for hours if not days to get a game to work correctly (most notably, Unreal).

Cost seems to be the biggest issue, though. Sure, console games might not look quite as good as PC games, but the simple fact that you can have a remarkably similar experience on a console is enough to make many former PC gamers throw in their lot with the unwashed masses that call consoles home. Why spend thousands of dollars on a computer to play the newest, shiniest FPS when you're almost certain to find the same game on a console at a tiny fraction of the cost?

And the worst part is, the trend seems to be worsening. A decade ago, a standard home computer could be expected to play most games produced around the same time. Now you can dump cash into a computer that's well suited to running MS Office and playing solitaire, yet woefully underpowered when you try to play modern games. Look at Crysis - if you wanted the graphics promised in all the literature (and let's be honest, graphics were the ONLY thing the game was really promising in its ads), you had to possess bleeding-edge hardware when the game was released. Even now, many months after its release, it still takes a substantial investment in hardware to run the game at its maximum settings.