I have been sitting through these console vs computer (including the PC) platform discussions since the early '80s. So let's get a couple of things straight.
Console gaming has generally always been a bigger market than computer gaming. Sorry, but it's true. But computer gaming platforms have almost always had a hardware advantage, simply because the hardware was more expensive. So what you used to get were games unique to computers: not exclusives, but unique, because the consoles of the time were not capable of running them. Console owners had to wait for the next generation of consoles to play those titles. But devs did not mind this, as they were always looking to push the envelope.
And this is what has really changed, which brings us back to the OP's question of what needs to be better on the PC. In the past, companies would make games focused purely on what was possible on high-end PCs, which meant you might get titles the consoles of the time were not capable of running. That does not happen now, as devs will only make something that is also capable of running on consoles. That is the real change.
A common point from console owners in these discussions is: 'Why would I game on PC then, when the experience, as far as graphics are concerned, is hardly any different to that on a console?' And they are right. Look at most multi-platform games and there is not a huge difference in, say, the graphics. But this is not for the reason console owners believe. It is not because consoles are so close to PCs now, but because PC games have been coded to run on consoles as well.
Even PC-unique titles like the much-vaunted Witcher 2 are not much better, as they have been designed with one eye on potentially being ported to consoles later, hence it only using DX9.
This is the thing that needs to improve the most: devs allowing the PC to play to its strengths. But devs at the moment are very console-focused, as they have worked out they can basically put out the same games each year without much innovation and still be guaranteed massive sales. Who can blame them when modern gamers are not as discerning any more?