From my experience, it's a mixed bag whether laptops and gaming really mix at all. If you have a table to set up at, that makes things easier for those of us who prefer using a mouse as opposed to a touchpad, and after that it's a matter of whether your laptop can actually run the game.
Case in point: I recently picked up Fable: The Lost Chapters and Painkiller: Triple Dose. Both installed with ease, but when I went to play them, problems popped up all over the place, most of them involving my chipset (from what I can tell).
When I tried running Painkiller, it gave me an error message about DirectX. I tried to figure out what the hell it was on about (since I'm fairly sure I have the most recent DirectX update for my system) through the amazing resource that is Google, and found that the people who had the same problem usually had a video card that didn't support the game. As you can imagine, I was pissed. This wasn't the first time it had happened to me, either. You'd think I'd get the point and stop buying PC games made after 2005, but I haven't.
The same thing happened when I tried installing and playing Oblivion some time ago. I finally caved and bought the 360 version.
Fable, at least, started up without any real problem. I actually got in about 20 minutes of play before the game slowed down and then stopped altogether; I'd be left staring at the same screen for the better part of two minutes (maybe more) before it moved another frame.
On the other hand, I have a friend with a more modern computer (which is a rather weird way of putting it, since she got hers only a year after I bought mine, and through connections at that), and it runs Oblivion flawlessly, as well as Fable. So for her, gaming and laptops mix pretty well, but for me, this $700 turd is only good for surfing the web and listening to music. Oh, and any Blizzard game from earlier this decade or the late '90s.
Guess that's what I get for going with the relatively inexpensive model, right?