Really? I've kind of seen the opposite. Plenty of movies portray things like evolution, lifestyles that intensely oppose religion, and the nonexistence of God in a positive light. I'd even go so far as to say the majority of media these days simply assumes its audience is not religious. Then, in more direct response to your question, there are films like Paul.
But when a Chronicles of Narnia film comes out, people go crazy because it was written by a Christian who wrote the story to closely parallel Christian values and biblical stories (even though the story stays entirely allegorical and never once brings those concepts into real life). Any movies made with distinctly Christian themes (not "this was a time period when people believed in God," but films actually about Christian values and ideas) are relegated to little Christian retailers and very, very rarely see theaters. And of course, Christianity being the most common religion in the US, there's even less representation of any other religion.
Really, that point I made about most media assuming the atheism of its audience speaks volumes. Most things I see simply treat everything as though there is no god; it's not that they're about how there's no god, they simply assume their audience already knows that. And that's a lot more indicative of an atheistic society than a bunch of movies directly about atheism would be.
EDIT: Admittedly, most explicitly Christian films suck, but I think that's largely because of the lack of a mainstream audience (and thus budget). Part of it, too, is that Christian aesthetics over the last century or so have become plagued with the idea that for art to be Christian it has to be about God, salvation, and the happiness they bring. That's unfortunate, since it's simply not true, and it's holding Christian art back in some areas, though novelists like Ted Dekker and bands like Red overcome it extremely well.