America? As in the two continents (North and South) comprising several nations, one of which is the United States thereof?
Sorry, pet peeve of mine.
It is funny sometimes: the rest of the world thinks that the citizens of the United States have any say over the actions of the United States government. Oh, we choose who makes the "decisions", but that doesn't necessarily mean they are the ones who ultimately make them.
A brief, very loose description of American "outside" history:
For about a century and a half, the United States tried to stay out of other countries' business. Sure, it would fight for more land or amongst itself, but it generally didn't go beyond things that directly affected the nation at large. Then the Great Depression hit and things started to suck for the United States. Things started to suck everywhere. One European nation decided that the best way to get out of the depression was to go to war with everyone else, so it grabbed some friends and started on its merry way. It worked. So much so that the other nations of Europe started to realize they couldn't stop it. They pleaded with the States to help, but the States had problems of their own. Finally, just as it was almost over for Europe, the States thought, "hey, maybe this war stuff will help fix our depression problem". It did. The United States helped the "good guys" beat the "bad guys" and everyone was happy. Except that first country, which found itself in a worse state than before, and eventually pretty much the same thing happened all over again.

The United States realized something: this war stuff really worked for them. Other nations figured out that the United States was pretty good at this war stuff and was willing to help out. Soon a couple of nations over in Asia were having troubles, then some in the Middle East. Sides kept switching, but the United States tended to win no matter what, because war was good for their economy, even if they were perpetually losing the actual wars. Then everyone pretty much started to get along, for the most part. The United States realized something else: they were nothing without war to fuel their economy. So they started picking fights, generally fights that were "worthy", but not exactly necessary. Then others started picking fights with them, and the United States was like "why not?". Now the US is just revisiting old stomping grounds that they didn't quite finish off properly the last time.
In short, the United States became the world police, mostly because the world asked for it. The problem is, the United States became dependent on it, and of late doesn't know what to do about it.
That's my biased opinion at half past midnight, when most of my history classes escape me.