We need to define "imperialist" first. If we mean it in the hegemonic sense, then yes and no. Historically speaking, the US has always seen anything near its borders as somewhat of a protectorate.
If we're speaking in a social/cultural/economic sense, then yes, and in fact it is quite well documented from both sides (US and non-US) that they have "imperialist" views/tendencies. For example, the Coca-Cola campaign of the late 1950s was promoted first under Truman and then Eisenhower as an example of how the American lifestyle was superior to the communist one. Likewise, the pictures of the new suburbs of the 50s, and American cinema itself, were all used as cultural imperialist tools. However, this DOES make sense considering they were in a time of (cold) war. There are more examples in the economic and social spheres as well. I don't have the time to list them all, but they are there.
The question instead should be: is the United States really that important, or is something else at work? Hell, are we giving that state to the south of us too much credit? I would say yes. Right now we live in an age of globalization, and the question should be: who controls globalization? Is it something inherently organic? Or is it led by a single country (or group of countries) and then appropriated by other countries until a brand new entity is created? Because right now, the United States is anything but the powerhouse. I believe we have moved entirely away from a two-power structure, or even a multi-power structure. We've moved into an age where we're all completely dependent upon one another (i.e. if Japan suffers an economic setback (see: mid 90s), we all suffer; if Great Britain suffers an economic setback, we all suffer; and so on and so forth).
And the U.S. has acted as any empire has in the past thousands of years. Indeed it has blood on its hands (far too much for my liking); however, the U.S. has not, in my opinion, acted unilaterally on any occasion. Rather, its actions were taken to preserve the globalization movement, which really began in the late 1870s as an idea supported by Western Europe and North America. The version we see today has been shaped not only by the U.S. but by Canada, Switzerland, Germany, France, etc.
This talk of "living like France" or the "war in Iraq" is just that: silly. This isn't simply the United States acting on its own; it is the globalization movement itself which is DEMANDING it. And if anyone still believes that the United States is THE world power, then they are living a dream. Sorry.
I hope this made sense.