Do you think the United States is/has been imperialistic?
im·pe·ri·al·ism
noun \im-ˈpir-ē-ə-ˌli-zəm\
The policy, practice, or advocacy of extending the power and dominion of a nation, especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas; broadly: the extension or imposition of power, authority, or influence.
That is the Merriam-Webster definition of imperialism, and before I can discuss whether the United States is imperialistic, I have to know what imperialism means.
Next, let's evaluate a pro-con list of American actions that may be viewed as imperialistic.
PRO Imperialism:
- The Annexation of Hawaii- Under the leadership of Sanford Dole, the US took Hawaii by force because we didn't want the Queen to enforce a high sugar tariff when she tried to put in place a stronger monarchy.
- Manifest Destiny- The justification for taking over an entire section of a continent for the gain of the United States. Like the "White Man's Burden," it is simply a phrase used to justify acts of imperialism.
- Superpower Status- Arguably, the United States remains the only superpower in the world today. The term superpower, as defined by Andrei Gromyko, is "a country that has a say in every corner of the globe and without whose say nothing truly substantial can be achieved in any such corner." The United States certainly fits that description.
CON Imperialism:
- Democracy- The United States interferes in other countries' business when it could just stay out of it, but why miss the opportunity to spread democracy to other nations?
- Self-preservation- The United States isn't trying to go out and conquer lands; it is simply trying to protect its own country. When the United States sent troops to the Middle East, it did so as a precautionary move to prevent further attacks on United States citizens.
- World War II- At the end of the war, many countries were in shambles from the devastation the war had brought, but the United States came out on top. We had the opportunity to take advantage of many countries and gain land, but we chose instead to provide assistance and help those countries build themselves back up.
With all of that information in mind, is the United States imperialistic? I don't believe that we are today. At the beginning of our development, yes, we could be perceived as imperialistic: we conquered all of the land that is today the United States of America, it hasn't always been under our control, and we took most of it by force. With that in mind, are we still taking part in the same actions today? No, we aren't. We are trying to spread our ideas and beliefs, maybe more than some countries would like, but we aren't trying to take over the world.