"Imperial power" is a bit of a vague label. It tends to mean several things at once, so let's unpack it.
In the sense of "this country was built on conquest by force," the "imperial power" part of America actually predates the United States proper. The territories that would become the United States were imperial colonies, established by the great European empires of the 17th and 18th centuries. Much of the US Constitution, and American governance generally, traces back to England, the seat of history's largest imperial power, while vital aspects of American culture come from other imperial powers, such as France and Spain. American culture draws on many sources beyond the old empires, but those empires were key influences on what the United States became.
In the sense of "this country treats conquest by force as a fundamental component of its culture, economy and politics," the United States has always been an imperial power. Even early in American history, the War of 1812 included an invasion and attempted occupation of British Canada (see reference), and the Mexican-American War was fought to seize vast territories from Mexico, which had won its independence from Spain in 1821. Above all, from the moment of our founding until well into the modern age, United States history has been defined by an ongoing conquest by force of territories occupied by American Indians. The key concept is "manifest destiny," the belief that European-descended Christians had a God-given right, indeed a God-given duty, to conquer the North American continent, and that the cultures that already existed there were without value. Similar ideas served as justification for other imperial powers, most famously in Britain with what Kipling called "the white man's burden."
If you're interested in when the system that exists now, within the borders that exist now, started conquering things outside those borders by force, the United States became an imperial power in the 1890s. That unsurprisingly coincides with the end of organized military conflict between the US Army and American Indians. In short, the United States started conquering things outside its borders—that is, the borders the US thought it was entitled to, according to manifest destiny—as soon as it was done conquering things inside them. Most historians date the beginning of this phase of American imperialism to the Spanish-American War of 1898. That war made the United States an "overseas empire" for the first time, bringing it Guam, the Philippines and Puerto Rico (see reference); Hawaii, though not a prize of that war, was annexed the same year. Guam and Puerto Rico remain American territories today, while the Philippines became independent in 1946 and Hawaii became a state in 1959.
That said, it's important to remember that while the Spanish-American War resulted in the first overseas territories claimed by the United States, America as it exists now was built on conquest by military force. The United States has been an imperial power since its founding.