I believe that any country that has ever colonized, occupied, or taken over another country is an imperialistic country. After America passed the Monroe Doctrine, the nation's levels of nationalism, patriotism, and greed increased. The document stated that European countries could no longer colonize areas in the Western Hemisphere. This proved that Americans were ready for change and a new level of power.

America is known for invading territory and claiming other people's land as its own. America did this very thing in the Mexican-American War, seizing an enormous amount of land from Mexico that stretched from Texas to California. America's goal was to seize land and develop its newly claimed territory; such actions reflected the power that America had a reputation for having. Once America owned the former Mexican territory, it could expand no further west on the continent, since California was the westernmost point. This sparked a sense of greed in the Americans, which urged them to occupy Hawaii. In the 1890s the United States established a military presence there and occupied the islands for more than thirty years. Americans stripped Hawaiians of their culture and pressured them to convert to Christianity. These events, and others like them, have made America a world power.

It is safe to say that America is an imperialistic country; it always has been, and it always will be.