The History of The American Empire

The term empire may not be a technically accurate way of describing the United States of America. Because the stated goal of U.S. foreign policy is to protect the country's own interests and territory, the nation is, by that definition, not an empire. However, the United States' influence on the rest of the world is undeniable. Throughout the twentieth century, the country dominated the worlds of music, film, and finance, while also making headway in science and technology. But as with all things, what goes up must come down. Is the United States' empire headed for a crash? This video explores the rise and dominance of this world power.
