U.S. Imperialism. Imperialism (definition): the policy by which stronger nations extend their economic, political, or military control over weaker territories.