Ask the average man on the street what the most important country in the history of American foreign affairs is, and you will probably get one of a few familiar answers. It could be Britain, from which America inherited her economic, cultural, and political institutions. It could be the Soviet Union, whose Cold War rivalry shaped American militarism for nearly half a century. It could be Israel, given its influence over post-Cold War geopolitics. These are all good answers. But the most important country in American foreign affairs rarely gets discussed today, and it sits just 700 miles off the Florida coast. Foreign policy often shapes domestic policy, and that is why the most important country in the history of American foreign relations is Haiti.