US foreign policy has changed tremendously since the nation’s emergence as a world power in the 1890s. During the 1890s the United States began a policy of imperialism, acquiring many new territories through war and purchase. This policy of imperialism ended after Woodrow Wilson was elected president in 1912. Around the time imperialism ended, World War I began; Wilson chose to keep America out of the war and declared a policy of neutrality. However, many policies were passed that unintentionally favored Britain and eventually drew America into the war. After WWI the United States entered a period of isolationism, but this policy was abandoned when WWII started and the U.S. joined the fighting. That war led to a Cold War between the U.S. and the U.S.S.R., which competed in an arms race to determine which nation was the dominant world power. On the whole, U.S. foreign policy has been good for the world: it has helped establish and maintain world peace. But these policies have not been as good for the U.S. itself; many of them had negative long-term effects on the American economy, though they did help unite the country as it rose to world power.