The United States declared war on
the German Empire on April 6, 1917, during World War I. The U.S. was an
independent power and did not officially join the Allies. It closely cooperated
with them militarily but acted alone in diplomacy. The U.S. made its major
contributions in terms of supplies, raw materials, and money, starting in 1917.
World War I was one of the most important events for both the United States and the
world, and the country changed sharply after the war. I found a website that gives
more information about the United States in World War I, along with a video that
explains what the U.S. did during the war and afterward.