Effects in the U.S.

Before WWI, America had been isolationist, a stance it had maintained since George Washington was president. This was the first war in which the United States became involved overseas.

Back at home, with the men away at war, many job opportunities were left open for women to fill. Women took the chance and began fighting for equal rights in America. With industries booming, women were able to spend less time at home and more time in the workforce. The picture of the American family changed as women's roles changed.

Throughout the war, technology advanced as well, driven in part by the new weapons being used. This new technology also played a role in women working. New machines like the washing machine could now handle household tasks that women had traditionally been responsible for.

WWI also had an effect on education. The government saw value in teaching nationalism and patriotism, and programs focused on these ideas were introduced in public schools across the country.

After the war, America recognized the importance of international peace and understanding. The war had caused heavy casualties, and it was something America did not want to face again, making international peace important to maintain.

In the end, many of the children who grew up during the war were exposed to war propaganda and later went on to serve in WWII. Many Americans after the war held negative feelings toward Germany for the role it played in the conflict. Women in the United States were also granted suffrage after the war, as their role at home changed.