Nothing impacts society like war. War can be viewed as noble and just, as cruel and inhuman, or as anything in between, and it affects everyone in society, whether they are fighting in a foreign country or waiting at home for a loved one to return. War is an inseparable part of civilization, found in every chapter of human history; it is the culmination of the basic survival instinct when provoked. Just as the techniques of battle have evolved, so has society's view of war. Today the act of war has become almost shameful, whereas in earlier eras war was glorified and heroic. American society's view of war has changed as well; our history, even as a young country, has seen a great deal of conflict.
We've come a long way since the early styles of warfare seen in the American Revolution and the Civil War. World War I was the first war in which the United States proved to the world that it was a formidable power. America, a nation made up almost entirely of immigrants, sent an armed force to Europe to fight against the Germans. The war brought a divided nation together as one; it represented pride and unity. The World War I victory portrayed the United States as a world superpower, and Americans living in the early 1900s saw the war as a "just cause" and supported our actions abroad. World War II came just twenty years later. This time it was a single attack, the Japanese assault on Pearl Harbor, that drew us into the war. The country, furious at the cowardly assault, rallied and stood behind the President's decision to send troops to the South Pacific and Europe. The American public strongly supported the war, and the country went to work manufacturing equipment, offsetting the economic failure of the Depression that had gripped the United States before the war. Engaging in the war in Vietnam brought a whole different set of "American views" to the topic of war. This time the country did not