The events in other countries and the wars we have fought in lands beyond our borders prove one thing: we are not united, even in wars that other Americans believed were a necessary evil to undertake in the interest of our democratic ideals and supremacy. The killing of innocent civilians in Afghanistan, including women and children, proved that bloody confrontations bring anguish and anger... then hate. I never believed in war, because war destroys rather than builds (should I make an exception for World War II?). My country was literally decimated when the Japanese attacked our capital cities and destroyed 300 years of history. Millions of my people were killed, either as a direct result of the war or from its consequences, like famine and disease.
I have traveled to many parts of the world, and many people told me that they do not like Americans because they are arrogant (they assumed I was not an American because I am not "white"). I became an instant apologist and defender of the Americans I know, my friends, who are humble and hardworking. Indeed, I murmured to myself, "America is really divided": the good Americans on one side, and on the other the "ugly" picture of those Americans who believe that their arrogance and pride will change the world. What do you think?