One of the questions I had before is why so many Americans continue to hate and blame America. I think it comes from the post-modern disillusionment this country has been living in since World War II. In studying art history, I noticed that much of modern art became focused on themes of death, decay, and darkness, whereas Western art from the Renaissance onward focused on the eternal, the spiritual, and the beautiful. From what I've seen in art history, this preoccupation with death and decay seems to be a largely modern phenomenon. Then what really cemented the post-modern disillusionment, possibly, was the Vietnam War. It produced the liberal realists, who were pessimists, many of whom became liberal professors hiding out in academia. From the liberal realists, people then divided further into identity politics, seeing themselves as separate minority groups who blame society completely for their problems. Seeing the criminal as a victim of society, rather than blaming the bad behavior that got him or her into trouble, is another symptom of post-modern thinking.

I know I've thrown out a lot of theory, but this is my way of trying to understand why the times we live in don't make sense and why people continue to blame America. Life isn't perfect, and it never will be, but what makes this country great is that we believe a better future is possible.