Everyone is always deriding the US and focusing on its problems, all while ignoring the good things!
Compared to much of the world it's a pretty decent place!
Let me list off some good things about the US, just off the top of my head:
-Labor Unions
-Insurance
-Regulations on Corporations (I know they aren't perfect, but things could be SOOOOO much worse, it's not even funny)
-Right to vote
-Right to own firearms
-Right to protest
-Right to criticize the government/have aberrant opinions
-Right to own land
-There's plentiful food, water, and luxuries!
If you can think of anything else, feel free to add. Opinions/debate are (obviously) welcome, but this isn't an America vs. The World thread, so don't turn it into one.