I am so tired of hearing things like, "What is this world coming to?" and "Man, this world gets worse and worse every day."
A little perspective here...
When Columbus landed in the Americas in the 15th century, the natives were considered "dogs." Not only was it okay to kill them, it was almost fashionable to dismember them, take them as slaves, or kill their children in front of them.
It wasn't until the 1870s that states even began to create laws about domestic violence.
And up until 2005, it was legal in the state of Maine for my boss to fire me for being gay.