Why do there have to be written laws on the books to protect people from being discriminated against if they should have those rights anyway?
Ask yourself this question: Why should it be necessary in the United States, a developed Western nation, a nation where people are educated and claim to be good, caring people … why should it be necessary for states to legislate fairness? Why should a state have to make laws that say, “you cannot refuse to rent a house to, or to employ, a person simply because they are African-American, or because they are gay”? Why do we have to do that? Aren’t some things just common sense? Why do you care if the person working in your factory is gay or straight? They aren’t hitting on you, for Pete’s sake!!!
Sorry for the rant, but this one galls me no end. We learned from the Civil Rights era of the 1950s–1960s, didn’t we? Didn’t we learn that all different sorts of people make the world more interesting? …