Sex is a natural, biological thing. It's also a beautiful thing. I think American culture is completely backward, in that we demonize something we all do (sex) and glorify something no one wants to happen to them or their family (violence). Why? What is wrong with sex and the human body? Nothing!

Here's how I know we have it backwards: Would you rather have your child watch, in person, someone beaten/shot/stabbed to death, or two people having sex? If you answered the first, something is wrong with you.
dgrantmhs
36-40, M
Mar 16, 2015