Why is it so hard to shift the idea that sex is dirty and bad? I know it's not a universal feeling, but there's definitely still an element of it throughout society. The Victorians were famously anxious about sex and chastity, and religion has long taught that it's for procreation within marriage. But the Victorian era is long gone, and much of society no longer looks to gods and religious institutions for guidance, so why is this idea still so hard to shift for some?
I've left my post a bit bare (pun not initially intended, but now absolutely meant) so that people can run in any direction they like. I'm really interested to hear your opinions!