Sex seems to be the centre of the universe these days. It is used to sell things all the time. People want to be sexually attractive. It is assumed sex is an essential part of any relationship. People talk as if they have a 'right' to sex, to frequent, 'good' sex, however you define that. Maybe men feel the 'entitlement' more, but I think women feel it too.
Does anyone else find it a bit sad? It just feels like the world is so focused on individual pleasure these days.
AIBU?
to think we make sex out to be more important than it is
289 replies
purpleangel17 · 17/11/2017 13:05