Is anyone else very sceptical about 'body positivity'? As far as I can see, the reason many women hate their bodies is that we're abused and controlled for being female, and then blamed for the abuse.
Not controversial that you'd want to make peace with your feelings about your body btw, and I understand why women are eager to put down that burden. But in the face of oppression, 'positivity' seems irrelevant and shifts responsibility back onto women.
I also resent the way it centres women's value on beauty and sexuality. There seem to be a lot of parallels with sex positivity. Hardly revolutionary.