Not sure if this is the right category but wanted people's opinions.
I've noticed I feel uncomfortable watching shows or films that show just female nudity.
It really bothers me when it feels like it's done just for the sake of it, and the male cast members aren't equally nude, or aren't nude at all. It doesn't bother me when it makes sense for the story or is genuinely artistic.
I've partly put it down to being pregnant right now and not feeling attractive, but I also think a lot of it is feeling it's not fair that it's so often just the women. With male actors you might see a bum, but I don't feel that's the same as a woman having her breasts out. Or do you think it is?
I read an article the other day, I think it was in the Metro, about Emilia Clarke and how she really struggled with and felt pressured into nude scenes in Game of Thrones, and now I wonder if a lot of young actresses feel like that at times.
I'm still young, in my 20s, but as I get older I've noticed just how unequal stuff like this is in the media and in TV and film.
I'm not good at articulating my point of view. Do I need to work on my self-esteem, or am I right to feel these things and not alone in feeling them?