I should start by saying I'm not sure if this is even a feminist issue, but I'm feeling uncomfortable and I'm hoping someone will be able to help me order my thoughts.
I was telling a friend from the US that a friend from the UK was considering a vaginoplasty, as she feels that her vulva doesn't look 'right'. My US friend pointed out that if there was something wrong with it, surely a doctor would have said something. I told her that my friend hadn't seen a doctor about it yet. At which point she told me that she's had many pelvic exams as part of routine physicals (I should add, she's 19, so I assume these started when she was a teen?).
I didn't want to ask her much about this, in case I was being ridiculous and prudish, but it made me really uncomfortable.
Now, I understand that doctors see so many vulvas and vaginas that there's no need to be embarrassed. And I can confirm that I'm the quickest to get mine out for a doctor if there's a problem. But that's the key issue. I don't feel right about teenage girls being expected to allow doctors to poke around their healthy genitals 'just to check'. Something about it really doesn't sit right with me.
Especially as, in the US, you're often required to have a doctor sign off on your physical in order to get health insurance. So essentially, no matter how uncomfortable a girl feels, she has to agree to the examination in order to get 'signed off' and given insurance. I can just see this leading to a girl being all but forced into an intimate examination she doesn't want by her parents and doctor. (This is the thought that bothers me most: the possibility of a girl being told not to be silly, laughed at for being shy, and essentially feeling like she has to agree - it seems a little too much like rape, in my opinion.)
I would really like to hear other people's opinions on this as I can't decide if I'm being unreasonable to feel so uncomfortable. After all, I'd have no problem with a doctor checking my eyes, and it is just another body part. Am I being ridiculous? Is this because I've been socially programmed to find my body shameful and secret?
Now, to try and frame this as a feminist issue: I can't imagine many teenage boys being expected to let a doctor inspect their genitals for no particular reason. Are women not trusted to know their own bodies?
I realise that I've rambled quite a lot, but I'm not actually sure what I'm even asking. I'd be interested in hearing other people's thoughts on the issue.