Hi
I'm a frequent lurker on these pages and have learnt a lot from what I've read, so I thought this would be a good place to ask what people's thoughts are on how women and women's health issues are seen in medicine. Apologies if I seem poorly informed; I've only recently begun really thinking about feminist issues.
I'm a current medical student, and a lot of things in my education so far have really annoyed me and don't seem right. The default for medical examples and anatomy is almost always male, and the characterisation of some problems as 'women only' seems odd when there is comparatively so little emphasis on 'men's health', as though men's health is the default and women's health problems are a whole other category. Even diagrams and models of the female reproductive system seem odd compared to the male versions: why is the vagina always depicted as gaping open, whereas diagrams of penises are always flaccid?
I don't know if I'm being over-sensitive, but I know the medical profession has not always been known as a bastion of feminism, so I just wanted to gauge other people's opinions. What are the main ways in which medicine and medical education are sexist? And why? Sorry for the length of the post; I hope it's not too rambly.