There was a thread about this the other day - not in a feminist context.
Someone on there said she hid the fact that she was sitting on a rubber ring after the birth so as not to put anyone off.
I think this is a real problem - women should know what giving birth involves, for a myriad of reasons - not least so they don't wind up in an emergency with something happening to them that they've never heard of.
The poster made that "no one would have kids" comment, which I find so annoying - what other medical procedure would you go through without anyone giving you full information?!
This thread is actually making me think birth should be taught in sex education as well. It's shocking that people have so little idea of what happens.
It's also a feminist issue in the sense that so many people think women are just brood mares and would be horrified if someone decided to have a caesarean or adopt children, etc. What is wrong with letting women have access to information and make their decisions accordingly? It doesn't suit the idea of a "uterus in high heels", as Gabby said in Desperate Housewives.