I have been thinking about this for some time. I have often read that women like to see women doctors; for some this is tied in with religion and culture, for others perhaps a history of abuse, or simply a reasonable expectation that only a woman would really understand the inner workings of women, whether the issue is gynaecological or emotional/psychological.
However, to date I have always had problems with women doctors.
I have come to the conclusion that some women who lack power, authority or autonomy under male domination seek to impose authority over other women. In my experience this happens when there is a perceived weakness, such as illness.
So I wonder: do some women try to empower themselves by oppressing other women because they feel so helpless themselves?