I am on maternity leave at the moment and will be returning part time. I have just found out that the other part of the job has been given to a man, and I have had a really negative reaction to it which I am trying to make sense of.
The dept I work in is a new one that I set up, and even though I have yet to even meet this person, I just feel that because it is a man he will try to take over and dominate. I don't feel that I will be able to connect with him in the same way as I would with a woman. I also think he will be taken more seriously (I have experienced some sexism in this job). I am not happy that this is the way I feel; it's unfair, I haven't even met him, and what on earth does that say about me? I didn't expect that I would have this reaction. I just feel a deep sense of unease over it, and it is really making me question myself.
I am assuming that other women wouldn't feel this way, so where is it coming from for me? Is it because I am reading lots of feminist stuff here and in books, and my perception of things has shifted? Perhaps it is because I have had some very negative experiences with men in the past? Has anyone else experienced anything similar?