Call me crazy, call me anti-feminist, but who doesn't think all men are b*stards?
it seems like there is a complete lack of empathy with the other sex, which, considering we females profess to be open-minded and more sympathetic, is surprising.
so, who doesn't think men are to blame for the state of the world, the universe and everything?
who, for want of a better word, considers them just part of the human race to which we all belong?
for those of us with sons, myself included, i'd hate to think of men as stereotypically 'male': i.e., dominating, unable to multitask or relate to others. 'cos that's a hell of a negative role for boys to be brought up with.