Mainly a lurker over here, but I'm starting this thread in the hopes you can help. Someone I work with (a nice someone, by no means a jerk!) came out with "Well, I think women just make better parents. It's genetic, that's why women have looked after the children for thousands of years." He doesn't think that means women shouldn't work. When quizzed, he takes the very reasonable view that each family should do what's best for them. He thinks that men who are natural parents are a tiny, lucky minority and not the norm.
He means the "women make better/more natural parents" thing as a compliment. Thing is, it's not. It's sexist, and I'm really struggling to put into concise words why it is. Help!
I have tried arguing the point with him, but I always seem to get stuck. I won't go into it all here or this post will be an essay; I just didn't want to drip-feed.