I don't have children or a detailed understanding of hormonal effects, so forgive me if I seem ignorant.
It seems to me that the work of raising children is designated as women's work, and that our society presents this as correct and proper because women are supposedly more nurturing and in tune with children's needs.
Is there anything to back this viewpoint up? People seem to point to the biological argument: a woman carries a baby, breastfeeds it, and experiences a surge of hormones. How much impact does this biological aspect really have? Are you more naturally in tune with your baby as a mother who has just given birth to him or her, or is this a cultural myth?