I read so many threads and see so many real-life examples where men and women feel the need to be “equal.” A man about to become a father refuses to be the main earner, even when he has the means, and insists that his wife also work and contribute financially. Doesn’t this seem imbalanced to anyone else, and doesn’t it feel like society is being brainwashed to accept it as the norm?
I have nothing against a woman wishing to work after having children. However, I don’t understand why society, and some men, put pressure on wives to work when they would rather stay home with the children. This has now become an expectation. And even when a woman is contributing financially, it is never really 50/50, since she is usually also doing most of the domestic work.
People condemn gender roles as though they are ancient history, but they seem to forget that, biologically and psychologically, women are naturally suited to caring for young children. They are the ones who carry the pregnancy, produce the bonding hormones, and are generally better equipped to nurture an infant than a man is. Of course there are exceptions, but people seem to ignore the general pattern.
In view of all this, I believe more men should offer to be the financial provider, giving women the option not to work after children, especially since childcare costs often eat up much of a second income anyway. Otherwise, it feels like we are moving away from gender roles that may actually be more helpful in a marriage than people make out.