I do not have children yet, but I am in a relationship with someone who grew up being told he was the man of the family. I have seen it with his cousin's children too: the boys have been told they are the men of the family since daddy left. Personally, I find this genuinely triggering, because how dare you rip away a child's childhood by putting the immense pressure of being 'a man' on him when he is 6 or 7 years old? They see nothing wrong with it, and say, 'Charlie, you are the man of the house now, you need to look after your sister.' I find it so uncomfortable.
Now it makes sense: my husband was also told this growing up, and honestly, it has had negative effects. Personally speaking, he was told that the man's word rules, that he has the final say, and that he must be the sole financial provider, things I do not agree with. As I don't have kids, I wondered: would you say this to your son? Does it feel like something you'd be comfortable saying? And what do you think the consequences are of telling a child this?