I think it may well be true that, in order to succeed in a male-dominated society, many women have felt the need to more forcefully express those characteristics deemed necessary to succeed in that male world: being more assertive, more self-centred, etc. When that is the dominant culture, you need to conform to it, to some extent, in order to be accepted.
But as women have progressed and entered many previously male-dominated fields and occupations, a cultural shift has also taken place over time, with the development of different work and management styles and practices, which perhaps reflect more of the 'traditional' feminine culture of inclusivity, negotiation, compassion, etc.
When people are free to be themselves in their entirety, without the need to conform to gender stereotypes, our culture and work practices become more rounded and healthier.