Can we talk about women and confidence? I've been thinking about this lately, and it strikes me that "confidence" has become a common marketing ploy.
Mascara, underwear, cereal bars: almost anything seems to claim to give women "confidence". Every career-advice article insists we need more of it.
I am plenty confident, but I'm still discriminated against. Why are women supposed to need more confidence in every situation?