I took the title from a reddit poster, which referred to a NYT article (I think) concerning the US government awarding a NY art school $2 million. I was initially surprised, until I read this small segment of the article:
"...has argued for the reclaiming of American culture from an overly feminized and anti American elite".
I am fairly keyed in on US politics (who could have missed it!?) and am up to date with my feminism, but this kind of statement always comes across as a bit vague to me. No one ever explains exactly what they're getting at.
I am going to presume here that they believe the arts are dominated by women, which is quite ridiculous.
But more than that, what does it really mean, since men are quite evidently still holding on to the vast majority of global power, whether economically or in the arts and sciences? What is it about the feminine that they despise so much?
Why is feminine a dirty word in a good segment of our culture (it is still used as an insult amongst boys and men)?
I tend to believe that when people are marginalised, it is because they pose either a real or imagined threat — to the status quo, to an idealism, etc.
But what is it about the feminine that pisses them off so bloody much?