Sorry if this is old ground, but I recently said something along the lines of "why would any woman not describe herself as a feminist?" and got the reply that "the term has been hijacked by women who think 'all men are bastards', which has alienated women and frightens men."
And I'm not quite sure what to say to that one, to be honest!
I possibly wouldn't have called myself a feminist back in my 20s; it's only since I turned 30 and had children that I've started identifying very strongly with the label. Why has the term been given such negative connotations, and why don't women (particularly younger women) want to associate with it?