I was just thinking ... from reading threads here and there ... about how different my view of women's history is from what it was when I was at school (and we barely studied it, really). But there's still so much I don't know about how women lived and what things were really like for them in the past - even the past that's within living memory. I read things like the Jennifer Worth books about being a midwife in the East End in the 50s and I can't begin to get my mind around it. And I was really stunned to find that marital rape isn't something from the distant past - it was only made illegal in this country in 1991. I was wondering what has shocked others about women's history, or what you'd say we're not taught/told about, and should be? To me now, it feels as if women's history is a big part of what I want to know about, and feel I should know about, but it doesn't seem very trendy really!