I need to write an essay on something that has changed since the 1930s. I've chosen to write about women's role in society because I'm interested in feminism but haven't really been exposed to the viewpoints as they appear on this board. (Obviously equal pay etc., but nothing as extreme as some of the opinions on here.)
I was just wondering what your viewpoints are on how women's roles have changed (apart from the obvious shift from housewives to working women) and what hasn't changed?
All viewpoints welcome :-)
(First time posting in feminism, so sorry if any turn of phrase I've used is wrong/not feminist.)