I feel a bit like a kill-joy, but my DC are getting excited at the prospect of Hallowe'en (yes, it's become much more commercialised than it was in our day, but that's a whole other thread).
We've bought the walking bloody hand and a dancing skeleton, but I think portraying witches as scary Hallowe'en figures misses the point: it glosses over the murder of countless women whose only crime was having knowledge or skills that men felt threatened by.
I'm by no means a historian, so someone will probably put me right, but my understanding is that 'witches' were often women with a bit of medical knowledge who were vilified because they had 'too much power'.
If that's the case, why is there so little discussion of what the murder of all these 'witches' actually meant for the role of women in society?