I was reading a discussion elsewhere about what the patriarchy is. One poster thought it referred to the structures and institutions in society and the way they are used to enable men to dominate women, e.g. the legal system.
Another thought it was more than that, and that it also included the culture of our society.
I know feminists use the concept of the patriarchy a lot, but I am still not sure what the common definition of it is. Is there one? Or do different feminists have different views on what the patriarchy is?
Thank you.