Why is practically every human society, across all times, places, and cultures, dominated by men?
I have read the War on Women article that MillyR linked to. It's chilling. Why is this pattern everywhere?
I would be interested in your thoughts, or perhaps there is a simple, widely accepted answer someone could point me to.