Anyone else love this show? What do feminists think?
Love the way it showed the women and children at the end.
Chilling domestic violence scenes that were terrifying but showed the multifaceted ways women are trapped, and the risks they face in leaving violent partners.
I went out and read the book and loved the women coming together.
Brilliant actors - made me think that Reese Witherspoon and Nicole Kidman are truly empowered as producers, which I don't think would have been possible a generation ago.
What do others think?