I'm looking for some wholesome documentaries to watch on TV. Nothing about war or sports.
I find many TV series these days really toxic or full of violence. I've watched some great documentaries recently and would like some more recommendations.