DH and I watched Forks Over Knives and What the Health (both on Netflix) this week, and they've really convinced us to overhaul our habits and move to a plant-based diet. Coincidentally, a client of his at work who is an oncologist confirmed that the info in the documentaries is true, but also said they're "not supposed to say that" since they currently work for a pharmaceutical company.
Anyone else ever see these films or want to watch/talk about them?