Here
The research concludes that women should continue to breastfeed because of the many other benefits for the baby. So women are supposed to breastfeed, but feel bad and worry they might be doing something that could make their child ill in the long term? What are we meant to do with this kind of research? The chemicals in our bodies are ones we've been absorbing since childhood anyway. Any thoughts? I feel crap.