Hi ladies
I found out a few days ago I'm pregnant, so it's very early days. It's a very much planned and wanted baby. However, everywhere I seem to look at the moment there are articles about how awful it is having a baby - it's making me cross. (See for example www.the-pool.com/life/parenting-honestly/2016/28/despatches-from-the-school-gate-liz-dashwood-on-newborns).
Do women write these things because it really is that bad? Or because they feel superior for going through, or having been through, something other women haven't?
It's making me very apprehensive, and I'm now wondering what on earth I'm doing. I have no doubt it'll be difficult at the beginning, but can someone reassure me that the good outweighs all of this? Otherwise why do people do it at all (and more than once!)?