If something is wrong with you, or you think something is, do you start noticing things in the media about it? For example, I have a 6-month check-up next week after having treatment on my cervix, and all I notice are articles about cervical cancer. I feel like I'm being told something. Also, when I was pregnant, I kept noticing horror stories, which I just avoided. Anyone else feel this way?