This may be a bit of a ramble, so I apologise in advance. I read quite a lot of popular fiction, crime thrillers and the like, and I am growing increasingly uncomfortable with stories where a woman is kidnapped, abducted or murdered, and with the descriptions of physical and sexual violence against her, even if the story has a 'happy' ending, IYSWIM. These books are often written by women, which I find even more troubling. On one level the violence feels gratuitous, but on the other hand, am I just seeing a reflection of society? If so, there seems to be far more of it in fiction than exists in reality, and the effect is to add to the anxiety and feelings of vulnerability that many women already have around these issues. I have also commented before on how similar things are played out on TV. Is this a problem, or is it just me?