If this were going to be a filter that blocked all porn (if everyone could even agree on what exactly that is) and only porn, then it would be useful for protecting children. Unfortunately I don't think that's achievable, so I'd have to "opt in" - not because I want to watch porn but because I know the filter will block things that don't need to be blocked.
If every image anyone uploaded had to be tagged "porn" or "not porn" then it would be simple - but that's not going to happen, is it? Who decides, and who polices an image tagged "not porn" when it actually is porn, without manually looking at every image? It's not a safety net.
The only way to block everything you might find objectionable for children to see is a whitelist system - content has to prove it's OK before it's allowed through. We have this for my 4 year old.
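A whitelist works the opposite way round to a blocklist: deny everything by default, allow only pre-approved sites. A minimal sketch (the approved domains here are made up for illustration, not from any real product):

```python
# Whitelist filtering: nothing loads unless its domain is pre-approved.
from urllib.parse import urlparse

# Hypothetical approved list - in practice curated by a vendor or parent.
APPROVED = {"cbeebies.co.uk", "kids-encyclopedia.example"}

def allowed(url: str) -> bool:
    """Allow a URL only if its hostname is on the approved list."""
    return urlparse(url).hostname in APPROVED

print(allowed("https://cbeebies.co.uk/shows"))    # True - on the list
print(allowed("https://random-site.example/"))    # False - everything else blocked
```

The trade-off is exactly the one described above: nothing slips through, but nothing new gets in until a human approves it.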
Blocking keywords is problematic because there are so many false positives - financial advice websites trip gambling filters, names of body parts on medical sites trip obscenity filters, and then there's the Scunthorpe problem (innocent place names and surnames blocked because they happen to contain a rude word as a substring).
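To see why the false positives happen, here's a sketch of the naive substring matching these filters rely on (the blocklist and example phrases are made up for illustration):

```python
# Naive keyword filter: block any text containing a banned word as a substring.
BLOCKED_WORDS = ["sex", "bet", "porn"]

def is_blocked(text: str) -> bool:
    """Block if any banned word appears anywhere in the text."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

# False positives: innocent text tripped by embedded substrings.
print(is_blocked("Middlesex County Cricket Club"))   # True - "Middlesex" contains "sex"
print(is_blocked("Diabetes advice from the NHS"))    # True - "diabetes" contains "bet"
print(is_blocked("Local weather forecast"))          # False
```

Making the match smarter (whole words only, context-aware) cuts the false positives but lets more genuine content through - the filter is always trading one error for the other.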
I know someone who has a "flesh" filter on their work internet. Basically, if an image contains too many flesh tones it gets automatically blocked, and someone in their IT department has to look at it and OK it. I sent her a picture of my DD as a baby (fully clothed) and it tripped the filter. How much work does that create for someone to police?
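A flesh filter like that is typically just a pixel-counting heuristic: flag the image if too high a proportion of pixels fall in a rough skin-tone colour range. A sketch of the idea - the RGB rule and threshold here are made-up illustrations, not any real product's values:

```python
# Crude flesh-tone heuristic: flag an image if too many pixels look skin-coloured.

def looks_like_skin(r: int, g: int, b: int) -> bool:
    # Rough rule of thumb: red dominant over green and blue,
    # within a mid-to-light brightness band.
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def flag_image(pixels, threshold=0.4):
    """pixels: iterable of (r, g, b) tuples. Flag for human review if too 'skin-like'."""
    pixels = list(pixels)
    skin = sum(looks_like_skin(*p) for p in pixels)
    return skin / len(pixels) > threshold

# A fully clothed baby photo can still be mostly face and hands,
# so a heuristic like this flags it anyway - hence the false positive.
face_heavy_photo = [(210, 160, 140)] * 70 + [(30, 30, 120)] * 30  # 70% skin-ish pixels
print(flag_image(face_heavy_photo))  # True - flagged despite being innocent
```

The filter can't tell a face from anything else - it only counts colours - which is exactly why every flagged image still needs a human to look at it.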
If you've got unfiltered access at home and filtered access at work then you've probably already seen pages blocked for silly reasons. It's only because you know (through your unfiltered access) that the site is innocuous that you realise you were "missing" something (MN, the NHS website etc.). If everything is filtered then you'd just never know it was there.