Not sure whether to post this in Education or Health, so I'll stick it in Other Topics.
I've just been on a First Aid course and I'm amazed that it's not a compulsory topic in schools, the way Sex Ed is.
I can't help but feel that, aside from giving us all valuable skills that could save lives, it might also go some way towards instilling a modicum of shared responsibility for each other's welfare in our society.
Apparently it is on the curriculum in the States, and accident survival rates there are said to be higher as a result.
The fact that it's not taught here seems loony to me.