Just the beginnings of a thought here, but I'm wondering if we have become so used to seeing a black president in films (e.g. Morgan Freeman) that the idea came to seem more acceptable and normal to everyone.
It makes sense to me that we need media which reflects our ideal society rather than dwelling on the negative (which is usually the justification for yet another black teenage gang member knifing someone on Casualty).
What do you think?