I had a bit of fun the other day pasting some friends' faces into famous movie scenes using an app on the iPhone.
I figured the results were pretty extraordinary and showed my wife, who immediately said, "You do know this will be used to target women and children, don't you?". Well, it had crossed my mind, but I must confess it wasn't "front and center" - I was more amazed at how well it could map my friends' faces into any kind of scene, matching the lighting and facial expressions in a very convincing way.
That was a few days ago, and sure enough, today it's become a lead story on the BBC news (see link below).
It's (as usual) pretty lazy reporting on the BBC's part to say that they had poor results. THAT app may be poor, but the one I was using did a remarkable job with its built-in famous movie scenes. With a paid subscription you can apparently use your own footage (I've not tested this).
Anyway, I mention this here because most people are probably not aware that (notwithstanding the app the BBC tried) the quality of this is now way past "silly joke" territory. We're now in the realm of very convincing footage that could 100% be used to ruin relationships, facilitate blackmail, or bring mass humiliation upon an unwitting victim. It takes only seconds to do, and works from a single photo of the victim's face. It could easily be enough to push (e.g.) a teenage girl into taking her own life.
The big question is: what can be done about it?
I don't think this is one of those "let the market decide" issues. This technology is advancing by the minute (it's A.I., so it's constantly learning to improve), the best apps are extraordinary, and I now believe that what is urgently needed is legislation.
It wouldn't be popular, but I think the only option is to put in place legislation that makes it a serious offence to make a fake video of anyone without their consent. That will no doubt require a bunch of vague exceptions for "legitimate comedy or satirical purposes", but I can't really see any other option.
Interested to hear from the legal minds (and others) on FWR about this, or other options, because this conversation is already happening far too late for many, many women.
www.bbc.co.uk/news/technology-54584127