In today’s Guardian there is a front page article ‘Fears over AI errors in social workers’ records’.
It describes an analysis of AI transcription services, performed by the independent Ada Lovelace Institute, of notes of meetings between social workers and clients across 17 English and Scottish councils.
The AI tools are Microsoft Copilot and the proprietary Magic Notes, which costs councils between £1.50 and £5 per hour of transcription.
Many serious errors turned up, including fictitious reports of suicidal ideation and passages of outright gibberish. The tools also struggled with regional accents.
The report highlights the potential for serious harm to clients from errors of both omission and commission by the AI.
These transcription services are used across dozens of councils, having been championed by HMG last year.
This is not exactly a cutting-edge application, or shouldn’t be. Whilst the potential for harm to clients is the primary consideration, it is galling that councils on the edge of bankruptcy, or worse, are paying for this garbage.
In the absence of objective, third party analysis, we simply don’t know how much of this kind of thing is happening. I’ll put my £££ on more rather than less.
None of my criticisms detracts from what AI excels at - anything that can be construed as a well-defined problem in pattern recognition. Protein folding - thank you, @BridgetJonesDaiquiri - and various aspects of medical analysis are two outstanding successes.