Man follows ChatGPT’s advice – and poisons himself:
As described in a new paper published in the journal Annals of Internal Medicine, a 60-year-old man came down with an all-but-defunct condition known as “bromism” after ChatGPT suggested he replace sodium chloride – better known as table salt – with sodium bromide, a substance used in pesticides, pool and hot tub cleaners, and as a canine anticonvulsant.
We need to keep teaching people that LLMs are guessing machines, not answer engines.
I see that the Generative Engine Optimisation grift is building up speed.
This morning’s links on the dominance of YouTube, the state of university education and Ghostly publishers…
Ignoring the fact that AI produces hallucinations does not make those errors go away: Google’s healthcare AI made up a body part.
Any long-term use of GenAI needs a strategy for identifying and eliminating hallucinatory results. And that’s going to have a major impact on the claimed cost and efficiency savings.
A great, deep dive into the financial precariousness of the AI revolution: AI Is A Money Trap
Instagram finally gets the repost:
Oh, every time I have to use Google Analytics, I’m reminded of how thoroughly GA4 ruined it for editorial people.