Man follows ChatGPT’s advice – and poisons himself:

As described in a new paper published in the journal Annals of Internal Medicine, a 60-year-old man developed a now-rare condition known as “bromism” after ChatGPT suggested he replace sodium chloride, better known as table salt, with sodium bromide – a substance used in pesticides, pool and hot tub cleaners, and as an anticonvulsant for dogs.

We need to keep teaching people that LLMs are guessing machines, not answer engines.