More old box sorting. This is a card for a restaurant — Restaurante O Manuel — I visited nearly 40 years ago as a teenager with my family.

A little blast from the past - I just found my mid-80s Prince August fantasy miniatures moulds.
You melted metal in a little crucible and poured it in…
I’ve just discovered that there’s an academic conference about Warhammer.
This delights me no end.
A blast from the past this morning: I’m having a quick coffee at the Bridge Café. I used to be a regular in the pre-pandemic days, when Journalism.co.uk held its training courses here.
I’m off round the corner to run an in-house training course, but it’s nice to revisit the past.

Mike Masnick on who goes MAGA:
It is also, to an immense extent, the disease of a generation—the generation that grew up online, that learned to mistake engagement for truth, that confused being heard with being right. This is as true of suburban millennials as it is of rural boomers. It is the disease of the algorithmically poisoned.
The same could be said of the rise of Reform in the UK.
Is the AI bubble about to burst?:
The neuroscientist Erik Hoel calls this the “supply paradox of AI”: “the easier it is to train an AI to do something, the less economically valuable that thing is. After all, the huge supply of the thing is how the AI got so good in the first place.”
Man follows ChatGPT’s advice – and poisons himself:
As described in a new paper published in the journal Annals of Internal Medicine, a 60-year-old man ended up coming down with an all-but-defunct condition known as “bromism” after ChatGPT suggested he replace sodium chloride, which is better known as table salt, with sodium bromide, a substance used in pesticides, pool and hot tub cleaners, and as a canine anticonvulsant.
We need to keep teaching people that LLMs are guessing machines, not answer engines.
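To make that point concrete in a training session: under the hood, a language model samples from a probability distribution over possible next tokens. Here’s a minimal sketch using a toy, hand-invented distribution (none of these numbers come from a real model):

```python
import random

# Toy next-token distribution for the prompt "The capital of Australia is".
# These numbers are invented for illustration; a real LLM computes a
# distribution like this over tens of thousands of tokens, at every step.
next_token_probs = {
    "Canberra": 0.55,
    "Sydney": 0.30,    # plausible-but-wrong answers carry real probability mass
    "Melbourne": 0.10,
    "Auckland": 0.05,
}

def sample_next_token(probs: dict[str, float]) -> str:
    """Pick one token at random, weighted by probability: a guess, not a lookup."""
    tokens = list(probs)
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

# Run this a few times and it sometimes "knows" the answer and sometimes
# doesn't, because it's sampling, not consulting a database of facts.
for _ in range(5):
    print(sample_next_token(next_token_probs))
```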
I see that the Generative Engine Optimisation grift is picking up speed.
This morning’s links on the dominance of YouTube, the state of university education and Ghostly publishers…
Ignoring the fact that AI produces hallucinations does not make those errors go away: Google’s healthcare AI made up a body part.
Any long-term use of GenAI needs a strategy for identifying and eliminating hallucinatory results. And that’s going to have a major impact on the claimed cost/efficiency savings.
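What might such a strategy look like? One common pattern is a “verify before use” gate: check any verifiable claim in the output against a trusted reference before it reaches a reader. A rough sketch of the idea, with a hypothetical stand-in for a real medical ontology (the term list and function here are invented for illustration):

```python
# Minimal sketch of a "verify before use" gate on GenAI output.
# TRUSTED_TERMS is a hypothetical stand-in for a real medical ontology
# (e.g. SNOMED CT); real term extraction would be a proper NER step,
# not a hand-written list.
TRUSTED_TERMS = {"basal ganglia", "basilar artery", "hippocampus", "cerebellum"}

def flag_unverified_terms(claimed_terms: list[str]) -> list[str]:
    """Return every term the model used that isn't in the trusted vocabulary."""
    return [t for t in claimed_terms if t.lower() not in TRUSTED_TERMS]

# The Google error reportedly blended two real structures into a
# non-existent "basilar ganglia". A gate like this at least surfaces
# the term for human review rather than letting it through silently.
print(flag_unverified_terms(["basal ganglia", "basilar ganglia"]))
# -> ['basilar ganglia']
```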