Sam Halpert
Views are solely my own.
- Reposted by Sam Halpert: "Again, @ft.com reporters or whoever else needs to hear this: 'Hallucinations' are not the result of 'flaws,' they are literally inherent in & inextricable from what LLM systems do & are. Whether an 'AI' tells you something that matches reality or something that doesn't, *it is working as designed*"
- Reposted by Sam Halpert: "We might have a deeper problem here and it's that nobody knows what education is"