- The hallucination rate for any large language model is 100%; hallucinated text is the only thing these models generate.
- "As their math skills have notably improved, their handle on facts has gotten shakier. It is not entirely clear why." It's because they fundamentally don't understand what a "fact" is. AI cannot discern whether a statement is true or false, because it doesn't think and it doesn't verify.
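The "it doesn't verify" point can be sketched with a toy next-token sampler (a hypothetical bigram table, not a real LLM): text is produced by sampling from a probability distribution alone, and nothing in the loop ever checks a statement against reality.

```python
import random

# Hypothetical toy bigram "model": next-token probabilities only.
# A fluent-but-false continuation ("green") sits right next to a
# true one ("blue"), and the sampler cannot tell them apart.
BIGRAMS = {
    "the": {"sky": 0.5, "sea": 0.5},
    "sky": {"is": 1.0},
    "sea": {"is": 1.0},
    "is": {"green": 0.5, "blue": 0.5},
}

def generate(start: str, steps: int, rng: random.Random) -> list[str]:
    tokens = [start]
    for _ in range(steps):
        dist = BIGRAMS.get(tokens[-1])
        if dist is None:
            break
        words = list(dist)
        weights = [dist[w] for w in words]
        # Pure sampling: no fact check, no notion of truth anywhere.
        tokens.append(rng.choices(words, weights=weights)[0])
    return tokens

print(" ".join(generate("the", 3, random.Random(0))))
```

Whether the sentence comes out true or false is decided by a coin flip in the probability table, which is the sense in which every output is "hallucinated" and some just happen to be correct.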