• 0 Posts
  • 6 Comments
Joined 1 year ago
Cake day: July 2nd, 2023



  • Sure, it’s hard to say whether a computer program can “know” anything, or what that would even mean. But the paper isn’t arguing that. It assumes very little about how LLMs actually work, and it defines “hallucination” as “not giving the right answer,” with no option for the machine to answer “I don’t know.” Then the proof follows basically from the fact that the LLM-or-whatever can’t know everything.

    The result is not very surprising, and saying that it means hallucination is inevitable is an oversell. It’s possible that hallucinations, or at least wrong answers, are inevitable for different reasons though.
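    To make the shape of that argument concrete, here’s a toy sketch (not the paper’s actual formalism, and the questions and fallback guess are made up for illustration): any fixed answerer with finite knowledge, forced to answer everything with no “I don’t know,” is scored as hallucinating on questions outside what it knows.

    ```python
    # Toy model: a finite lookup table of known facts.
    KNOWLEDGE = {"capital of France": "Paris", "2+2": "4"}

    def forced_answer(question: str) -> str:
        # No "I don't know" allowed: fall back to an arbitrary guess.
        return KNOWLEDGE.get(question, "Paris")

    # Ground truth includes a question outside the model's knowledge.
    truth = {"capital of France": "Paris", "2+2": "4", "capital of Peru": "Lima"}

    # Under forced-answer scoring, the out-of-knowledge question
    # is necessarily answered wrong, i.e. a "hallucination".
    hallucinations = [q for q in truth if forced_answer(q) != truth[q]]
    ```

    Since the question set can always be made larger than any fixed model’s knowledge, some wrong answer is unavoidable under that scoring rule, which is basically the whole result.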


  • So I wrote a long-ass rundown of this but it won't post for some reason (too long)? So TLDR: this is a 17,600-word nothingburger.

    DJB is a brilliant, thorough and accomplished cryptographer. He has also spent the past 5 years burning his reputation to the ground, largely by exhaustively arguing for positions that correlate more with his ego than with the truth. Not just this position. It's been a whole thing.

    DJB's accusation, that NSA is manipulating this process to promote a weaker outcome, is plausible. They might have! It's a worrisome possibility! The community must be on guard against it! But his argument that it actually happened is rambling, nitpicky and dishonest, and as far as I can tell the other experts in the community do not agree with it.

    So yes, take NIST's recommendation for Kyber with a grain of salt. Use Kyber768 + X448 or whatever instead of just Kyber512. But also take DJB's accusations with a grain of salt.
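    The point of the hybrid setup is that the final key stays secure as long as either component holds up. A minimal sketch of that combiner idea (the label and placeholder byte strings are mine; real deployments use a proper KDF over the actual Kyber768 and X448 shared secrets):

    ```python
    import hashlib

    def combine_shared_secrets(ss_pq: bytes, ss_classical: bytes) -> bytes:
        # Hash the two shared secrets together with a domain-separation
        # label, so an attacker must break BOTH inputs to learn the key.
        return hashlib.sha3_256(b"hybrid-kem-v1" + ss_pq + ss_classical).digest()

    # Placeholder secrets standing in for the Kyber768 encapsulation
    # output (32 bytes) and the X448 ECDH output (56 bytes).
    key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 56)
    ```

    Concatenate-then-hash is the simplest form of this; standardized designs add length prefixes and bind in the ciphertexts and public keys too.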