LLMs spew hallucinations by their nature. But what if you could get the LLM to correct itself? Or if you just claimed your LLM could correct itself? Last week, Matt Shumer of AI startup HyperWrite/…
another valiant attempt to get “promptfonder” into more common currency
Most likely you’d be an et al. in someone’s footnote
1. Better known for other work
Now that is leaving your mark in history as an academic.
Probably, but then I would also never have heard the phrase “generative AI”, so win?