
> What would a "hallucination free" LLM do with that?

To me, there’s a qualitative question of which details to include (ideally the most important ones), and a separate binary question of whether the output includes details that weren’t in the original.
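
That binary question is at least mechanically checkable. A minimal sketch of the idea, using a crude heuristic I’m making up here (numbers and capitalized terms as a proxy for "details"; the function name is hypothetical, not any library’s API):

    import re

    def ungrounded_details(source: str, summary: str) -> set[str]:
        """Crude binary check: collect numbers and capitalized terms
        in the summary that never appear in the source text."""
        def candidates(text: str) -> set[str]:
            # Numbers plus capitalized words, a rough proxy for "details".
            return set(re.findall(r"\b(?:\d[\d,.]*|[A-Z][a-zA-Z]+)\b", text))
        return candidates(summary) - candidates(source)

    # Nonempty set -> the summary asserts "details" absent from the source.
    print(ungrounded_details("Alice paid $40 on May 3.",
                             "Bob paid $40 on May 4."))
    # {'Bob', '4'}

A real check would need entailment rather than string matching, but even this separates "wrong details added" from "important details omitted."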

A related issue is that preference tuning loves wordy responses, even when a shorter answer is factually equivalent.
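
One commonly discussed mitigation for that verbosity bias is length-normalizing the reward before ranking responses. A sketch under assumed names and an assumed per-token penalty, not any particular framework’s API:

    def length_penalized_reward(reward: float, n_tokens: int,
                                lam: float = 0.001) -> float:
        """Subtract a per-token penalty so equally correct answers
        aren't ranked purely by verbosity. lam is a tuning knob."""
        return reward - lam * n_tokens

    # Two factually equivalent answers, one padded to 3x the length:
    print(length_penalized_reward(0.90, 50))   # 0.85
    print(length_penalized_reward(0.92, 150))  # 0.77 -> the shorter one now wins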


