
By the same logic, a hallucination-prone LLM is also overkill compared with just doing the simple task yourself, without needlessly adding risk.

The point still stands: let's see what the LLM delivered that the user actually relied on. Either it's legally sound and an appropriate use, or it isn't fit for purpose.

Equally, why not an interactive form using conditional logic? No hallucination possible. Much simpler and more reliable; a rough sketch follows.
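To illustrate the point, here is a minimal sketch of what "conditional logic" means in this context: each answer deterministically selects the next question or a pre-approved output, so the form can only ever emit one of a fixed set of clauses. All question IDs, prompts, and clause text below are hypothetical placeholders, not taken from any real product.

```typescript
// Conditional-logic intake form: answers deterministically route to the
// next question or a final, pre-approved clause (no free-form generation).

type QuestionId = "employmentType" | "hasNoticePeriod" | "noticeWeeks";

interface Question {
  id: QuestionId;
  prompt: string;
  // Maps a raw answer to either the next question or a final clause.
  next: (answer: string) => QuestionId | { clause: string };
}

const questions: Record<QuestionId, Question> = {
  employmentType: {
    id: "employmentType",
    prompt: "Is the role permanent or fixed-term? (permanent/fixed-term)",
    next: (a) =>
      a === "fixed-term"
        ? { clause: "This agreement ends on the agreed end date." }
        : "hasNoticePeriod",
  },
  hasNoticePeriod: {
    id: "hasNoticePeriod",
    prompt: "Is a notice period required? (yes/no)",
    next: (a) =>
      a === "yes"
        ? "noticeWeeks"
        : { clause: "Either party may terminate at will." },
  },
  noticeWeeks: {
    id: "noticeWeeks",
    prompt: "How many weeks of notice? (number)",
    next: (a) => ({
      clause: `Either party may terminate with ${parseInt(a, 10)} weeks' written notice.`,
    }),
  },
};

// Walks the form with a fixed list of answers; identical answers always
// produce the identical clause.
function runForm(answers: string[]): string {
  let current: QuestionId = "employmentType";
  for (const answer of answers) {
    const result = questions[current].next(answer);
    if (typeof result !== "string") return result.clause;
    current = result;
  }
  throw new Error("Form incomplete: more answers required.");
}

console.log(runForm(["permanent", "yes", "4"]));
// -> "Either party may terminate with 4 weeks' written notice."
```

The output space is bounded by whatever clauses a human wrote and reviewed up front, which is exactly why no hallucination is possible.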


