Hacker News

How do they "lie constantly"? We are specifically talking about code here, not LLMs writing legal documents.


I've had the LLM "lie" to me about the code it wrote many times. But "lie" and "hallucinate" are incorrect anthropomorphisms commonly used to describe LLM output. The more appropriate term would be garbage.


Just a basic sanity check: did the LLM have the tools to check its output for lies, hallucinations, and garbage? Could it compile, run tests and linters, etc., and still manage to produce something that doesn't work?
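To make the question concrete, here is a minimal sketch of the kind of automated gate being described: code from the model is only accepted if it at least compiles and runs cleanly. The function name, the temp-file handling, and the choice of checks are illustrative assumptions, not anyone's actual setup; a real pipeline would swap in its own test suite and linters.

```python
# Hypothetical sanity gate for LLM-generated code: verify mechanically
# before any human reads it. The two checks here stand in for a real
# compile/test/lint pipeline.
import os
import py_compile
import subprocess
import sys
import tempfile

def passes_basic_checks(source: str) -> bool:
    """Return True only if the snippet byte-compiles and runs without error."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source)
        path = f.name
    try:
        # 1. Does it even parse/compile?
        py_compile.compile(path, doraise=True)
        # 2. Does it run without crashing? (stand-in for a real test suite)
        result = subprocess.run([sys.executable, path],
                                capture_output=True, timeout=10)
        return result.returncode == 0
    except py_compile.PyCompileError:
        return False
    finally:
        os.unlink(path)

# A snippet with a syntax error is rejected before anyone wastes time on it:
print(passes_basic_checks("def f(:\n    pass"))   # False
print(passes_basic_checks("print('ok')"))         # True
```

The point of the comment stands either way: a gate like this catches code that doesn't run, but it can't catch code that runs and is still wrong.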


I've frankly given up on LLMs for most programming tasks. It takes just as much time (if not more) to coax one into producing anything useful, with frustration added on top, and in that time I could have written far better code myself. I already have 40 years of programming experience, so I don't really need a tin can to do it for me. YMMV.



