
You’re going to get poor information presented with the same certainty as good information, though. And when you ask it to correct itself, you get more bad information with a cheery, worthless apology.


The ability to spot poor information is what keeps the end user a vital part of the process. LLMs don't think. [Most] humans do.



