exasperaited | 82 days ago | on: Responses from LLMs are not facts
You’re going to get poor information presented with the same certainty as good information, though. And when you ask it to correct the mistake, you get more bad information with a cheery, worthless apology.
geocrasher | 78 days ago
The ability to spot poor information is what keeps the end user a vital part of the process. LLMs don't think. [Most] humans do.