I was talking to someone about this just this morning.

I generally use ChatGPT to help me solve occasional issues. I'll come across some conundrum and ask it for a suggestion, which it confidently delivers.

The first suggestion is almost always wrong.

I'll say something like "That won't work" or "That answer is deprecated."

It will say "You're right!", followed by an answer that is more useful.

I suspect lots of folks run with the first answer.
