
I had it give me book recommendations on a particular topic. It straight up made up a fake book with a fake author and attributed it to a publisher. It did this with confidence. I even emailed the publisher to see if they had ever carried that book. Never heard of it.

The problem seems to hinge on whether the content comes from pure language modeling or from direct references. When it crosses into language modeling, it is just making stuff up on the fly. That sometimes works for programming, but for specifics that connect to the real world it can be phenomenally wrong.
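A minimal sketch of the mitigation this implies: check generated claims against a trusted source before surfacing them. Everything here (`CATALOG`, `verify_recommendation`) is a hypothetical illustration, not a real API.

```python
# Hypothetical: a trusted catalog of (title, author) pairs, e.g. a publisher's
# database, stands in for "direct references" the comment mentions.
CATALOG = {
    ("The Pragmatic Programmer", "David Thomas"),
    ("Clean Code", "Robert C. Martin"),
}

def verify_recommendation(title: str, author: str) -> bool:
    """Return True only if the pair exists in the trusted source;
    an ungrounded (hallucinated) recommendation fails the check."""
    return (title, author) in CATALOG

print(verify_recommendation("Clean Code", "Robert C. Martin"))  # True
print(verify_recommendation("Made-Up Book", "Fake Author"))     # False
```

The point is that a lookup against real-world data is cheap, whereas trusting fluent generated text alone is exactly how a confident fake book recommendation slips through.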


