
> which means I could spend the rest of my life tutoring Claude and it’ll still make the exact same mistakes

This is temporary. The new, more global memory features in ChatGPT are a good example of how this is already starting to decrease as a factor. Yes, it’s not quite the same as fine-tuning or RLHF, but the impact is similar, and I suspect the tooling for end users or local tenant admins to easily create more sophisticated embeddings is going to improve very quickly.
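
To make the mechanism concrete: a memory feature like this is essentially a retrieval layer over stored user facts, not a change to the model's weights. Here's a minimal Python sketch of the idea, where embed() is a placeholder for any real embeddings API, and MemoryStore/build_prompt are illustrative names, not any vendor's actual interface: store what the model should remember as embedding vectors, then prepend the nearest matches to each new prompt.

    # Minimal sketch of an embedding-backed "memory" layer.
    # embed() is a placeholder; a real system would call an embeddings API.
    import numpy as np

    def embed(text: str) -> np.ndarray:
        # Deterministic fake embedding so the sketch runs standalone.
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        v = rng.standard_normal(256)
        return v / np.linalg.norm(v)

    class MemoryStore:
        def __init__(self):
            self.facts: list[str] = []
            self.vectors: list[np.ndarray] = []

        def remember(self, fact: str) -> None:
            self.facts.append(fact)
            self.vectors.append(embed(fact))

        def recall(self, query: str, k: int = 3) -> list[str]:
            # Cosine similarity; vectors are already unit-normalized.
            q = embed(query)
            scores = np.array([v @ q for v in self.vectors])
            top = scores.argsort()[::-1][:k]
            return [self.facts[i] for i in top]

    def build_prompt(store: MemoryStore, user_msg: str) -> str:
        # Prepend retrieved memories so the model sees them as context.
        memories = "\n".join(store.recall(user_msg))
        return f"Relevant user memories:\n{memories}\n\nUser: {user_msg}"

Because the adaptation lives in the retrieved context rather than in the weights, it acts as a cheap per-user stand-in for fine-tuning, which is why the impact feels similar even though the mechanism is different.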


