
At companies where the average worker is less capable than the average author of the online content in an LLM's training set, the LLM's output may be cleverer or better written than what that worker would produce themselves.

At companies where the opposite is true, every LLM output feels like a worse version of what an employee could have written, on par with the average Redditor's take on any given situation.


