
"Intelligence" is a continuous process. Without a continuous feedback loop, LLMs will never be more than a compression algorithm we bullied into being a chatbot.

OpenAI as a mega-organism might be intelligent, but the LLMs definitely are not.

The "compressed capture of semantic relationships" is a new thing we don't have a word for.



Funnily enough, there is a mathematical link between data compression and AGI [1]. I believe a paper circulated some time ago that compared GPT-2 to gzip, with interesting results.

[1] https://en.wikipedia.org/wiki/AIXI
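
If it's the result I'm thinking of, the trick is normalized compression distance: two texts that share structure compress better concatenated than apart, so a plain compressor doubles as a similarity measure, and k-nearest-neighbors on top of it gets you a classifier with no training at all. A minimal sketch with Python's gzip module (the function and example strings are mine, not the paper's):

    import gzip

    def ncd(x: str, y: str) -> float:
        # Normalized compression distance: ~0 for near-duplicates,
        # closer to 1 for unrelated texts.
        cx = len(gzip.compress(x.encode()))
        cy = len(gzip.compress(y.encode()))
        cxy = len(gzip.compress((x + " " + y).encode()))
        return (cxy - min(cx, cy)) / max(cx, cy)

    print(ncd("the cat sat on the mat", "the cat sat on the rug"))     # small
    print(ncd("the cat sat on the mat", "quarterly revenue fell 8%"))  # larger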


More than that, in general, understanding and compression seem to be fundamentally the same thing.
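
In the Shannon sense that's almost literal: a model that predicts text better compresses it into fewer bits, because an optimal code spends -log2 p(c) bits per symbol. A toy illustration (both models here are made up for the example):

    import math
    from collections import Counter

    def bits_needed(text: str, probs: dict) -> float:
        # Shannon codelength: -log2 p(c) bits per character under the model.
        return sum(-math.log2(probs[c]) for c in text)

    text = "the quick brown fox jumps over the lazy dog"

    # Model A "understands" nothing: uniform over the characters that appear.
    alphabet = set(text)
    uniform = {c: 1 / len(alphabet) for c in alphabet}

    # Model B "understands" the character frequencies of this text.
    counts = Counter(text)
    unigram = {c: counts[c] / len(text) for c in text}

    print(f"uniform: {bits_needed(text, uniform):.1f} bits")
    print(f"unigram: {bits_needed(text, unigram):.1f} bits")  # fewer bits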


I would say understanding requires compression, not that it equates to it. Probably just semantics, though.



