
>LLM tech will never lead to AGI. You need a tech that mimics synapses. It doesn’t exist.

Why would you think synapses (or their dynamics) are required for AGI, rather than being an incidental consequence of biology's constraints?

(This discussion never goes anywhere productive but I can't help myself from asking)



It doesn't have to be synapses, but it should follow a similar structure. If we want it to think like us, it should be like us.

LLMs are really good at pretending to be intelligent, but I don't think they'll ever overcome the "pretend" part.



