“It has simply (!) accumulated a wealth of information from which it chooses the best answer. Of course, that is very impressive, but a thinking machine it is not.”
I’m very confused. I don’t think I’m any different from this.
If I ask you a question whose answer depends on a chain of intermediate steps, you would need to understand the meaningful outcome of each step, since those outcomes become the inputs to the next one.
You understand the ‘thing’ itself, not just which words to associate in a reply.
ChatGPT has zero knowledge of the world other than language.
Behind your thinking are a lot of heuristics built up from decades of experience, along with the ability to apply logic to the information you've picked up. ChatGPT still sucks at anything requiring logical inference.
Speaking as someone who used to think this stuff mattered, then realized they’d just been wasting their life, merely rearranging thinking patterns isomorphically on the deck of the Trendtanic.