My initial analogy was already weak, so I guess there's no point in extending it. The key fact here is that tokens are inputs to what is essentially an overgrown matrix-multiplication routine. Everything "AI" happens a few levels of abstraction higher, and is semantically disconnected from the "moving parts".
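To make that concrete, here is a minimal sketch (NumPy, with made-up sizes, not any particular model) of what a token looks like at the "moving parts" level: an integer index into an embedding matrix, after which everything downstream is just linear algebra.

```python
import numpy as np

# Hypothetical tiny setup: 50k-token vocabulary, 512-dim embeddings.
vocab_size, d_model = 50_000, 512
rng = np.random.default_rng(0)

embedding = rng.standard_normal((vocab_size, d_model))  # lookup table
W = rng.standard_normal((d_model, d_model))             # one weight matrix

token_ids = np.array([17, 42, 1337])  # "tokens" are just integer indices
x = embedding[token_ids]              # (3, 512) rows of numbers
h = x @ W                             # the "overgrown matrix multiplication"
```

Nothing at this level knows anything about meaning; whatever counts as "AI" only shows up in how the higher-level description of the whole system is interpreted.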