This is a big jump ethically, but technically it feels like it's a hop away.
If we can do this for visual images, we could use the same strategy on patterns of thought - especially if the person is skilled at visualisation.
"Feels" and "is" are quite different in these domains. Self-driving feels like it's 5 years away for 10 years straight, and biology is infinitely more complex than automation. See this comment from this thread [0].
> These techniques operate on the V1, V4 and inferior temporal cortex areas of the brain. These areas will fire in response to retinal stimulation regardless of what's happening in the rest of your brain. V1 in particular is connected directly to your retinas. While deeper areas may be sympathetically activated by hallucinations etc., they aren't really related to your conception of things. In general, if you want to read someone's thoughts you would look elsewhere in the brain.