You literally can’t tell the difference with a 20ms delay. That is an order of magnitude lower than the neural feedback loop latency. You may think that you can, but studies don’t back this up.
> At the most sensitive, our findings reveal that some perceive delays below 40 ms. However, the median threshold suggests that motorvisual delays are more likely than not to go undetected below 51-90 ms.
By this study's numbers, 20ms is somewhat below the lower limit of ~40ms, but not too far below. 100ms would be easily perceptible - though, based on the other replies, it seems that VS Code does not actually have that much latency.
Don't confuse this with human reaction time, which is indeed an order of magnitude higher, at over 200ms. For one thing, reaction time is based on unpredictable events, whereas the appearance of keystrokes is highly predictable. It's based on the user's own keypresses, which a touch typist will usually have subconsciously planned (via muscle memory) several characters in advance. So the user will also be subconsciously predicting when the text will appear, and can notice if the timing is off. Also, even for unpredictable events, humans can discern, after the fact, the interval between two sensory inputs (e.g. between feeling a keyboard key press down and seeing the character appear on screen) at much shorter timescales than the reaction time.
Of course, just because these levels of latency are perceptible doesn't mean they're a material obstacle to getting work done. As a relatively latency-sensitive person, I'm not sure whether they're a material obstacle. I just think they're annoying. Higher levels of latency (in the hundreds of ms) can definitely get in the way though, especially when the latency is variable (like SSH over cellular connections).
You're the one who said "you literally can't tell the difference". I agree to a point. It seems plausible that they were experiencing some other effect such as hitching or slowdown, rather than just a constant 100ms delay (which again isn't supposed to happen).
On the other hand, I just thought of one way that even a small fixed amount of latency can be a material obstacle. Personally, I type fast but make lots of typos, and I don't use autocorrect. So I need to react to incorrect text appearing on screen, backspace, and retype. The slower I react, the more text I have to delete (which means not just more keypresses but also more mental overhead figuring out what I need to retype). For this purpose, I am bound by the human reaction time, but editor latency is added on top of that. The sooner the text appears, the sooner my 'reaction timer' can start, all the way down to 0 latency. [Edit: And 100ms of latency can make a meaningful difference here. I just did a quick typing speed test and measured 148 WPM which is around 12 characters per second, so 100ms is one extra character, or a bit more.]
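The WPM arithmetic works out like this (assuming the usual 5-characters-per-word convention that typing tests use):

```python
# Estimate how many extra characters get typed during a latency window,
# given a typing speed in words per minute.
WPM = 148
CHARS_PER_WORD = 5                           # standard WPM convention
chars_per_sec = WPM * CHARS_PER_WORD / 60    # ~12.3 chars/sec

latency_s = 0.100                            # 100 ms of editor latency
extra_chars = chars_per_sec * latency_s      # ~1.2 characters
print(f"{chars_per_sec:.1f} chars/sec -> {extra_chars:.2f} extra chars per 100 ms")
```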
Also, latency might affect productivity just by being annoying and distracting. YMMV on whether this is a legitimate complaint or whether you should just get used to it. But personally I'm glad I don't have to get used to it, and can instead just use editors with low latency.
"Order of magnitude", so you're saying the neural feedback loop latency is >100ms? That seems obviously wrong.
Also you can absolutely feel the visual difference between 60Hz (~16ms) and 120Hz (~8ms), and the ear is even more sensitive to timing differences in audio.
Just because studies don't back this up yet doesn't make it false. I imagine this is really hard to measure accurately, and focusing only on neuron activity seems misguided too. Our bodies are more than just brains.
> "Order of magnitude", so you're saying the neural feedback loop latency is >100ms? That seems obviously wrong.
Human neural feedback loop latency is a range that varies widely depending on the type of loop involved. Reflex loops are fastest, operating in tens of milliseconds, while complex loops involving conscious thought can take hundreds of milliseconds.
Short-latency reflex: 20-30ms. Signal travels through spinal cord, bypassing the brain. E.g. knee-jerk reflex.
Long-latency reflex: 50-100ms. Signal travels to the brainstem and cortex for processing before returning. E.g. adjusting grip strength when an object begins to slip from your hand.
Simple sensorimotor reaction: 230-330ms. Simple stimulus-response pathway involving conscious processing, but minimal decision-making. E.g. pressing a button as soon as a light turns on.
Visuomotor control: ~150ms, adaptable with training. Complex, conscious loops involving vision, processing in the cortex, and motor commands. E.g. steering a bike to stay on a path in a video game.
Complex cognitive loops: Brain's processing speed for conscious thought is estimated at 10 bits per second, much slower than the speed of sensory data. High-level thought, decision-making, internal mental feedback. E.g. complex tasks like analyzing a chess board or making a strategic decision.
A few years ago I did some testing with a quick Arduino-based setup I cobbled together and got some interesting results.
The first test was the simple one-light-one-button test. I found that my reaction time was somewhere in the 220-270ms range. Pretty much what you'd expect.
The second test was a sound reaction test: it makes a noise, and I press the button. I don't remember the exact times, but my reaction times for audio were comfortably under 200ms. I was surprised at how much faster I was responding to sound compared to sight.
The last test was two lights and two buttons: when the left light came on I'd press the left button; right light, right button. My reaction times were awful and I was super inaccurate, frequently pressing the wrong button. Again, I don't remember the exact times (I think near 400ms), but I was shocked at how much just adding a simple decision slowed me down.
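For anyone who wants to try the choice-reaction test without building hardware, the same idea can be sketched in a few lines of Python. This is just an illustration, not the original Arduino setup: the function and "subject" names are made up, and a real test would block on an actual key or button press instead of a callable.

```python
import random
import time

def choice_reaction_trial(respond, sides=("left", "right")):
    """Run one trial: pick a random stimulus and time the response.

    `respond` is a callable that takes the stimulus and returns the
    chosen side; in a real test it would block on a button press.
    """
    stimulus = random.choice(sides)
    start = time.monotonic()
    answer = respond(stimulus)
    elapsed = time.monotonic() - start
    return stimulus, answer == stimulus, elapsed

# Demo with a fake "subject" that always answers correctly after a
# fixed delay (50 ms here, standing in for a human's response time).
def slow_correct_subject(stimulus):
    time.sleep(0.05)
    return stimulus

stim, correct, rt = choice_reaction_trial(slow_correct_subject)
print(f"stimulus={stim} correct={correct} rt={rt * 1000:.0f} ms")
```

Swapping in a real input source (and logging many trials) would let you compare simple vs. choice reaction times the same way the Arduino tests did.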