I'm sorry, but I really don't believe a 540 Hz or even 360 Hz LCD is going to be beaten in total end-to-end latency by any kind of CRT. I've yet to see any real-world tests to the contrary.
And of course, modern OLED beats the pants off of both in that regard.
Is it though? Could you really say a 200 Hz CRT is going to beat a 540 Hz LCD with backlight strobing? Has anyone actually done thorough testing to confirm it? Motion clarity is made up of a lot of parts; it can't be ballpark guessed.
CRT has zero input lag. It is pure analog electronics with no buffering. "Racing the beam". Digital buffered displays are down to a few milliseconds but cannot ever be as low as CRT.
As an FPGA engineer who has developed custom monitors, I see no technical reason why an LCD panel cannot be driven with zero latency or buffering. It may just be that almost all monitors/TVs sold on the market happen to implement some type of buffering to make room for image enhancements etc., but it is not a requirement.
I don't see how you do a fair comparison. How do you even drive a CRT with a modern graphics card? Presumably you have some sort of digital to analog converter box, but wouldn't that add a touch of latency and defeat the purpose of the CRT?
When the analog signal enters the CRT, there are only nanoseconds of delay before that signal lights up the phosphor on the screen. That is different from how a typical digital display works (with buffering), and that is where you get the zero input lag of CRT.
> presumably you have some sort of digital to analog converter box
Yes, the CRT will display the output of that DAC in real time. No additional buffering/latency after that conversion.
The Atari 2600 video hardware had no framebuffer, and the code running on the CPU was racing the electron beam, updating the graphics mid-scanline in real time while the electron beam was scanning that line on the display. That was about as raw and as close to zero lag as it gets. The other 8-bit and 16-bit consoles also did not have framebuffers.
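To give a sense of how tight "racing the beam" was, here is a back-of-envelope sketch of the 2600's per-scanline timing budget, using the well-known NTSC TIA figures (3.579545 MHz pixel clock, 228 color clocks per line, CPU clocked at one third of that):

```python
# Back-of-envelope timing budget for racing the beam on an NTSC Atari 2600.
TIA_CLOCK_HZ = 3_579_545      # TIA pixel clock (NTSC colorburst frequency)
COLOR_CLOCKS_PER_LINE = 228   # TIA color clocks in one scanline
CPU_DIVIDER = 3               # the 6507 CPU runs at TIA clock / 3

line_period_us = COLOR_CLOCKS_PER_LINE / TIA_CLOCK_HZ * 1e6
cpu_cycles_per_line = COLOR_CLOCKS_PER_LINE // CPU_DIVIDER

print(f"scanline period: {line_period_us:.1f} us")        # ~63.7 us
print(f"CPU cycles per scanline: {cpu_cycles_per_line}")  # 76
```

So the game code had roughly 76 CPU cycles per line to update registers before the beam reached the next scanline.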
Another guy has commented here saying that in theory you could do the same with digital displays. In practice they buffer the input and add latency.
Depending on the display and the input type, some CRT displays do digitize and buffer the signal so that they can do some of the filtering and decoding digitally. The amount that they buffer is generally only on the order of a few lines, though.
CRT has zero input lag from signal to phosphor (maybe a few nanoseconds due to speed of electronics and the electron beam itself) but it will take a few milliseconds to scan an entire frame.
With vsync enabled, I suppose a well-engineered 500 Hz digital display could have effectively lower input lag, when considering lag for complete frames.
If the system is fully engineered to race the beam, then you can't beat the zero input lag of a CRT mid-scanline.
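The "complete frame" point can be put in rough numbers. A sketch, assuming a 60 Hz CRT with zero buffering versus a hypothetical 540 Hz LCD that buffers one full frame before scanning it out (pixel response time ignored):

```python
# Complete-frame latency: 60 Hz CRT (no buffering) vs a hypothetical 540 Hz LCD
# that buffers one full frame before scanout. Pixel response time is ignored.
crt_hz, lcd_hz = 60, 540

# CRT: signal-to-phosphor lag is ~0, but the last line of a frame is drawn a
# full refresh period after the first.
crt_full_frame_ms = 1000 / crt_hz        # ~16.7 ms

# LCD: one frame of buffering plus one frame of scanout.
lcd_full_frame_ms = 2 * 1000 / lcd_hz    # ~3.7 ms

print(f"CRT 60 Hz, complete frame: {crt_full_frame_ms:.1f} ms")
print(f"LCD 540 Hz, buffered:      {lcd_full_frame_ms:.1f} ms")
```

So for whole frames the fast LCD wins easily; the CRT's advantage only shows up for the first lines of a frame, or when the source is racing the beam.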