Since CRT enthusiasts increasingly complain about how poorly emulator CRT filters hit the mark, I’m pondering today whether one way to make a Truly Period-Accurate CRT filter would be an AI-upscaling type of system. Emulators do a poor job because the approach they’re taking requires labor-intensively understanding every technical detail of how a particular CRT worked. But if you instead created a reference dataset of input playbacks (perhaps the ones on tasvideos.org) running on an emulator without any filter applied, plus synchronized video recordings of the same input running on a real CRT captured on high-quality film, and then fed that to one of the standard AI image-processing systems, I’d expect a great result.
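To make the idea concrete, here’s a toy sketch of the paired-supervision step. Everything here is illustrative: the “CRT” frames are simulated as simple scanline darkening, and the model is a trivial per-row gain fit by least squares, standing in for the conv net you’d actually train on captured footage.

```python
import numpy as np

# Toy stand-in for the paired dataset: frames rendered by an emulator
# (clean) paired with "CRT" frames. Here the CRT behavior is faked as
# alternating scanline darkening; a real system would use synchronized
# video capture of actual hardware, and a neural image-to-image model
# instead of this per-row linear one. All names are illustrative.

rng = np.random.default_rng(0)
H, W, N = 8, 8, 64  # tiny frames, N training pairs

clean = rng.random((N, H, W))
scanline_gain = np.where(np.arange(H) % 2 == 0, 1.0, 0.55)  # "ground truth" CRT effect
crt = clean * scanline_gain[None, :, None]

# Fit one gain per row by least squares against the paired data --
# the supervised-learning step in miniature.
learned_gain = np.array([
    np.sum(crt[:, r, :] * clean[:, r, :]) / np.sum(clean[:, r, :] ** 2)
    for r in range(H)
])

print(np.allclose(learned_gain, scanline_gain))  # the fit recovers the CRT effect
```

The point is only that paired (clean frame, CRT frame) data lets the model learn the display’s transfer behavior directly, rather than anyone hand-modeling phosphors and shadow masks.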
AI upscaling is starting to get a bad rep among the archival community for “making up fake detail”, but that’s mostly because we’re talking about particular AI systems that were specifically trained to make up fake detail. If your training instead says the system should do its best to look like a period-accurate CRT, it’ll do that.
The biggest practical difficulties would be A) that “high-quality film” qualifier, and B) getting the latency down below the ~50 ms level so it’s usable for live play.