Hacker News

Rhythm gaming is a big one for me. All other display devices commonly used for gaming have at least a small amount of lag between input and final display. I have a midrange LED set that gives me something like 12ms lag calibration when I play Rock Band 3 on it. It's a very slight difference, but it is noticeable, and I'd much rather have the 0 calibration that a CRT could provide.


> and I'd much rather have the 0 calibration that a CRT could provide.

Would it really be 0 though? Assuming 60Hz, the bottom of each frame is scanned out 16ms after the top. Assuming that Rock Band 3 renders an entire frame before displaying it (definitely true) and that it actually renders at 60FPS as opposed to rendering at a higher resolution and tearing (definitely true on console, might not be true on an emulator?), the video latency will range from 0ms for the top of the frame to 16ms for the bottom, for an average of 8ms.
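For concreteness, here's that scanout arithmetic as a tiny Python sketch (the 480-line count is just an illustrative SDTV-ish figure, not what the game actually outputs):

```python
# Back-of-the-envelope scanout latency at 60 Hz, assuming the whole
# frame is rendered before scanout begins. Line count is illustrative.
FRAME_TIME_MS = 1000 / 60  # ~16.7 ms per refresh

def scanout_latency_ms(line: int, total_lines: int = 480) -> float:
    """Delay from start of scanout until a given line hits the screen."""
    return FRAME_TIME_MS * line / total_lines

top = scanout_latency_ms(0)        # 0 ms
bottom = scanout_latency_ms(480)   # ~16.7 ms
average = (top + bottom) / 2       # ~8.3 ms
print(f"top={top:.1f}ms bottom={bottom:.1f}ms average={average:.1f}ms")
```

So the "16ms" for the bottom of the frame is really the full ~16.7ms refresh period; the ~8ms average holds regardless of how many lines there are.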

Admittedly, I don't know what Rock Band 3 calibration numbers actually measure, e.g. whether they already take into account factors like this.
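One plausible way a game could fold a video calibration number into its judging — purely a hedged sketch, since Rock Band 3 is closed source and the function name, 12ms offset, and 50ms hit window here are all invented:

```python
# Hypothetical hit judging with a video calibration offset. The idea:
# if the display shows everything N ms late, shift each note's target
# time N ms later, so a player reacting to the picture still lines up.
VIDEO_CALIBRATION_MS = 12.0  # player-measured display lag (invented)
HIT_WINDOW_MS = 50.0         # timing tolerance (invented)

def judge_hit(input_time_ms: float, note_time_ms: float) -> bool:
    """True if the input lands within the window around the shifted target."""
    adjusted_target = note_time_ms + VIDEO_CALIBRATION_MS
    return abs(input_time_ms - adjusted_target) <= HIT_WINDOW_MS
```

With a 12ms offset, an input at t=1012ms lands dead center on a note charted at t=1000ms.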

If you can manage to render at a high frame rate then you could reduce latency by tearing, but at that point, I feel like you're leaving a lot on the table by not using a 240Hz OLED or something, which can show the entirety of every frame.

Supposedly OLED has comparable response times to CRT anyway. The article says that OLED gaming monitors are unattainable, but it's 5 years old and things have changed.


The game is pretty good about taking things like that into account. Even aside from that, a CRT generally adds no lag of its own, displaying the scanline as soon as it begins receiving it, assuming a stable set of sync pulses.

The company that made Rock Band, Harmonix Music Systems, has amazing beatmatching technology that has been used in countless games and is even available in the newest versions of Unreal Engine, thanks to Harmonix currently being owned by Epic Games. They've been developing this technology since the mid '90s.

The main difference between CRTs and other display types is that the other display types are all largely sample-and-hold - they display the whole frame line by line, hold it for an amount of time, then begin the process again. They generally lack the decay that happens with a CRT, where the image fades by the time the next frame is to be drawn. Some displays can do black frame insertion, but this is a poor substitute: it requires double the frame rate of the intended output, and it reduces effective brightness. I'd like to see an OLED display that could simulate the phosphor decay on a per-pixel basis, to better simulate the scanning process of a CRT.
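That kind of per-pixel decay is basically an exponential falloff between refreshes. A toy model (the decay constant here is invented; real phosphors vary enormously):

```python
import math

DECAY_PER_MS = 0.5  # per-millisecond decay rate, purely illustrative

def decayed(brightness: float, elapsed_ms: float) -> float:
    """Brightness remaining elapsed_ms after the beam excites the phosphor."""
    return brightness * math.exp(-DECAY_PER_MS * elapsed_ms)

# By the next 60 Hz refresh (~16.7 ms later) almost nothing remains,
# which is exactly the behavior sample-and-hold panels lack.
print(decayed(1.0, 16.7))
```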


I agree that Rock Band almost certainly does render a whole frame before scanout. So, yes, there's necessary latency there between the input processing and game logic and the output. A higher refresh rate mitigates that, but that's not really an option on the kinds of gaming machines Rock Band runs on; hopefully it's all locked at 60Hz, because that's what TVs run on (unless maybe ew PAL, but ew), and all the machines have a 240p mode, if they're not cool enough to run 480p or better.

But modern displays add unnecessary latency on top of that. HDTV CRTs often did too. Input devices sometimes add latency as well. It's really not too awful as long as it's consistent; real musical instruments have latency too (especially pipe organs! but organists manage to figure it out). It's not as much fun if you need to hit the note when it's visibly far from the 'goal' section, or if the judging becomes visible long after the note --- shortening the feedback loop is good.


Rock Band's core game logic is fairly well decoupled from frame rate for the most part, actually. I usually run Rock Band 3 (with a fan mod called RB3 Deluxe installed) on an emulated PS3 at 75Hz and it runs perfectly. The only real issue is certain venue post processing effects behave weirdly at high frame rates (like the handheld camera shake effect) but with the aforementioned mod we can disable that.


An analog CRT is as close to zero latency as the source (in this example, Rock Band 3) can provide[0].

More-modern displays (obviously including LCD and OLED, but also CRT displays that themselves have a framebuffer, as was also a thing) always add additional latency.

0: It's not like we can just somehow convert an existing closed-source thing like Rock Band 3 into a 240Hz source.


CRTs far exceed 60Hz though. This FW900 for example goes up to 160Hz.


Look at a Beatmania IIDX Lightning cab or a Sound Voltex Valkyrie cab. Modern flat panels are providing better rhythm gaming experiences than CRTs ever did.


I don't think it's as simple as that (and it's not entirely the display's fault). 120Hz IIDX is a big upgrade over the previous LCDs, but I'm not sure it is over the games that ran on custom hardware that wasn't just a Windows PC.

I've got a Firebeat at home that I play Pop'n on, and it feels magical to play with absolutely minimal difference between where the audio and visuals are and where the game expects you to hit notes. The same cabinet with a Windows PC plugged in for modern games feels worse thanks to all the extra latency you get going through Windows (it's especially bad with the audio).

I know modern IIDX and SDVX use newer audio APIs that are lower latency, but it's still more delay than the pre-PC systems imo.


> I'm not sure it is over the games that ran on custom hardware that wasn't just a Windows PC.

It is. I've played on both modern Lightning cabs and on a variety of old official hardware setups in the CRT and Twinkle era. I'd rather play on a Lightning.

Specifically, there is a source of lag even greater than the display lag introduced by almost all of the official LCDs. The input lag, or the delay between a button being pressed and the game engine registering the input, is critically important in keysounded rhythm games. The BIO2 board introduced in the IIDX 25 hardware upgrade reduced this lag by more than the lag the official LCD monitors added.

The "custom hardware" or Twinkle system used by Konami for IIDX 1-8 is essentially modified PS1 hardware. It's really not much different from building on top of Windows Embedded, given the amount of custom hardware inside the cabinets.


I can definitely agree on this point. Same for the newer Taiko versions utilizing a 120 Hz panel as well.

High refresh rate displays have been gaining a lot of traction in the past few years with 1080p or 1440p displays running at 120 or 144 Hz being affordable.


I've played modern games on CRTs through HDMI->VGA convertors, and still felt lower latency than LCD. OLEDs will eventually catch up, I think, but LCDs are always going to have some lag.

And it's not the refresh rate, it's the time from input to the displayed picture updating. With a CRT that can happen during the current field being displayed, but it will take at least one frame on any LCD.


This is just not true.

You can update an LCD in the middle of the screen just the way you can for CRT.

The only hard limit to the latency of an LCD panel is the finite amount of time that’s needed to flip the fluid in the LCD cells. There is nothing that requires a full frame delay.

Most LCD monitors have a frame buffer to look back in time for things like overdrive compensation, but you can easily do without. In fact, some of the monitor scaler prototypes that I have worked on were initially direct drive because we hadn't gotten the DRAM interface up and running yet.

Desktop LCD panels themselves typically don’t have memory, laptop panels do but that’s for power saving reasons (to avoid the power of transferring the data over a high speed link and to allow the source to power down.)
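To put rough numbers on the frame-buffer-vs-direct-drive difference (all figures here are illustrative assumptions: a 60Hz, 480-line signal and ~5ms liquid-crystal response, not measurements of any real monitor):

```python
FRAME_MS = 1000 / 60      # one full refresh at 60 Hz
LINE_MS = FRAME_MS / 480  # one scanline's worth of buffering
PIXEL_RESPONSE_MS = 5.0   # assumed liquid-crystal switching time

frame_buffered = FRAME_MS + PIXEL_RESPONSE_MS  # scaler stores a whole frame
direct_drive = LINE_MS + PIXEL_RESPONSE_MS     # only line-level buffering
print(f"frame-buffered ~{frame_buffered:.1f}ms, direct drive ~{direct_drive:.1f}ms")
```

The point is only the shape of the comparison: skipping the frame buffer removes almost a full refresh period, leaving the cell response time as the floor.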


> You can update an LCD in the middle of the screen just the way you can for CRT.

And in fact this is such an annoyance that there are multiple different ways people try to deal with the screen tearing, between VSync, G-Sync, and FreeSync.


That monitor scaler without DRAM was the prototype of the first G-Sync version. ;-)


Sub-6ms TVs are common, and many, many gaming monitors have 1-2, maybe 3ms of delay. If you only need 1080p, it doesn't even cost much.



