cross-posted from: https://lemmy.zip/post/26533086

Linux kernel 6.12 is one of the most significant releases of the year, delivering a feature nearly 20 years in the making: true real-time computing.

  • thingsiplay@beehaw.org · 1 month ago

    As I understand it, most kernel operations can’t be interrupted (i.e., they’re non-preemptible). But PREEMPT_RT allows high-priority tasks to interrupt lower-priority ones near-instantly. For specific types of tasks this improves response times and thus performance.
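
    As far as I know, a program opts into this by asking the scheduler for a real-time policy. Here is a rough sketch of what that looks like; SCHED_FIFO is a real policy, but the priority value 80 is just an example, and this needs root or CAP_SYS_NICE:

        /* Sketch: ask the kernel to run this process under a real-time
         * scheduling policy. On a PREEMPT_RT kernel such a task can
         * preempt almost anything running at a lower priority.
         * The priority 80 is an arbitrary example value. */
        #include <sched.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/mman.h>

        int main(void)
        {
            struct sched_param sp;
            memset(&sp, 0, sizeof(sp));
            sp.sched_priority = 80;              /* 1..99 for SCHED_FIFO */

            if (sched_setscheduler(0, SCHED_FIFO, &sp) == -1) {
                perror("sched_setscheduler");
                return 1;
            }

            /* Lock memory so page faults can't add unpredictable latency. */
            if (mlockall(MCL_CURRENT | MCL_FUTURE) == -1)
                perror("mlockall");

            puts("running with SCHED_FIFO priority 80");
            return 0;
        }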

    I never looked into the details of the realtime kernel. I know it is or was used for professional realtime audio mixing, recording, and such. Besides that, if this improves response times, would gaming benefit from it? What are the downsides of using a realtime kernel for gaming?

    • CameronDev@programming.dev · 1 month ago

      Realtime doesn’t necessarily mean low latency, it means consistent latency.

      So if the latency from an input is 1s, that is realtime, as long as it's always 1s.

      Typically for gaming you want the lowest latency possible, and at least historically, that meant not realtime.

      Edit: Some examples with made up numbers:

      Airbag: you want an airbag to go off EVERY time, and if that means it takes 10ms, that's usually OK. RT guarantees that your airbag will go off 10ms after a crash, every time.

      Games: you want your inputs handled ASAP, ideally <5ms, but if one or two happen after 100ms, you’ll likely not notice. If you enable RT, maybe all your inputs get handled after 10ms consistently, which ends up feeling sluggish.

      Unless you know you need RT, you probably don't actually want it.
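
      One way to actually see the distinction is to measure wakeup jitter the way the cyclictest tool does: ask for a periodic wakeup and record how late each one arrives. On a stock kernel the worst case tends to drift upward under load; on an RT kernel it should stay bounded. A minimal sketch, where the 1ms interval and loop count are made-up values:

          /* Minimal cyclictest-style jitter probe: sleep until an absolute
           * deadline every 1 ms and record the worst wakeup latency.
           * Interval and loop count are arbitrary illustration values. */
          #include <stdio.h>
          #include <stdint.h>
          #include <time.h>

          #define INTERVAL_NS 1000000L   /* 1 ms period */
          #define LOOPS       1000

          static int64_t ns(const struct timespec *t)
          {
              return (int64_t)t->tv_sec * 1000000000LL + t->tv_nsec;
          }

          int main(void)
          {
              struct timespec next, now;
              int64_t worst = 0;

              clock_gettime(CLOCK_MONOTONIC, &next);
              for (int i = 0; i < LOOPS; i++) {
                  next.tv_nsec += INTERVAL_NS;
                  if (next.tv_nsec >= 1000000000L) {
                      next.tv_nsec -= 1000000000L;
                      next.tv_sec++;
                  }
                  /* Sleep until the absolute deadline, then see how late we woke. */
                  clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
                  clock_gettime(CLOCK_MONOTONIC, &now);

                  int64_t late = ns(&now) - ns(&next);
                  if (late > worst)
                      worst = late;
              }
              printf("worst wakeup latency: %lld us\n", (long long)(worst / 1000));
              return 0;
          }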

      • thingsiplay@beehaw.org · 1 month ago

        Games: you want your inputs handled ASAP, ideally <5ms, but if one or two happen after 100ms, you’ll likely not notice. If you enable RT, maybe all your inputs get handled after 10ms consistently, which ends up feeling sluggish.

        Actually I think it's the other way around for gaming: if you have consistent input delay, it will not feel sluggish. It's the same reason a consistent 30 fps feels better than fps varying between 31 and 39. Likewise in gaming, especially if you play speedruns or 1v1 fighting games, you want consistent delay. However, if that adds too much delay it's probably counterproductive. But for single-player games, a consistent delay is the opposite of sluggish.

        • CameronDev@programming.dev · 1 month ago

          At low numbers it doesn't matter. If you exaggerate the numbers the effect is clearer.

          E.g. if the latency were 100ms, it would feel like your movements are behind by 100ms, which would be unplayable.

          But if you had a typical latency of 10ms with rare spikes to 1s, the spikes would be perceived as lag and be annoying, but most of the time it's good and playable.

      • Realtime doesn’t necessarily mean low latency, it means consistent latency.

        This is such a critical distinction, and it can be counter-intuitive. In this case, the game may run slower; it just won't get lag spikes caused by local resource contention. And even that statement has caveats.

        One of the biggest differences between self-taught developers and ones with CS degrees is that the ones with degrees usually understand a lot of important theory, such as the fact that O(1) means constant time, not necessarily fast time.
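
        A contrived sketch of that point: a lookup that is technically O(1) can still lose to an O(n) scan when n is small, because "constant" only means the cost doesn't grow with input size. The 10,000-iteration loop below just stands in for a fixed but expensive cost:

            #include <stdio.h>

            /* O(1): the same fixed amount of work no matter how many keys exist.
             * The 10,000 iterations stand in for a fixed but expensive cost. */
            static int lookup_constant_time(int key)
            {
                volatile unsigned int h = (unsigned int)key;
                for (int i = 0; i < 10000; i++)
                    h = h * 31u + (unsigned int)i;
                return (int)h;
            }

            /* O(n): cost grows with n, but for small n it's only a few steps. */
            static int lookup_linear(const int *keys, int n, int key)
            {
                for (int i = 0; i < n; i++)
                    if (keys[i] == key)
                        return i;
                return -1;
            }

            int main(void)
            {
                int keys[8] = {4, 8, 15, 16, 23, 42, 7, 99};

                /* With 8 keys the O(n) scan does at most 8 comparisons, while
                 * the "constant-time" version always burns 10,000 iterations. */
                printf("linear:   index %d\n", lookup_linear(keys, 8, 42));
                printf("constant: hash  %d\n", lookup_constant_time(42));
                return 0;
            }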

        • CameronDev@programming.dev · 1 month ago

          It doesn't help that it's not well named; "realtime" makes it sound fast.

          One of the few things I remembered from my degree was the realtime programming course, because we got to program a model train set in Ada, on a 286(?), running on floppies. This was in ~2015, so ancient hardware even then, and it was slow, but it was “realtime”.

          Interestingly, my compsci degree never covered O notation, so that's something I've had to pick up along the way :/

          • Interestingly, my compsci degree never covered O notation, so that's something I've had to pick up along the way :/

            Really‽ That's a shame. It's one of the topics that, in my programming career, has been regularly valuable and used. That, set theory, and discrete math have been broadly applicable even in the most banal applications. It's unfortunate if it's not part of the CIS curriculum at some universities.