• 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 3 hours ago

    Motion blur and depth of field have almost no impact on performance. Same with anisotropic filtering, and I can't understand why AF isn't just defaulted to max, since even back in the golden age of gaming it had no real performance impact on any system.

    • sp3ctr4l@lemmy.zip · 58 minutes ago

      You either haven't been playing PC games very long, aren't that old, or have only ever played on fairly high-end hardware.

      Anisotropic filtering?

      Yes, that… hasn't been challenging for an affordable, average PC to run at 8x or 16x for about a decade. It causes barely any framerate drop-off now, and it never cost much unless you go all the way back to the mid '90s to early 2000s, when dedicated GPUs were fairly uncommon.
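
      To give a sense of why it's so cheap: enabling AF is a single texture parameter, and the extra per-sample work happens in dedicated filtering hardware. A minimal sketch using the standard OpenGL anisotropic filtering extension (assumes a valid GL context and a loaded texture; `texId` is a placeholder):

      ```cpp
      #include <GL/gl.h>
      #include <GL/glext.h>  // GL_TEXTURE_MAX_ANISOTROPY_EXT, GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT

      // Enable the maximum anisotropy the hardware supports on one texture.
      // Core in OpenGL 4.6; available almost everywhere via EXT_texture_filter_anisotropic.
      void enableMaxAnisotropy(GLuint texId) {
          GLfloat maxAniso = 1.0f;
          // Query the driver's ceiling (commonly 16.0 on anything remotely modern).
          glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);

          glBindTexture(GL_TEXTURE_2D, texId);
          // One parameter per texture; the filtering itself runs in fixed-function
          // sampler hardware, which is why the framerate cost is near zero today.
          glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, maxAniso);
      }
      ```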

      But that just isn’t true for motion blur and DoF, especially going back further than 10 years.

      Even right now, running CP77 on my Steam Deck, the AF level has basically no impact on my framerate, whereas motion blur and DoF do have a noticeable impact.

      Go back even further, and a whole lot of motion blur/DoF algorithms were very poorly implemented. Nowadays we mostly get the versions of those effects that aren't ruinously inefficient.
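
      To make the cost concrete: a typical screen-space motion blur averages several samples along each pixel's velocity vector, so the work scales with samples × resolution, and early implementations were often brute force about it. A minimal CPU-side sketch of the idea (the names and the grayscale `Image` are hypothetical; real versions run per pixel in a shader):

      ```cpp
      #include <cstddef>
      #include <vector>

      // Hypothetical framebuffer: row-major grayscale, for simplicity.
      struct Image {
          int w, h;
          std::vector<float> px;
          float at(int x, int y) const {
              // Clamp to edges so samples that land off-screen don't read garbage.
              x = x < 0 ? 0 : (x >= w ? w - 1 : x);
              y = y < 0 ? 0 : (y >= h ? h - 1 : y);
              return px[static_cast<std::size_t>(y) * w + x];
          }
      };

      // Naive screen-space motion blur: average N samples along the pixel's
      // velocity vector (vx, vy). Cost is O(samples) per pixel, which is why
      // old brute-force implementations with high sample counts hurt so much.
      float blurPixel(const Image& img, int x, int y,
                      float vx, float vy, int samples) {
          float sum = 0.0f;
          for (int i = 0; i < samples; ++i) {
              float t = static_cast<float>(i) / samples - 0.5f;  // -0.5 .. 0.5
              sum += img.at(x + static_cast<int>(vx * t),
                            y + static_cast<int>(vy * t));
          }
          return sum / samples;
      }
      ```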

      Try running something like Arma 2 on a mid- or low-range PC with motion blur on vs. off. You could get maybe 5 to 10 more fps with it off, and that's a big deal when you're maxing out at 30 to 40ish fps.

      (Of course now we also get ghosting and smearing from framegen algos that ironically somewhat resemble some forms of motion blur.)

      • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social · 1 hour ago

        I am 40 and have been gaming on PC my entire life.

        > Try running something like Arma 2 on a mid- or low-range PC with motion blur on vs. off. You could get maybe 5 to 10 more fps with it off, and that's a big deal when you're maxing out at 30 to 40ish fps.

        Arma is a horrible example, since it is so poorly optimized that you actually get a higher frame rate maxing everything out than running everything on low. lol

        • sp3ctr4l@lemmy.zip · 46 minutes ago

          If you're 40 and have been PC gaming your whole life, then I'm guessing you've had fairly high-end hardware and are just misremembering.

          Arma 2 is unoptimized in general… but that's largely because of how it handles its huge environments in engine: it basically uses a massive pagefile-like cache on your HDD, since the data is too much to jam through a 32-bit OS and its RAM limit.

          When SSDs came out, they turned out to be the main thing that boosts your FPS in older Arma games, because they have much, much faster read/write speeds.
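
          Easy to verify yourself: time a raw sequential read of a big file, which is the same access pattern a world-streaming engine leans on. A minimal sketch (the file path is a placeholder; expect very roughly ~100 MB/s from an HDD vs ~500 MB/s from a SATA SSD, with random reads widening the gap much further):

          ```cpp
          #include <chrono>
          #include <cstdio>
          #include <fstream>
          #include <vector>

          // Minimal sketch: measure sequential read throughput off the drive.
          // Caveat: re-running on the same file may be served from the OS page
          // cache, so use a file larger than RAM or a fresh one each run.
          int main() {
              const char* path = "big_test_file.bin";  // placeholder: any multi-GB file
              std::ifstream in(path, std::ios::binary);
              if (!in) { std::perror("open"); return 1; }

              std::vector<char> buf(1 << 20);  // stream in 1 MiB chunks
              std::size_t total = 0;
              auto start = std::chrono::steady_clock::now();
              while (in) {
                  in.read(buf.data(), buf.size());
                  total += static_cast<std::size_t>(in.gcount());
              }
              std::chrono::duration<double> secs =
                  std::chrono::steady_clock::now() - start;

              std::printf("%zu bytes in %.2f s (%.1f MB/s)\n",
                          total, secs.count(), total / 1e6 / secs.count());
          }
          ```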

          … But their motion blur is still unoptimized and performs very poorly.

          As for setting everything to high and getting higher FPS… that's largely a myth.

          There are a few postprocessing settings that work that way, and that's because in those instances the 'ultra' settings actually use different algorithms/methods that are both less expensive and visually superior.
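
          In code terms, a quality tier is often just a dispatch between entirely different code paths, and nothing forces the higher tier to be more expensive. A purely hypothetical sketch (names and costs invented for illustration):

          ```cpp
          #include <stdexcept>

          enum class BlurQuality { Low, High, Ultra };

          // Hypothetical per-frame costs, to show tiers aren't forced to be
          // monotonic: imagine High as a brute-force 16-sample blur and Ultra
          // as a smarter single-pass pass that reads fewer samples yet looks better.
          double blurCostMs(BlurQuality q) {
              switch (q) {
                  case BlurQuality::Low:   return 0.4;  // cheap 4-sample blur
                  case BlurQuality::High:  return 2.1;  // 16-sample brute force
                  case BlurQuality::Ultra: return 1.2;  // smarter, cheaper, nicer
              }
              throw std::logic_error("unreachable");
          }
          ```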

          It is still the case that if you set texture and model quality to low, and grass/tree/whatever draw distances very short, you'll get more frames than with those things maxed out.