My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

  • all-knight-party@kbin.cafe
    link
    fedilink
    arrow-up
    50
    arrow-down
    1
    ·
    10 months ago

    Because games are an interactive medium. In an action game, you’re basically reading visual information on screen, making a judgment, and responding to it by performing input.

    The more frames that happen per second, the more information you’re able to receive in the same amount of time, which is why frame rate matters most in driving games, fighting games, or twitch shooters. Things happen very fast in those games, so having fewer frames per second puts you at a small but very real disadvantage.

    The visual info on screen also represents your inputs since you control it. In an action game, higher FPS means you see your character responding to your inputs more quickly, which feels perceptibly better.

    You can get used to 30 FPS just fine, but certain games, mostly action ones, are simply better with higher FPS, whether or not you play competitively or even care about that sort of thing. Believe it or not, even going from 60 to 120 is still a noticeable change.

    • lemillionsocks@beehaw.org
      link
      fedilink
      arrow-up
      5
      ·
      10 months ago

      Yeah, movies and TV have a director keeping the things that are meant to be in focus in focus, and the lens blur can create an aesthetic and emulate an eye’s field of vision. Even then, if there’s a lot of camera movement, it sometimes isn’t great.

      With games you’re in control of the camera, and you’re taking in visual data from all over the screen in a way that movies and TV shows don’t demand.

  • mythosync@lemm.ee
    link
    fedilink
    English
    arrow-up
    25
    arrow-down
    1
    ·
    10 months ago

    One part is that a higher frame rate means less latency. At 30fps, a new frame arrives roughly every 33ms; at 165fps, that gap drops to roughly 6ms.
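
    Back-of-the-envelope (a minimal sketch; it only counts the wait for the next frame and ignores input, game logic, and display latency):

    ```python
    # Worst-case wait for new visual information at a given frame rate.
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    for fps in (30, 60, 120, 165):
        print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms between frames")
    # 30 fps -> 33.3 ms, 165 fps -> ~6.1 ms
    ```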

    The other simple answer is just the advantage of more visual data over time.

    TLDR: I like it when the game’s motion is smooth

  • Vince@feddit.de
    link
    fedilink
    arrow-up
    21
    ·
    10 months ago

    A simple website to show if it matters to you is this one (ideally check it on a screen with more than 60Hz):

    https://www.testufo.com/

    Everyone’s perception is different; I’ve met someone who couldn’t tell the UFOs apart past 30 fps. They also didn’t like shooters/action games much, probably because following fast movement was difficult for them.

    But I think the vast majority of people easily notice the difference between 30 and 60, and noticing 60 versus 120 should also be possible for most. As for me, I used to play a lot of Quake 3, and back then that was played on 120 or even 240Hz CRTs. The first flat screens, with their slow response times and 60Hz max, were quite a step down.

    While I don’t really like Linus Tech Tips, they did some nice testing on the topic and came to the conclusion that more fps is measurably better for shooters, even if the monitor’s refresh rate can’t keep up.

    • magnetosphere @beehaw.orgOP
      link
      fedilink
      English
      arrow-up
      6
      ·
      10 months ago

      I’ve gotten a lot of helpful answers, but yours was the only one that included a visual aid! Thanks!

      What’s interesting is that when I focused on the UFOs, I didn’t notice a difference between the 30 fps and the 60 fps stars. When I let my eyes go out of focus, though, I was able to see a clear difference between them.

      • moody@lemmings.world
        link
        fedilink
        arrow-up
        5
        ·
        10 months ago

        That’s because the middle part of your vision sees detail much better but is less sensitive to movement, while the outer part of your vision sees motion much better, so it notices the stutter more easily.

        It’s also why low refresh rates are more noticeable on larger screens.

        • Max-P@lemmy.max-p.me
          link
          fedilink
          arrow-up
          3
          ·
          10 months ago

          We’re better at seeing detail in the center, and yet each eye technically has a big blind spot not far from it, where the optic nerve exits. The brain just fills in the gaps.

      • PupBiru@kbin.social
        link
        fedilink
        arrow-up
        3
        ·
        10 months ago

        afaik the edges of your vision are better at picking up movement too (for “seeing out of the corner of your eye” kinda things), so it’s possible that while you’re trying to make out specific things by looking directly at them, you’re missing the parts of your eye that can make out the higher FPS?

        just a guess though

  • Max-P@lemmy.max-p.me
    link
    fedilink
    arrow-up
    18
    ·
    10 months ago

    Lots of good answers already, but I’d also add: if you have the opportunity to go to a computer store like a Micro Center, or anywhere with gaming monitors on demo, try one out for a few minutes. Run a first-person game if you can (there are plenty of basic WebGL demos on the internet), or run testufo in a browser.

    It’s really hard to imagine the smoothness without experiencing it, which is why a lot of people say that once they’ve experienced it, they can’t unsee it.

    24 fps is all you need to create the illusion of motion, and the brain fills in the gaps, but when you control the motion, especially with a high-precision mouse, it really breaks the illusion. Your brain can’t fill the gaps anymore, because the motion can go anywhere at any time, extremely fast. Even just dragging windows around on the desktop, you can feel the difference; I instantly know when my 144Hz monitor isn’t running at 144. It also becomes a matter of responsiveness, as others have said.

    High refresh rates also matter more at higher resolutions and bigger screens: at 30fps an object might need to travel 100px per frame, while at 240fps that same object moves 12.5px eight times in the same interval. It’s probably fine when you’re watching a TV from somewhat far away, but when the monitor is 32 inches and 3 feet in front of you, you notice the jumps a lot more.
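
    A quick sketch of that per-frame jump (the 3000 px/s speed is just a made-up number chosen to give 100 px per frame at 30 fps):

    ```python
    # How far an object jumps between consecutive frames at a constant on-screen speed.
    speed_px_per_s = 3000  # hypothetical speed: 100 px per frame at 30 fps

    for fps in (30, 60, 120, 240):
        step_px = speed_px_per_s / fps
        print(f"{fps:>3} fps -> {step_px:5.1f} px per frame")
    # 30 fps -> 100.0 px, 240 fps -> 12.5 px
    ```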

    • Kale@lemmy.zip
      link
      fedilink
      English
      arrow-up
      3
      ·
      edit-2
      10 months ago

      A decade ago I had a little extra money and chose to buy a 144Hz gaming monitor and a video card to match. I don’t have great eyesight, nor do I play games that require twitch reflexes, but even then, 144Hz (with the game configured to run above 100 fps) was very noticeable. I’d much rather play at 1080p above 100 fps than at 4K at 60 fps or below.

      This may differ between people. I don’t believe I have great eyesight, depth perception, color perception, etc., but I am really sensitive to motion. I built my second computer (an AMD Athlon 64, I think?) and spent a significant sum on a CRT that had higher refresh rates. I can’t use a CRT at 60Hz: I perceive the flicker and get a headache after about 20 minutes. I couldn’t use Linux on that computer (I was stuck at 60Hz on that kernel/video driver) until I saved up even more to buy an LCD monitor. I can’t perceive 60Hz flicker on an LCD, and 60Hz is fine for work.

      But for gaming, a high refresh rate is noticeable, even for someone like me who normally doesn’t notice visual stuff.

  • intensely_human@lemm.ee
    link
    fedilink
    arrow-up
    16
    ·
    10 months ago

    People look at games more intently because they’re actively engaged with a world through the screen. A movie is just being watched, not interacted with, so the brain naturally doesn’t need, or look for, as much fine detail.

    Like consider how closely you would look at a flower if you’re just looking at it, versus how closely you would look at it if you were trying to trim one of its petals into a little smiley face with tiny scissors. You’re going to notice more of the flower when you’re trying to engage with it. Same thing for video games.

    • nevernevermore@kbin.social
      link
      fedilink
      arrow-up
      7
      ·
      10 months ago

      how closely you would look at it if you were trying to trim one of its petals into a little smiley face with tiny scissors

      I love this example because it sounds very specific, so I’m imagining you regularly making little smiley faces on flowers.

  • mattreb@feddit.it
    link
    fedilink
    arrow-up
    13
    ·
    edit-2
    10 months ago

    TV and movies don’t look choppy because the camera’s shutter speed smooths out movement with motion blur. Motion blur in games is instead just simulated, and not as effective.

    Also, as someone else has said, a game is interactive, and input latency can be as high as 3 frames, which at 30fps would be about 1/10 of a second and can be perceived…
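
    To put numbers on that (a rough sketch, assuming the whole input-to-display pipeline is a fixed number of frames long):

    ```python
    # Total delay when input takes a whole number of frames to reach the screen.
    def pipeline_latency_ms(fps: float, frames_of_latency: int = 3) -> float:
        return frames_of_latency * 1000.0 / fps

    print(pipeline_latency_ms(30))   # 100.0 ms, i.e. about 1/10 of a second
    print(pipeline_latency_ms(120))  # 25.0 ms
    ```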

  • mackwinston@feddit.uk
    link
    fedilink
    arrow-up
    13
    ·
    edit-2
    10 months ago

    I think 30fps (25fps in PAL-land) became the standard because televisions were 30fps (NTSC) or 25fps (PAL) due to interlacing. While the screen redraw on an NTSC television happens 60 times per second, it’s done as two fields, so you only get 30 actual frames per second. This was done so you could have a decent resolution (525 lines for NTSC or 625 lines for PAL) while keeping the RF bandwidth of the TV signal reasonable: a single frame is sent as two fields, half of the picture in each field on alternate scanlines.

    So there’s a lot of industry inertia, and 30 fps (or 25 fps where PAL was formerly the standard) ended up as the default. For video it’s good enough (although 60fps/50fps is still better; until fairly recently that would have required too much bandwidth, so sticking with the old NTSC or PAL frame rates made sense).

    But for computers, almost no one used interlaced displays, because they’re awful for the kind of things computers usually show: the flicker is terrible with a static image in an interlaced screen mode. There were some interlaced modes, but nearly everyone tried to avoid them; the resolution increase wasn’t worth the god-awful flicker. So you always had 60Hz progressive scan on the old computer CRTs (or, in the case of original VGA, IIRC it was 70Hz). To avoid tearing, animated content on a PC would use the vsync to stay synchronized with the CRT, which is easiest to do at the exact frequency of the CRT and provided very smooth animation, especially in fast-moving scenes. Even the old 8-bit systems ran at 60 (NTSC) or 50 (PAL) fps (although 1980s 8-bit systems were generally not doing full-screen animation; usually just parts of the screen were animated).

    So a game should always be able to hit at least 60 frames per second. If the computer or GPU is not powerful enough and the frame rate falls below 60 fps, the game can no longer keep up with the monitor’s refresh, and you get judder (with vsync) or tearing (without it).
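
    A rough sketch of why falling just short hurts so much, assuming plain double-buffered vsync (a frame that misses one refresh has to wait for the next):

    ```python
    import math

    REFRESH_HZ = 60
    REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

    def presented_frame_time_ms(render_time_ms: float) -> float:
        """With vsync, a finished frame is only shown at the next screen refresh."""
        refreshes_waited = max(1, math.ceil(render_time_ms / REFRESH_MS))
        return refreshes_waited * REFRESH_MS

    print(presented_frame_time_ms(15.0))  # 16.7 ms -> still a clean 60 fps
    print(presented_frame_time_ms(18.0))  # 33.3 ms -> effectively drops to 30 fps
    ```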

    Virtual reality often demands more (I think the original Oculus Rift requires 90 fps) and has various tricks to ensure video is always presented at 90 fps: if the game can’t keep up, frames get interpolated (see “asynchronous spacewarp”). But if you can’t hit the native frame rate in VR, it’s generally awful having to rely on spacewarp, which inevitably distorts some of the graphics and adds some pretty ugly artifacts.

  • GregorGizeh@lemmy.zip
    link
    fedilink
    arrow-up
    6
    ·
    edit-2
    10 months ago

    It’s mostly noticeable when you play fast-paced games, where you might, for example, turn your view 180 degrees in a fraction of a second to look behind you, or do a lot of rapid camera movement to keep a lookout while being chased or moving into hostile territory. With low fps, the screen turns into a blurry mess during those moments, preventing you from keeping your eyes on important visual information and forcing you to get your bearings again each time you stop to look at something. The higher the fps, the snappier and crisper these situations are, and the easier it is to keep your eyes on something during movement or to execute a precise maneuver.

    Granted, for slower-paced games a lower fps may be just fine, but anything hectic will be less enjoyable.

  • flashgnash@lemm.ee
    link
    fedilink
    arrow-up
    5
    ·
    10 months ago

    Try gaming at 30Hz, then 60Hz, then 120+ if you can, and you will notice a dramatic difference

    In a film or TV show, response time doesn’t matter and you don’t really need to see things that only pop up for a split second. In a game, a higher refresh rate can mean the difference between noticing an enemy and not

  • MrFunnyMoustache@lemmy.ml
    link
    fedilink
    arrow-up
    5
    ·
    edit-2
    10 months ago

    Since you are familiar with video production, I don’t need to explain the basics of how cameras work, so I’ll jump right into the relevant part: shutter speed. Shutter speed is responsible for motion blur, which makes video appear smooth. In video games, the equivalent shutter speed is essentially zero, meaning there is no motion blur. You can replicate the effect by shooting a 30 fps video with a much faster shutter speed: any movement will look choppy, especially fast-moving objects.
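
    A small sketch of the difference, assuming a film-style 180-degree shutter (exposure of half the frame interval) versus a near-instant exposure, with a made-up object speed:

    ```python
    # Length of the motion-blur trail an object leaves during a single exposure.
    def blur_length_px(speed_px_per_s: float, fps: float, shutter_fraction: float) -> float:
        exposure_s = shutter_fraction / fps  # fraction of the frame interval the shutter is open
        return speed_px_per_s * exposure_s

    print(blur_length_px(3000, 30, 0.5))    # 50.0 px smear: reads as smooth motion in video
    print(blur_length_px(3000, 30, 0.001))  # 0.1 px: a crisp snapshot, like a game frame
    ```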

    Games have the option to add motion blur, but it increases delay, making the games feel even more sluggish.

    In addition to the feel, playing games is an interactive activity, so you need to react quickly to what’s going on. A higher frame rate reduces that delay, letting you actually play properly. This is especially important in competitive games, or any game that requires quick reactions or precise timing. Personally, 60 is barely enough; for a good experience I want games running at at least 100 FPS with variable refresh rate technology, or 120 FPS without.

  • jsdz@lemmy.ml
    link
    fedilink
    arrow-up
    5
    arrow-down
    2
    ·
    edit-2
    10 months ago

    It comes directly from television. Early home PCs used televisions for displays, and by the 1980s TVs were generally capable of 60 fps (or 50 for regions that used PAL) so that’s what the computers generated. Everyone got used to it. And of course like everyone else said you don’t want to be adding more latency in games by not keeping up with that basic standard.

    • SwingingTheLamp@midwest.social
      link
      fedilink
      English
      arrow-up
      4
      ·
      10 months ago

      Technically, NTSC video does 60 fields per second (PAL does 50), because the video signal is interlaced. That is, the beam sweeps from top to bottom of the screen 60 times per second, but it only draws half of the horizontal scan lines per sweep, alternating between the odd-line field and the even-line field. That’s why we considered “full motion video” to be 30 frames per second back in the day. The alternating fields did make movement appear smoother, but the clarity wasn’t great.

      VGA originally doubled NTSC’s 15.75kHz horizontal clock rate to 31.5kHz, so the beam was fast enough to draw all of the lines in one vertical sweep, giving 60 full frames per second at a 60Hz refresh rate. Prior to that, a lot of games were just 30fps, because interlaced video tended to flicker on bitmapped graphics.
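
      The arithmetic lines up neatly (a quick sketch using the nominal numbers above):

      ```python
      # Vertical refresh rate = horizontal line rate / lines drawn per vertical sweep.
      def refresh_hz(line_rate_hz: float, lines_per_sweep: float) -> float:
          return line_rate_hz / lines_per_sweep

      print(refresh_hz(15_750, 262.5))  # NTSC: ~60 fields/s -> 30 full interlaced frames/s
      print(refresh_hz(31_500, 525))    # VGA: ~60 full progressive frames/s
      ```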

      • jsdz@lemmy.ml
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        10 months ago

        VGA might’ve done that to get better resolution at 60 Hz, but I’m pretty sure earlier systems including CGA and the Amiga did 60 fps non-interlaced video at lower resolutions. At least the Amiga also had a higher-resolution interlaced video mode, but it was mostly used for displaying impressive-looking static images.

  • amio@kbin.social
    link
    fedilink
    arrow-up
    3
    ·
    10 months ago

    One thing is looks; the other is that input and processing may be tied to the FPS cycle. This means a game running at e.g. 20fps can add almost a full frame’s worth of delay to any of your input, and game logic may also only advance between frames. At 20fps that’s up to 50ms; at 60 it’s less than 17ms.
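
    A toy fixed-step loop to illustrate (poll_input, update and render are hypothetical placeholders, not any real engine’s API):

    ```python
    import time

    FPS = 60
    FRAME_TIME = 1.0 / FPS

    def poll_input():
        return []          # a real game would drain the OS event queue here

    def update(events, dt):
        pass               # game logic only advances here, once per frame

    def render():
        pass

    while True:
        start = time.perf_counter()
        events = poll_input()      # input is only seen at this point in the cycle...
        update(events, FRAME_TIME)
        render()
        # ...so a press that lands just after poll_input() waits up to a full frame
        # before the game reacts: ~50 ms at 20 fps, under 17 ms at 60 fps.
        time.sleep(max(0.0, FRAME_TIME - (time.perf_counter() - start)))
    ```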

  • peto@lemm.ee
    link
    fedilink
    English
    arrow-up
    3
    ·
    10 months ago

    You can achieve the appearance of motion at lower frame rates, but when things are moving quickly the distance they move between frames can get quite high. You can cover this up with motion blur effects but it’s just not quite the same, and not so easy a process as when you are capturing real video.

    Think of the difference between when a camera does a fast pan and when you turn your head quickly. I think if you really focus in on what you are seeing you will be able to notice some difference.

    There is also a bit of that audiophile effect where lo-fi is good enough until you have got used to something better.

  • ITPaw
    link
    fedilink
    arrow-up
    3
    ·
    10 months ago

    It looks more fluid and pleasing to the eye