• Liz@midwest.social · 10 days ago

        Sure, and in fact some developers used the fuzziness to their advantage, which can make certain games look weird when you display them on anything modern. But my point was more that some people in here are acting like every part of the CRT experience is better than flatscreens.

        • RememberTheApollo_@lemmy.world · 9 days ago

          They were good for their time, and they still offer some benefits, but those benefits are overshadowed by the advantages of modern gaming LCDs. If I had to pick, I wouldn't want one today.

    • r00ty@kbin.life · 9 days ago

      Are you sure it was CRT technology? Bear in mind that colour CRTs had to focus the beam accurately enough that it only hit the specific phosphor “pixel” for the colour being lit at that moment. What blur there was came from bad focus settings, age, and phosphor persistence (which is still a thing on LCDs to an extent).

      What DID cause blur was merging the picture, the colour and the synchronisation into a single composite signal. All the mainstream standards (PAL, SECAM and NTSC) caused a blurring effect. Games on 80s/90s consoles generally used this to their advantage, and you can see the dithering patterns clearly in emulators of systems from that period. Specifically, the colour signal sharing spectrum with the luminance signal softened the image in a way that looked like blur. Most consoles of the time only output an RF signal for a TV or, if you were lucky, composite.
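
      To make the dithering point concrete, here's a rough Python sketch (my own illustration, not a real PAL/NTSC model): a plain moving average stands in for the limited shared bandwidth of composite, and you can see how 1-pixel dithering that shows up as hard stripes over RGB gets pulled toward an intermediate shade once it's band-limited.

      ```python
      # Rough illustration only: composite video squeezes luminance and colour into
      # one band-limited signal, so fine horizontal detail is effectively low-pass
      # filtered. A simple moving average stands in for that bandwidth limit here.
      import numpy as np

      def bandlimit(scanline: np.ndarray, width: int = 3) -> np.ndarray:
          """Crude stand-in for composite bandwidth limiting: a moving average."""
          kernel = np.ones(width) / width
          return np.convolve(scanline, kernel, mode="same")

      # One scanline of 1-pixel dithering alternating black (0) and white (255).
      dithered = np.tile([0.0, 255.0], 16)

      sharp_rgb = dithered                  # what a crisp RGB monitor shows: hard stripes
      soft_composite = bandlimit(dithered)  # roughly what came out of the TV

      print(sharp_rgb[:8])        # 0/255 swings every pixel
      print(soft_composite[:8])   # mostly 85s and 170s: the stripes blend toward grey
      ```

      On real hardware the amount of blending also depended on the colour subcarrier and the particular console's video output, so treat this as the gist rather than an accurate simulation.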

      Good computer monitors (not TVs) of the time were extremely crisp when fed a suitable RGB signal.