• RussianEngineer [she/her]@hexbear.net · 2 days ago

    DLSS CANNOT RENDER IN BLENDER

    DLSS CANNOT RENDER MY UNITY EDITOR VIEWPORT

    DLSS CANNOT ACCELERATE MY DRAWING IN SUBSTANCE PAINTER

    DLSS CANNOT RENDER MY CAD VIEWPORT

    FUCK OFF WITH YOUR GAMER AI NONSENSE SOME OF US ACTUALLY NEED THAT “BRUTE FORCE RENDERING” FOR SCIENTIFIC OR ENGINEERING OR ARTISTIC APPLICATIONS!!! FUCK!!!

    • PaX [comrade/them, they/them]@hexbear.net · 2 days ago

      Approximations (AI sludge) of approximations (rasterization + lighting) of approximations (computer geometry) lmao

      Actually bourgeois mindset, the further abstracted you can become away from real work the better

      Wtf is going on? Are capitalists just that far past paying programmers to write fast algorithms instead of copy-and-pasting stock Unreal-Unity render pipelines or have we actually hit some kind of technological limit of scene complexity that they are trying to resolve?

      No one wants to do geometry anymore smh

      • RussianEngineer [she/her]@hexbear.net · 2 days ago

        i will not stop being mad about the “AI”-powered slop enhancer™ that nvidia keeps pushing harder and harder onto everything, using it as an excuse to gimp real GPU performance when it’s literally useless to everyone but gamers, leaving those of us who use 3d acceleration for non-gaming tasks with sub-par hardware

      • Kumikommunism [they/them]@hexbear.net · 2 days ago

        There are many legitimate reasons to be mad about reliance on temporal anti-aliasing and DLSS, even without leaving the realm of video games. Games are being made worse because of it.

      • imogen_underscore [it/its, she/her]@hexbear.net · 2 days ago

        not really, useful and cool technology becoming prohibitively expensive partly due to bazinga AI bullshit with niche use cases being tacked on is quite annoying. while there may be real performance arguments, a lot of people don’t like how DLSS/upscaling shit makes modern games look, so it’s arguable whether it’s even an upgrade. and ultimately it’s a niche use case for rich gamers that’s being touted as the crux of the whole product.

    • isolatedscotch · 2 days ago

      they probably did the thing journalists reporting on chatgpt do, where they use ai (in this case the new gpu’s ai capabilities) to write the article and then say “ChatGPT/the new gpu wrote that”

  • makotech222 [he/him]@hexbear.net · 2 days ago

    i’m overall pretty disappointed by the 5090 announcement. Just more AI shit blurring up the screen to fake high fps. In some games it’s okay and acceptable, but it fundamentally ruins others without extensive modifications or outright disabling DLSS

    • RION [she/her]@hexbear.net · 2 days ago

      What games does DLSS fundamentally ruin? The only issues I’ve had with it are in games using older implementations (ex. poor denoising in Control) which is about to be fixed for all RTX cards

      • makotech222 [he/him]@hexbear.net · 2 days ago

        Dead Space remake is really fucked up with DLSS: super blurry, and it also has a texture rendering bug when DLSS is on. If you use DLSSTweaks, you can force it to use DLAA instead, which fixes most of it, though.
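        For anyone trying this: DLSSTweaks is configured through an INI file dropped next to the game’s executable. The section/key below matches DLSSTweaks’ sample config as I recall it, but verify against the sample INI shipped with your version:

        ```ini
        ; dlsstweaks.ini, placed in the same folder as the game's executable
        [DLSS]
        ; render at full native resolution and use DLSS only for anti-aliasing
        ; (i.e. DLAA) instead of upscaling from a lower internal resolution
        ForceDLAA = true
        ```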

  • peppersky [he/him, any]@hexbear.net · 2 days ago

    anyone who would even consider buying one of these cards is too damn rich for their own good. like what games would you even play on these? if you thought about buying one of these, how about giving me some of that mindless spending money instead, i’ll spend it on other useless shit like shelter and food

    • ButtBidet [he/him]@hexbear.net · 2 days ago

      I have such a mid tier, $130ish graphics card and it’s fucking fine for everything I play. When (rarely) FPS is low, I play on low settings and I couldn’t care less. I enjoy the game, not the number of polygons.

      I sometimes hate watch the YouTube videos for $1000+ video cards.

      • filcuk@lemmy.zip · 2 days ago

        I said that until I upgraded and tried my first ray-traced game (Control). I was flabbergasted.
        I still remember playing games on DOS, so I feel like I’ve gone through all the significant milestones in game visual quality, and for me, this is one of them.
        That doesn’t mean it should be in every game, or that it makes a game automatically good.

  • invalidusernamelol [he/him]@hexbear.net · 2 days ago

    I can kinda see their point, in that every 2nd and 3rd frame will be an estimate from DLSS instead of actually pushing the vertices through the render pipeline. But the phrase “brute force” definitely does something funny, painting their major selling point as brutish in order to sell a minor feature.
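    To make the arithmetic behind that concrete, a toy sketch (the rendered-to-generated ratio is an assumption for illustration, not NVIDIA's exact pipeline):

    ```python
    def effective_fps(rendered_fps: float, generated_per_rendered: int) -> float:
        """Displayed frame rate when each fully rendered frame is followed
        by `generated_per_rendered` AI-estimated frames."""
        return rendered_fps * (1 + generated_per_rendered)

    # e.g. 30 truly rendered fps with 3 generated frames per rendered one
    # is displayed as 120 fps, but input latency still tracks the 30
    # rendered frames, since only those reflect actual game state.
    print(effective_fps(30, 3))
    ```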