• 30_to_50_Feral_PAWGs [she/her]@hexbear.net · 27 points · 9 days ago

    What fucking brain genius thought a game engine would be a good replacement for Maya in anything but a blocking/proof of concept usage scenario?! UE5 on top-of-the-line hardware looks all right, but it’s not “video production” quality by a long shot.

  • MrGabr@ttrpg.network · 22 points · 9 days ago

    To everyone saying it’s a slip backwards for games, too, it’s more complicated than that. It’s absolutely possible to make a game that runs at more than 90 fps in UE5; I’ve done it in VR. The engine just makes it super easy to be lazy, and when you combine that with modern AAA “optimization is for suckers” game dev philosophy, that’s where you get performance like Borderlands 4.

    I think people only notice UE5 games running badly, and don’t realize when it’s fine. Clair Obscur was in UE5 and I never dropped below 60fps on max settings except in one area. Avowed was in UE5, probably a really early version like 5.2 or 5.3, based on when it released (the latest it could’ve been is 5.5, but it’s bad practice to switch major engine versions too far into development, so I’d doubt they updated even to 5.4). Avowed had bugs for sure, but not performance issues inherent to the engine.

    I think blaming UE5 lets lazy development practices off easy. I’ll take it over Unity for sure (I’ve experienced Unity fail at basic vector math, let alone that no one should ever trust them again after that per-install fee stunt). We should be maintaining that same frustration at developers for not optimizing. Lumen was not ready when it came out, and Nanite requires a minimum hardware spec that’s still absurd, but it’s literally two switches to flip in project settings to turn those off. UE5 is really an incredible piece of technology and it has made, and continues to make, game making accessible on a scale comparable to when Unity added a free license. AAA developers get off easy when you blame the engine instead of their garbage code.
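    For anyone curious what "two switches" looks like in practice, this is roughly the `DefaultEngine.ini` result of unchecking Lumen and Nanite in Project Settings → Rendering. A sketch only: the cvar names below are from recent UE 5.x releases and should be verified against your engine version.

    ```ini
    [/Script/Engine.RendererSettings]
    ; Turn off Lumen by selecting "None" for dynamic GI and reflections (0 = None)
    r.DynamicGlobalIlluminationMethod=0
    r.ReflectionMethod=0
    ; Disable Nanite project-wide
    r.Nanite.ProjectEnabled=False
    ```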

    ~Godot is a beautiful perfect angel that needs a new 3D physics engine~

    • gaycomputeruser [she/her]@hexbear.net · 9 points · 9 days ago

      The problem isn’t just the performance; UE5 also doesn’t look very good, especially given the amount of hardware it needs. The biggest problems in my mind are the blurriness of the image (apparently due to heavy reliance on temporal techniques) and the UE5 lighting, which gives games a very distinct and unrealistic look compared to other engines. Further, the vast majority of skin rendered in UE looks terrible.

      • MrGabr@ttrpg.network · 7 points · 8 days ago

        That’s fair, and you really see that on games like Norse where they don’t have the resources to make custom material and post-processing shaders, but they still want it to look like AAA photorealism (a bad strategy to begin with but that’s their problem). Out of the box, though, UE5 still looks leagues better than anything else that isn’t proprietary, and I’d argue that if you do have the time/staff to dedicate an entire team to technical art, the ceiling of how good UE5 can look, if you’re going for photorealism, is higher than it is for Unity and Godot as well.

        To the original context of the post, that ceiling is still way lower than what should be acceptable quality for big-budget movie CGI, but regarding games, I’m gonna stick to my original point and say that’s still an issue on the developers’ part for not putting in the effort to make it look good. Even accounting for optimization and visual tweaking, they’re still saving enormous amounts of time and money by using UE5 instead of their own engine, and that effort should be expected, the lack thereof not excused.

        • gaycomputeruser [she/her]@hexbear.net · 5 points · 8 days ago

          That’s a very fair point on the graphical quality! From my POV, it seems like Epic needs to make it easier for developers to optimize their projects, given the number of games that clearly haven’t had much of that work done. I’m sure that’s easier said than done, though.

          It really is strange to me that UE5 is being used for film given there are other raster rendering engines designed for better image quality. I’m assuming part of the benefit the studios are looking for is the speed increase from not having to prerender scenes on large server farms, and the flexibility they get from systems like Disney’s “the volume” system.

          • MrGabr@ttrpg.network · 2 points · 8 days ago

            AFAIK, the speed increase to allow technology like the volume is the whole pitch. Not every studio has an entire volume, so lower-budget filmmakers can set up a system with a green screen where the cinematographer can see the CGI environment in real-time through the camera, and with the asset store integration, indie filmmakers can have an insane set/backdrop for a tiny fraction of the normal price.

            Now that I think of it, though, I think Mr. Verbinski here is placing undue blame on UE5 when Marvel’s CGI has been getting worse and worse because they throw an army of slaves at the footage after the fact, rather than paying artists and working with them to set up shots to make the CGI as easy as possible, like he did.

    • JakenVeina@midwest.social · 4 points · 8 days ago

      My example would be Satisfactory. That game ran GREAT for years on my freakin’ 10-year-old 1070. It was only in 2025 that I started having some minor framerate issues in areas with a whole lot of cosmetics and machinery (which is inevitable in the factory/automation genre, where the game really can’t control how much players will ask it to render). And then I had the SAME kinds of issues after upgrading to a 3080, until I switched to Linux, so really the 1070 might never have been the issue anyway.

  • Awoo [she/her]@hexbear.net · 20 up / 1 down · 9 days ago

    Greatest slip backwards for games, too: I will not get above 60fps in most games using it without framegen. I’m probably misplacing blame on Unreal for this, though; AI framegen existing caused this, since it made devs lazier.

  • Grerkol@leminal.space · 6 points · 8 days ago

    I know very little about CGI, so sorry if this is dumb I guess…

    But why would they even consider using a game engine in the first place instead of a program like Maya or Blender? Is it just a bit easier to use for simple things or something? Surely everyone who works for the studios is already used to using software that’s actually made specifically for 3D modeling/animation. Also, surely Maya/Blender will always give significantly higher quality renders anyway, since they don’t have to render in real time like a game would… just why?

    • Vritrahan@lemmy.zip · 6 points · 8 days ago

      Like he said, shortcuts. You have to make everything by hand in Maya, including the lighting.

    • novibe@lemmy.ml · 4 points · 8 days ago

      Rendering in real time is, well, real time.

      It takes dozens of hours to render seconds of some CGI movies.

      It’s just cheaper in time and literal energy costs to use game engines that live render everything.
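    The scale of that difference is easy to put numbers on. A back-of-envelope sketch (the hours-per-frame figure is illustrative, not a measurement from any real production):

    ```python
    # Offline render farm vs. real-time engine for a single 10-second shot at 24 fps.
    # HOURS_PER_FRAME is a hypothetical figure for a heavy path-traced film frame.
    FPS = 24
    SHOT_SECONDS = 10
    HOURS_PER_FRAME = 10

    frames = FPS * SHOT_SECONDS              # 240 frames in the shot
    offline_hours = frames * HOURS_PER_FRAME # machine-hours on the farm
    realtime_hours = SHOT_SECONDS / 3600     # real-time: the shot renders as it plays

    print(f"{frames} frames: {offline_hours} machine-hours offline "
          f"vs ~{realtime_hours:.4f} hours in-engine")
    ```

    Even if the per-frame number is off by an order of magnitude either way, the gap between thousands of machine-hours and seconds of wall-clock time is the whole pitch.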
