About time. This also applies to their older models, such as the M2 and M3 laptops.

In the U.S., the MacBook Air lineup continues to start at $999, so there is no price increase associated with the boost in RAM.

The M2 MacBook Air now starts at $999 for 16GB of RAM and 256GB of storage. Limited storage aside, that’s surprisingly competitive with most modern Windows laptops.

  • dependencyinjection · 2 months ago

    We use Windows PCs at work as software engineers now, but when I was training I used a MacBook Pro M1 with 16GB of RAM and that thing was incredibly performant.

    I know it’s in vogue to shit on Apple, but they build both the hardware and the software, they’re incredibly efficient at what they do, and I don’t think I ever saw the beachball loading icon.

    Now, the prices they charge to upgrade the RAM are something I can get behind shitting on.

      • dependencyinjection · 2 months ago

        Same.

        • Mac - Fast, user friendly, and UNIX based.
        • Windows - Fast (I have a beast), bloated, stupid command prompt (“Add-Migration”, capital letters, really), wants to spy on me.
        • Linux - Fast, a lot of work to get everything working as you would on Windows or Mac and I’m past those days, I just want to turn the thing on and play Factorio or Minecraft, not figure out if my 4080 will run on it etc.

        It’s almost like people make choices to suit their needs and there isn’t a single solution for everybody.

        I wonder what the industry standard is for developers? Genuinely. I’ve heard it’s Mac, but my company is all in on Microsoft, and I haven’t really heard of companies developing on Linux. Which isn’t to say Linux doesn’t have its place, but I’m aware this place is insanely biased towards Linux.

        • OhYeah@lemmy.dbzer0.com · 2 months ago

          Every place I’ve been at had developers using Windows machines and then SSHing into a Linux environment.

          • dependencyinjection · 2 months ago

            Makes sense for sysadmins or something, but little sense for developers and engineers writing code to build enterprise software.

            • OhYeah@lemmy.dbzer0.com · 2 months ago

              As a developer writing code who used Windows to SSH into Linux servers, I would disagree. But of course it depends on the company and the nature of the work; just offering my experience.

              • dependencyinjection · 2 months ago

                What are you writing code for?

                I literally can’t think of an example where SSHing into a terminal is going to give a good workflow. Just using Nano or Vi?

                Like no IDE.

            • Strykker@programming.dev · 2 months ago

              Well, enterprise software is going to run on either Windows or Linux servers, so it sounds like Windows and Linux make good dev workstations.

              My current work gives devs Macs, but we build everything for Linux, so it’s a bit of a nuisance. And Apple moving to ARM made running VMs basically impossible for a while; it’s a bit better now.

              Still a giant pain in the butt to have your dev environment not match the build environment’s architecture.

        • kalleboo@lemmy.world · 2 months ago

          I wonder what the industry standard is for developers?

          The Stack Overflow developer survey (which has its bias towards people who use Stack Overflow) says 47% use Windows and 32% use Mac; Linux is split up by distro, so it’s hard to make sense of the numbers, but Ubuntu alone is at 27%. (Each developer can use multiple platforms, so the figures don’t add up to 100%.)

        • RecluseRamble@lemmy.dbzer0.com · 2 months ago

          I just want to turn the thing on and play Factorio or Minecraft, not figure out if my 4080 will run on it etc.

          Funny that you chose two games that run natively on Linux.

          • A_Random_Idiot@lemmy.world · 2 months ago

            Minecraft runs great; I don’t know about Factorio.

            But I know some native versions suck absolute ass and force you to use the Windows version via Proton regardless, ETS/ATS and Cities: Skylines 1 being my immediate personal examples.

        • qqq@lemmy.world · 2 months ago

          I almost never use Windows, but aren’t commands and variables in PowerShell case insensitive?

          • dependencyinjection · 2 months ago

            Maybe it’s just the Package Manager Console inside Visual Studio Professional, as “add-migration” or “update-database” don’t work unless capitalised.

        • ℍ𝕂-𝟞𝟝@sopuli.xyz · 2 months ago

          My current Linux machine needed exactly zero config post-install, and even stuff like the fingerprint reader works; I’m using it instead of passwords in a terminal.

          I can also play games pretty well, it’s usually smoother and less buggy than on Windows.

          I feel Linux is not a compromise for me anymore; Windows is fast becoming one, though.

          • dependencyinjection · 2 months ago

            What distro would you recommend? I’m prepared to try over the weekend.

            How does it work with GPU drivers for a GeForce RTX 4080?

            Anything else I need to be aware of?

            • ℍ𝕂-𝟞𝟝@sopuli.xyz · 2 months ago

              I’m running Fedora KDE on a Framework laptop and a custom-built machine, but they’re all AMD, so I don’t know about Nvidia cards.

              From what I’ve heard, Nvidia releases Linux drivers nowadays.

              TBH I haven’t had any problems installing and using Linux for years now; I think just go for it and see what happens.

              • dependencyinjection · 2 months ago

                So I actually did it and wiped my Windows PC; there was nothing on there I needed to keep.

                Set up Fedora and added the Nvidia drivers.

                Shut down for a few days, and on my next boot I downloaded CoolerControl. Then my networking died, and I’m at a loss as to what happened.

                And people said it was just the same as using Windows, yet I, a massive nerd and software developer, was stuck without ever having attempted to play games.
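                For anyone retracing those steps: the proprietary Nvidia driver on Fedora normally comes from the RPM Fusion repos rather than Fedora itself. A sketch of the usual sequence, assuming a stock, current Fedora install (treat this as a starting point, not a guaranteed fix):

```shell
# Enable the RPM Fusion free and nonfree repositories for the running release
sudo dnf install \
  https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
  https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm

# Install the proprietary driver as an auto-rebuilding kernel module
sudo dnf install akmod-nvidia

# Give akmods a few minutes to build the module before rebooting, then verify:
modinfo -F version nvidia
```

                The akmod variant rebuilds the module on each kernel update, which is usually what breaks after an unattended upgrade.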

    • Pasta Dental@sh.itjust.works · 2 months ago

      The chip and OS won’t do shit when your RAM is saturated by Electron apps taking 800MB each. Maybe macOS behaves better under very high memory pressure than Windows does, but that doesn’t mean it’s okay to rip off consumers. That whole “8GB on Mac = 16GB on Windows” thing has been bullshit all along, and is mostly based on people looking at the task manager and seeing high RAM usage on Windows (which is a good thing).

      • Sh0ckw4ve@lemmy.world · 2 months ago

        Haha, no, macOS is not better performing under very high memory pressure. RIP me working on a MacBook Air.

        I have to make sure not to run too many things at once…

        • Pasta Dental@sh.itjust.works · 2 months ago

          Now that I think of it, yeah, my work Mac simply shows a popup telling me to kill an app. It just doesn’t deal with high memory pressure lol

    • rottingleaf@lemmy.world · 2 months ago

      You can use Linux with RAM compression (zram) to get the same kind of memory economy that macOS has.

      Just nobody bothers.
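      For reference, on systemd-based distros the usual way to set this up is zram-generator. A minimal config sketch (the zstd choice and the half-of-RAM sizing are common defaults, not requirements):

```ini
# /etc/systemd/zram-generator.conf
[zram0]
# Compressed swap device sized to half of physical RAM
zram-size = ram / 2
compression-algorithm = zstd
```

      After a reboot, `swapon --show` should list the compressed /dev/zram0 device. (Fedora actually ships this enabled out of the box, which is part of why "nobody bothers" manually.)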

    • HeyListenWatchOut@lemmy.world · 2 months ago

      I know it’s in vogue to shit on Apple…

      Apple does have a lot of vertical integration, which allows first-party stuff to function well, and they work closely with a lot of their premium third-party software partners. But try running an actual RAM-hungry process like a local LLM, for example, and all but the highest-end, latest-edition MacBook Pro WILL shit the bed.