About time. This also applies to their older models such as M2 and M3 laptops.

In the U.S., the MacBook Air lineup continues to start at $999, so there is no price increase associated with the boost in RAM.

The M2 MacBook Air now starts at $1000 with 16GB of RAM and 256GB of storage. Limited storage aside, that’s surprisingly competitive with most modern Windows laptops.

  • A_Random_Idiot@lemmy.world · ↑22 · 6 days ago

    Just in time for 32GB to become the necessary standard, so they can still sell you egregiously overpriced RAM upgrades.

    • thatsnothowyoudoit@lemmy.ca · ↑1 · 5 days ago

      I can’t imagine that being the case for most users. I’m absolutely a power user and I keep being surprised at how consistently high the performance is of my base model M1 Air w/16GB even when compared to another Mac workstation of mine with 64GB.

      I can run two VMs, a ton of live loading development tooling, several JVM programs and so much more on that little Air and it won’t even sweat.

      I’m not an Apple apologist - lots of poor decisions these days and software quality has taken a real hit. While 16GB means everyone’s getting a machine that should last much longer, I can’t see a normal user needing more any time soon, especially when Apple is optimizing their local machine learning models for their 8GB iOS platforms first and foremost.

  • Echo Dot@feddit.uk · ↑19 ↓2 · edited · 7 days ago

    The M2 MacBook Air now starts at $1000 with 16GB of RAM and 256GB of storage. Limited storage aside, that’s surprisingly competitive with most modern Windows laptops.

    What do you mean limited storage aside?

    If we disregard the fact that it’s terrible value for money, sure, it’s a good deal. But no laptop sold in 2025 costing over a grand should have anything less than a terabyte.

  • umbrella@lemmy.ml · ↑6 · edited · 6 days ago

    Watch how (corporate) operating systems will now get heavier accordingly.

    • FrostyCaveman@lemm.ee · ↑6 ↓1 · 6 days ago

      Yes, it’s described as being for “Apple Intelligence”, which I’m sure won’t be bloated nor hard to disable at all… sigh

    • CaptPretentious@lemmy.world · ↑1 · 6 days ago

      Yep! With Apple’s new patent-pending upgrade process, it’s super easy!

      You take the device that you want to upgrade and throw it in the garbage, then you go to an Apple Store and pick up the upgraded model! It’s so streamlined!

  • hperrin@lemmy.world · ↑2 · 7 days ago

    Their sales figures seem to show that the majority of people don’t care. For my needs when I’m using my MacBook, I’m one of those people who don’t care. That’s probably because it’s not my main PC, so I use it for the things most people probably use it for (browsing, watching media, some light work).

  • Mwa@lemm.ee · ↑3 ↓3 · 6 days ago

    People don’t want the 8GB of RAM because they’re all used to Windows, I bet.

  • ContrarianTrail@lemm.ee · ↑5 ↓8 · edited · 7 days ago

    My daily driver MacBook Pro has 8GB of RAM, and so far, that’s been perfectly sufficient for my needs. Some might argue that 8GB is inadequate for a 1,700€ device, but I don’t think most people would notice a difference. This focus on specs might make more sense with computers, but with smartphones especially, I never understood the obsession with performance. My mid-range Samsung handles everything instantly - I can’t think of a reason it would need to be any faster. Numbers on paper seem irrelevant when they don’t translate to everyday use.

    • Defaced@lemmy.world · ↑6 ↓3 · 7 days ago

      macOS, no matter what anyone says, has extremely efficient memory management. It’s seriously impressive how efficient that OS truly is, and it’s no surprise they stuck with 8GB for so long. The thing these clickbait articles don’t really bring to light is that the 16GB increase is really for Apple Intelligence. If that weren’t a thing, these Macs would have stuck with 8GB.

      • Virkkunen@fedia.io · ↑10 ↓1 · 7 days ago

        It is “efficient” because they just dump everything into swap. If I cold boot my M1 Air, it’ll be using 7GB of RAM and 4GB of swap without anything running in the background. I also have an ongoing bug where some background apps stop responding and the system can’t kill the process, so it starts a new one, and it keeps doing this until I either stop the app manually or my storage is completely full because swap is taking 80GB of my internal storage.

        • Defaced@lemmy.world · ↑1 · edited · 6 days ago

          Not sure you know what swap is… I looked at my M1 after a night of gaming on GeForce Now and filling out forms in Google Chrome. My swap was at 0, my used RAM was at 4GB out of 8GB, and it didn’t slow down at all. I’m sorry you’ve had a terrible experience with your Mac; I love my Mac mini and will enjoy it as a really cool piece of tech.

    • Telorand@reddthat.com · ↑26 ↓1 · 7 days ago

      I dunno if I’d even consider them an industry leader, unless you break down their ubiquity by industry category (in which they lead graphic design and maybe video editing, iirc). They lead phone sales in the US by a lot, but their overall desktop share is still relatively small (<10%), and their global footprint is buoyed only by iOS (which is still below Windows and Android).

      I would say they’re an innovator, and they push certain companies to innovate, but they don’t really lead by that many metrics.

  • brucethemoose@lemmy.world · ↑26 ↓1 · edited · 7 days ago

    The localllama people are feeling quite mixed about this, as they’re still charging through the nose for more RAM. Like, orders of magnitude more than the bigger ICs actually cost.

    It’s kinda poetic. Apple wants to go all in on self-hosted AI now, yet their incredible RAM stinginess over the years is derailing that.

    • rebelsimile@sh.itjust.works · ↑5 ↓1 · 7 days ago

      I do have a 64GB M1 MacBook Pro, and man, that thing screams at LLM AI. I use it to serve models locally throughout my house, while it otherwise still works as a fantastic computer (usually using about half the RAM for LLM usage). I still prefer a 4080 for image generation, though.

  • TheTechnician27@lemmy.world · ↑25 ↓2 · 7 days ago

    insultingly tiny, unupgradeable storage aside, that’s surprisingly competitive with most modern Windows laptops

    • simple@lemm.ee (OP) · ↑31 ↓7 · 7 days ago

      It’s not ideal, but you’re getting probably the best hardware in the market in return. The M series still dominates Windows CPUs, and the build quality on most $1000 laptops leaves a lot to be desired.

      • schizo@forum.uncomfortable.business · ↑33 · 7 days ago

        build quality on most $1000 laptops

        You’re not kidding.

        I have a couple of laptops from various vendors, and they’re all built like shit.

        ASUS is especially eyerolly: the case is literally crumbling into pieces. Like seriously? You couldn’t have picked a material that’s not literally going to disintegrate in two years on a $1200 laptop?

        • simple@lemm.ee (OP) · ↑6 · 7 days ago

          Yeah, a lot of manufacturers are just bad. I know people who had Dell and MSI laptops, and those things feel like toys: cheap plastic and very wobbly hinges. The only manufacturer I genuinely trust is Lenovo. My Legion is a bit thick, but I can at least rest easy that it’s built well.

          • schizo@forum.uncomfortable.business · ↑11 ↓1 · 7 days ago

            Lenovo, outside of their really cheap consumer options (the $500-and-under range), is pretty solid.

            But yeah, build quality is one reason I roll my eyes at the ‘haha stupid buying apple! apple tax! lol ripped off!’ crowd: maybe, but as soon as you pick up a MacBook whatever, it’s immediately obvious that you’re getting something for what you’re paying, and not some bendy, flexy piece of plastic crap that will maybe physically survive the warranty period, but not much more.

          • paraphrand@lemmy.world · ↑5 · 7 days ago

            I saw someone’s Samsung laptop last year and the screen was wobbling all over the fucking place. I couldn’t believe what I was seeing. I commented on it, and the owner just gave me a blank look.

      • yeehaw@lemmy.ca · ↑3 ↓7 · edited · 7 days ago

        The best? Debatable. You ever watch Louis Rossmann on YouTube? He constantly rags on bad hardware design when repairing MacBooks lol.

  • JDPoZ@lemmy.world · ↑18 ↓5 · 7 days ago

    Completely laughable. I literally had 16 GB of DDR3-1600 for my 2600K from 2011, which I handed down to a nephew as his first PC to tinker with. Hell, my local NAS has more than that…

    • dependencyinjection@discuss.tchncs.de · ↑24 · 7 days ago

      We use Windows PCs at work as software engineers now, but when I was training I used a MacBook Pro M1 with 16GB of RAM, and that thing was incredibly performant.

      I know it’s in vogue to shit on Apple, but they build both the hardware and the software, they’re incredibly efficient at what they do, and I don’t think I ever saw the beachball loading icon.

      Now, the prices they charge to upgrade the RAM are something I can get behind shitting on.

      • rottingleaf@lemmy.world · ↑3 · 6 days ago

        You can use Linux with RAM compression to get the same kind of memory economy that macOS has.

        Just nobody bothers.
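        Purely as a back-of-envelope sketch of what that kind of compression buys (the 3:1 ratio and pool fraction below are illustrative assumptions, not measured values):

```python
# Rough model of the extra effective memory zram-style RAM compression provides.
# Assumptions (illustrative, not measured): a ~3:1 compression ratio and the
# fraction of physical RAM dedicated to the compressed pool.
def effective_ram_gb(physical_gb: float, pool_fraction: float, ratio: float) -> float:
    """Pages in the compressed pool occupy 1/ratio of their logical size,
    so a pool of P GB of physical RAM holds P * ratio GB of logical pages."""
    pool = physical_gb * pool_fraction
    return (physical_gb - pool) + pool * ratio

# An 8GB machine with a quarter of RAM compressed at 3:1 behaves
# roughly like a 12GB one:
print(effective_ram_gb(8, 0.25, 3))  # 12.0
```

        In practice compressed pages also cost CPU time to swap in and out, so the win depends on how compressible the workload’s memory is.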

        • dependencyinjection@discuss.tchncs.de · ↑6 ↓2 · 7 days ago

          Same.

          • Mac - Fast, user friendly, and UNIX based.
          • Windows - Fast (I have a beast), bloated, stupid command prompt (“Add-Migration”, capital letters really.), wants to spy on me.
          • Linux - Fast, a lot of work to get everything working as you would on Windows or Mac and I’m past those days, I just want to turn the thing on and play Factorio or Minecraft, not figure out if my 4080 will run on it etc.

          It’s almost like people make choices to suit their needs and there isn’t a single solution for everybody.

          I wonder what the industry standard is for developers? Genuinely. I’ve heard it’s Mac, but my company is all in on Microsoft, and I haven’t really heard of companies developing on Linux. Which isn’t to say Linux doesn’t have its place, but I’m aware this place is insanely biased towards Linux.

          • kalleboo@lemmy.world · ↑3 · edited · 6 days ago

            I wonder what the industry standard is for developers?

            The Stack Overflow developer survey (which has its own bias towards people who use Stack Overflow) says 47% use Windows, 32% use Mac, and, uh, Linux is split up by distro so it’s hard to make sense of the numbers, but Ubuntu alone is at 27%. (Each developer can use multiple platforms, so the numbers don’t add up to 100%.)

          • RecluseRamble@lemmy.dbzer0.com · ↑4 ↓1 · 7 days ago

            I just want to turn the thing on and play Factorio or Minecraft, not figure out if my 4080 will run on it etc.

            Funny that you chose two games that run natively on Linux.

            • A_Random_Idiot@lemmy.world · ↑1 · 6 days ago

              Minecraft runs great; I don’t know about Factorio.

              But I know some native versions suck absolute ass and force you to use the Windows version via Proton regardless. ETS/ATS and Cities: Skylines 1 being my immediate personal examples.

          • qqq@lemmy.world · ↑1 · 6 days ago

            I almost never use Windows, but aren’t commands and variables in PowerShell case-insensitive?

            • dependencyinjection@discuss.tchncs.de · ↑1 · 6 days ago

              Maybe it’s just the Package Manager Console inside Visual Studio Professional, as “add-migration” and “update-database” don’t work unless capitalised.

          • ℍ𝕂-𝟞𝟝@sopuli.xyz · ↑1 · 7 days ago

              My current Linux machine needed exactly zero config post-install, and even stuff like the fingerprint reader works; I’m using it instead of passwords in a terminal.

            I can also play games pretty well, it’s usually smoother and less buggy than on Windows.

            I feel Linux is not a compromise for me anymore, Windows is fast becoming one though.

            • dependencyinjection@discuss.tchncs.de · ↑1 · 7 days ago

              What distro would you recommend? I’m prepared to try over the weekend.

              How does it work with GPU drivers for a GeForce RTX 4080?

              Anything else I need to be aware of?

              • ℍ𝕂-𝟞𝟝@sopuli.xyz · ↑2 · 6 days ago

                I’m running Fedora KDE on a Framework laptop and a custom built machine, but they are all AMD so IDK about Nvidia cards.

                As I’ve heard, Nvidia nowadays releases Linux drivers.

                TBH I haven’t had any problems installing and using Linux for years now, I think just go for it and see what happens.

                • dependencyinjection@discuss.tchncs.de · ↑1 · 1 day ago

                  So I actually did it and wiped my Windows PC, nothing on there I needed to keep.

                  Set up Fedora and added the Nvidia Drivers.

                  I shut it down for a few days, and on my next boot I downloaded CoolerControl. Then my networking died, and I’m at a loss as to what happened.

                  And people said it was just the same as using Windows, yet I, a massive nerd and software developer, was stuck before ever having attempted to play games.

          • OhYeah@lemmy.dbzer0.com · ↑6 · 7 days ago

            Every place I’ve been at had developers using Windows machines and then ssh’ing into a Linux environment.

              • Strykker@programming.dev · ↑2 · 7 days ago

                Well, enterprise software is going to run on either Windows or Linux servers, so it sounds like Windows and Linux make good dev workstations.

                My current work gives devs Macs, but we build everything for Linux, so it’s a bit of a nuisance. And Apple moving to ARM made running VMs basically impossible for a while; it’s a bit better now.

                Still a giant pain in the butt to have your dev environment not match the build environment architecture.

              • OhYeah@lemmy.dbzer0.com · ↑2 · 7 days ago

                As a developer writing code who used Windows to ssh into Linux servers, I would disagree. But of course it depends on the company and the nature of the work; just offering my experience.

                • dependencyinjection@discuss.tchncs.de · ↑1 · 7 days ago

                  What are you writing code for?

                  I literally can’t think of an example where ssh’ing into a terminal gives a good workflow. Just using Nano or Vi?

                  Like, no IDE.

      • Pasta Dental@sh.itjust.works · ↑10 ↓1 · edited · 7 days ago

        The chip and OS won’t do shit when your RAM is saturated by Electron apps taking 800MB each. Maybe macOS behaves better under very high memory pressure than Windows does, but that doesn’t mean it’s okay to rip off consumers. That whole “8GB on Mac = 16GB on Windows” thing has been bullshit all along, and is mostly based on people looking at Task Manager and seeing high RAM usage on Windows (which is a good thing).

        • Sh0ckw4ve@lemmy.world · ↑2 ↓1 · 7 days ago

          Haha, no, macOS does not perform better under very high memory pressure. RIP me working on a MacBook Air.

          I have to make sure not to run too many things at once…

          • Pasta Dental@sh.itjust.works · ↑3 · 7 days ago

            Now that I think of it, yeah, my work Mac simply shows a popup telling me to kill an app. It just doesn’t deal with high memory pressure lol

      • JDPoZ@lemmy.world · ↑3 ↓1 · 7 days ago

        I know it’s in vogue to shit on Apple…

        Apple does have a lot of vertical integration, which lets first-party stuff function well, and they work closely with a lot of their premium third-party software partners. But try running an actually RAM-hungry process like a local LLM, for example, and all but the highest-end, latest-generation MacBook Pro WILL shit the bed.
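        A rough back-of-envelope shows why local LLMs blow past 8GB (the 1.2× overhead factor and quantization sizes below are illustrative assumptions; real runtimes also need memory for the KV cache and activations):

```python
# Back-of-envelope RAM estimate for hosting a local LLM.
# Assumption: model weights dominate; the 1.2x overhead factor is a rough
# guess covering runtime bookkeeping, not a measured figure.
def llm_ram_gb(params_billions: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Approximate resident memory in GB for a model's weights."""
    return params_billions * bytes_per_param * overhead

# A 7B-parameter model at 4-bit quantization (~0.5 bytes/param):
print(round(llm_ram_gb(7, 0.5), 1))  # 4.2 -> tight but feasible on 8GB
# The same model at fp16 (2 bytes/param):
print(round(llm_ram_gb(7, 2.0), 1))  # 16.8 -> hopeless on an 8GB machine
```

        Even the quantized case leaves little headroom on an 8GB machine once the OS and a browser are resident, which is the commenter’s point.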

    • Mercuri@lemmy.world · ↑10 ↓1 · 7 days ago

      Fucking PHONES had more RAM. It was so fucking stupid. And despite their arguments, it was proven time and time again that 8GB was not enough.