• brucethemoose@lemmy.world
    22 hours ago

    Yeah… and it kinda sucks because it’s small.

    If Apple had shipped 16GB or 24GB like some Android phones did well before the iPhone 16, it would be far more useful. 16-24GB of RAM (enough for 14B-32B class models) is the current threshold where quantized LLMs really start to feel ‘smart,’ and Apple could’ve continued training an existing Apache 2.0 model instead of building a tiny, meager one from scratch.
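    Rough napkin math on why the threshold lands where it does, using my own assumed numbers for quant size and overhead (nothing official):

    ```python
    # Back-of-the-envelope RAM estimate for running a quantized LLM on-device.
    # My rough assumptions: ~4.5 bits per weight for a usable quant, plus
    # ~2 GB of headroom for KV cache, the OS, and whatever else is running.

    def est_ram_gb(params_billions: float,
                   bits_per_weight: float = 4.5,
                   overhead_gb: float = 2.0) -> float:
        weights_gb = params_billions * bits_per_weight / 8  # 1B params at 8 bits ~= 1 GB
        return weights_gb + overhead_gb

    for size in (8, 14, 32):
        print(f"{size}B model: ~{est_ram_gb(size):.1f} GB")

    # 8B:  ~6.5 GB  -> barely fits on an 8GB phone
    # 14B: ~9.9 GB  -> wants 12-16GB
    # 32B: ~20.0 GB -> wants 24GB to be comfortable
    ```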

    • dependencyinjection@discuss.tchncs.de
      22 hours ago

      I don’t know how much RAM is in my iPhone 14 Pro, but I’ve never thought, “ooh, this is slow, I need more RAM.”

      Perhaps it’ll be an issue with this stupid Apple Intelligence, but I don’t care about using that on my next upgrade cycle.

      • brucethemoose@lemmy.world
        22 hours ago

        My old Razer Phone 2 (circa 2019) shipped with 8GB of RAM, and that (and the 120Hz display) made it feel lightning fast until I replaced it last week, and only because the microphone got gunked up with dust.

        Your iPhone 14 Pro has 6GB of RAM. It’s a great phone (I just got a 16 Plus on a deal), but that 6GB will significantly shorten its longevity.

        • dependencyinjection@discuss.tchncs.de
          22 hours ago

          I wonder how much more efficiently RAM can be used when the manufacturer makes both the software and the hardware. It has to help, right? I don’t know what a 16 Pro feels like compared to this, but I doubt I would notice.

          • brucethemoose@lemmy.world
            22 hours ago

            Your OS uses it efficiently, but fundamentally it also limits what app developers can do. They have to make apps with 2-6GB in mind.

            Not everything needs a lot of RAM, but LLMs are absolutely an edge case where “more is better, and there’s no way around it,” and they aren’t the only such case.
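            Same napkin math as before, flipped around, with my assumed ~4.5 bits per weight: roughly the biggest quantized model that fits in a given RAM budget.

            ```python
            # Roughly the largest quantized model that fits a given RAM budget.
            # Assumption (mine): ~4.5 bits per weight, model gets the whole budget.

            def max_params_billions(budget_gb: float, bits_per_weight: float = 4.5) -> float:
                return budget_gb * 8 / bits_per_weight

            for budget_gb in (2, 6, 20):
                print(f"{budget_gb:>2} GB -> ~{max_params_billions(budget_gb):.0f}B params")

            # 2 GB  -> ~4B  (the small, noticeably dumber class)
            # 6 GB  -> ~11B
            # 20 GB -> ~36B (the class that starts to feel 'smart')
            ```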