• WiildFiire@lemmy.world · 1 year ago

      It’ll be kept within product marketing, and I dunno how, but it would absolutely be used to figure out what they can raise prices on.

    • CeeBee@lemmy.world · 1 year ago

      It’s getting there. In the next few years, as hardware gets better and models get more efficient, we’ll be able to run these systems entirely locally.

      I’m already doing it, but I have some higher end hardware.

        • CeeBee@lemmy.world · 1 year ago

          Stable Diffusion’s SDXL Turbo model running in Automatic1111 for image generation.

          Ollama with Ollama-webui for an LLM. I like the Solar:7b model. It’s lightweight, fast, and gives really good results.

          I have some beefy hardware that I run it on, but it’s not strictly necessary.
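
          To give a sense of how simple the local setup above is to talk to: once Ollama is running, it exposes an HTTP API on its default port (11434) that any script can hit. Here’s a minimal Python sketch, assuming the default endpoint and that you’ve already pulled a model (the model name "solar" here matches the one mentioned above; swap in whatever you run):

          ```python
          # Minimal sketch of querying a locally running Ollama server.
          # Assumes Ollama's default endpoint (http://localhost:11434) and a
          # model already pulled with `ollama pull solar` -- adjust as needed.
          import json
          import urllib.request

          OLLAMA_URL = "http://localhost:11434/api/generate"

          def build_request(model: str, prompt: str) -> dict:
              # stream=False asks for one JSON response instead of chunks
              return {"model": model, "prompt": prompt, "stream": False}

          def generate(model: str, prompt: str) -> str:
              payload = json.dumps(build_request(model, prompt)).encode("utf-8")
              req = urllib.request.Request(
                  OLLAMA_URL,
                  data=payload,
                  headers={"Content-Type": "application/json"},
              )
              with urllib.request.urlopen(req) as resp:
                  return json.loads(resp.read())["response"]

          if __name__ == "__main__":
              print(generate("solar", "Why run an LLM locally?"))
          ```

          Everything stays on your own machine, which is the whole point: no cloud API keys, no data leaving the box.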