• Dizzy Devil Ducky@lemm.ee · 12 points · 7 months ago

    By “opt-in” I immediately presume they mean they’ll let you know they’re doing it after you opt in. I imagine it’s going to be something they still secretly do without your permission, because fuck you, peasant.

    • meseek #2982@lemmy.ca · 1 point · 7 months ago

      Yes this. I don’t understand how anyone trusts Microsoft here (or anywhere). Their lies are legendary.

  • Lojcs@lemm.ee · 11 points · 7 months ago

    Inb4 Windows nags you about it in every window and a “critical OS repair” silently enables it.

  • shyguyblue@lemmy.world · 9 points · 7 months ago (edited)

    They still have the tech built into the OS. What are the chances of someone making a virus that forwards the telemetry somewhere else, if that’s even possible?

    Using Windows 11 on my laptop finally pushed me to wipe the disk and go full Linux, no dual boot. The point is, it still has the ability.

    • Gamers_Mate@kbin.run · 1 point · 7 months ago

      For me it was Microsoft wanting to add Copilot at the OS level. Recall is a mess, and even if a hacker can’t forward the telemetry somewhere else, what is preventing a bad actor from opting in without the hardware owner’s permission?

      • floofloof@lemmy.ca · 1 point · 7 months ago

        Or you could have malware that just asks Copilot about what you’ve been doing and exfiltrates data that way.

      • shyguyblue@lemmy.world · 0 up, 1 down · 7 months ago

        As an app, you’d have to launch it and interact with it, and there’s usually a sound or alert that tells you a screenshot has been taken. With M$, it’s doing this without input or notification.

        • Lojcs@lemm.ee · 1 point · 7 months ago

          No? A program doesn’t have to use the high-level tools you would use to take a screenshot; it can do so silently. I made a Python script years ago that did exactly that. Not all user-facing methods give feedback either; just press Print Screen.
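
          For what it’s worth, the silent part really is trivial. Here is a minimal sketch of the kind of script described above, assuming Pillow is installed; on Windows it grabs the screen on a timer with no sound, prompt, or notification:

          ```python
          import time
          from datetime import datetime

          # Pillow's ImageGrab; works on Windows (macOS additionally requires the
          # one-time screen-recording permission).
          from PIL import ImageGrab

          # Grab the full screen every few seconds and write it to disk.
          # Nothing is shown to the user: no shutter sound, no dialog, no tray icon.
          for _ in range(10):
              shot = ImageGrab.grab()
              shot.save(datetime.now().strftime("shot_%Y%m%d_%H%M%S.png"))
              time.sleep(5)
          ```

          Nothing here needs elevated privileges on Windows; any process running as the user can do the same.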

  • ssm@lemmy.sdf.org · 3 points · 7 months ago

    Fuck. I was hoping they would just go forward with it anyway so it would drive some people over to open-source ecosystems. Maybe they’ll make it opt-out again after a couple of months once the outrage dies down, as these companies tend to do these days.

  • AutoTL;DR@lemmings.world (bot) · 1 point · 7 months ago

    This is the best summary I could come up with:


    Microsoft had originally planned to turn Recall on by default, but the company now says it will offer the ability to disable the controversial AI-powered feature during the setup process of new Copilot Plus PCs.

    Recall uses local AI models to screenshot mostly everything you see or do on your computer and then give you the ability to search and retrieve anything in seconds.

    Everything in Recall is designed to remain local and private on-device, so no data is used to train Microsoft’s AI models.

    TotalRecall extracts the Recall database so you can easily view what text is stored and the screenshots that Microsoft’s feature has generated.

    “In some cases, this will mean prioritizing security above other things we do, such as releasing new features or providing ongoing support for legacy systems.”

    Davuluri references Microsoft’s SFI principles in today’s response, noting that the company is taking action to improve Recall security.


    The original article contains 747 words, the summary contains 151 words. Saved 80%. I’m a bot and I’m open source!
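
    The summary above mentions TotalRecall pulling stored text and screenshots straight out of the Recall database. A minimal sketch of that kind of inspection, assuming (as public write-ups of the preview build describe) the store is plain, unencrypted SQLite under the user’s CoreAIPlatform.00\UKP folder; the path and the ukg.db filename are assumptions taken from those write-ups, not a confirmed layout:

    ```python
    import sqlite3
    from pathlib import Path

    # Assumed location of the Recall store in the preview build; the GUID
    # subfolder varies per machine and the layout may change in later builds.
    UKP_DIR = Path.home() / "AppData" / "Local" / "CoreAIPlatform.00" / "UKP"

    def list_tables(db_file: Path) -> None:
        """Open the database read-only and print each table with its row count."""
        con = sqlite3.connect(f"file:{db_file}?mode=ro", uri=True)
        try:
            names = [row[0] for row in con.execute(
                "SELECT name FROM sqlite_master WHERE type='table'")]
            for name in names:
                count = con.execute(f'SELECT COUNT(*) FROM "{name}"').fetchone()[0]
                print(f"{db_file.parent.name}/{name}: {count} rows")
        finally:
            con.close()

    for db in UKP_DIR.glob("*/ukg.db"):  # the ukg.db filename is an assumption
        list_tables(db)
    ```

    Enumerating sqlite_master keeps the sketch independent of the exact table names, which aren’t publicly documented.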