• helpImTrappedOnline@lemmy.world

    The headline/title needs to be extended to include the rest of the sentence

    “and then sent them to a minor”

    Yes, this sicko needs to be punished. Any attempt to make him the victim of “the big bad government” is manipulative at best.

    Edit: made the quote bigger for better visibility.

  • macniel@feddit.de

    Mhm, I have mixed feelings about this. I know that this entire thing is fucked up, but isn’t it better to have generated stuff than actual stuff that involved actual children?

    • PM_Your_Nudes_Please@lemmy.world

      Yeah, it’s very similar to the “is loli porn unethical” debate. No victim, it could supposedly help reduce actual CSAM consumption, etc… But it’s icky, so many people still think it should be illegal.

      There are two big differences between AI and loli, though. The first is that AI would supposedly have to be trained on CSAM to be able to generate it. An artist can create loli porn without actually using CSAM references. The second difference is that AI is much, much easier for the layman to use. It doesn’t take years of practice to be able to create passable porn. Anyone with a decent GPU can spin up a local instance and be generating within a few hours.

      In my mind, the former difference is much more impactful than the latter. AI becoming easier to access is probably inevitable, so combating it now only delays the inevitable. But if that AI is trained on CSAM, it is inherently unethical to use.

      Whether that makes the porn generated by it unethical by extension is still difficult to decide, though. If artists hate AI because it threatens their livelihood, then CSAM producers likely hate it too. Artists worry AI will put them out of business; couldn’t the same be said of CSAM producers? If AI has the potential to run CSAM producers out of business, it would be a net positive in the long term, even if the images created in the short term are unethical.

      • Ookami38@sh.itjust.works

        Just a point of clarity: an AI model capable of generating CSAM doesn’t necessarily have to be trained on CSAM.

          • Ookami38@sh.itjust.works

            Why is that? The whole point of generative AI is that it can combine concepts.

            You train it on the concept of a chair using only red chairs. You train it on the color red, and the color blue. With this info and some repetition, you can have it output a blue chair.

            The same applies to any other concepts. Larger, smaller, older, younger. Man, boy, woman, girl, clothed, nude, etc. You can train them each individually, gradually, and generate things that then combine these concepts.

            Obviously this is harder than just using training data of what you want. It’s slower, it takes more effort, and the results are inconsistent, but they are results. You can then curate the most viable of the images created this way to train a new, refined model.

            • Todd Bonzalez@lemm.ee

              Yeah, there are photorealistic furry photo models, and I have yet to meet an anthropomorphic dragon IRL.

    • ImminentOrbit@lemmy.world

      It reminds me of the story of the young man who realized he had an attraction to underage children and didn’t want to act on it, yet there were no agencies or organizations to help him; it was only after crimes were committed that anyone could get help.

      I see this fake CP as only a positive for those people. That it might make it harder to find real offenders is a terrible argument against it.

    • pavnilschanda@lemmy.world

      A problem that gets brought up is that AI-generated images make it harder to notice photos of actual victims, making it harder to locate and save them.

        • ricecake@sh.itjust.works

          It does learn from real images, but it doesn’t need real images of what it’s generating to produce related content.
          As in, a network trained with no exposure to children is unlikely to be able to easily produce quality depictions of children. Without training on nudity, it’s unlikely to produce good results there as well.
          However, if it knows both concepts it can combine them readily enough, similar to how you know the concept of “bicycle” and that of “Neptune” and can readily enough imagine “Neptune riding an old-fashioned bicycle around the sun while flaunting its top hat”.

          Under the hood, this type of AI is effectively a very sophisticated “error correction” system. It changes pixels in the image to try to “fix” it so it matches the prompt, usually starting from a smear of random colors (static noise).
          That’s how it’s able to combine different concepts from a wide range of images to create things it’s never seen.
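          A toy sketch of that “error correction” idea in plain NumPy. This is not a real diffusion model — the fixed target vector is an assumption standing in for the trained network’s per-step guidance — but it shows the core loop: start from noise and repeatedly fix a fraction of the error.

          ```python
          import numpy as np

          # Start from a smear of random "static noise"...
          rng = np.random.default_rng(0)
          target = np.array([0.2, 0.8, 0.5])  # stand-in for "what the prompt describes"
          image = rng.normal(size=3)          # pure noise

          # ...then repeatedly "fix" a fraction of the error toward the target.
          for _ in range(50):
              error = image - target   # how far the current image is from the prompt
              image = image - 0.2 * error

          print(np.allclose(image, target, atol=1e-2))  # prints True
          ```

          A real system replaces the fixed target with a neural network’s prediction at every step, which is what lets it blend concepts it has only seen separately.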

    • BrianTheeBiscuiteer@lemmy.world

      I have trouble with this because it’s like 90% grey area. Is it a pic of a real child but inpainted to be nude? Was it a real pic but the face was altered as well? Was it completely generated but from a model trained on CSAM? Is the perceived age of the subject near to adulthood? What if the styling makes it only near realistic (like very high quality CG)?

      I agree with what the FBI did here mainly because there could be real pictures among the fake ones. However, I feel like the first successful prosecution of this kind of stuff will be a purely moral judgement of whether or not the material “feels” wrong, and that’s no way to handle criminal misdeeds.

      • Chee_Koala@lemmy.world

        If it’s not trained on CSAM or inpainted, but fully generated, I can’t really think of any other real legal argument against it except “this could be real”. That has real merit, but in my eyes not enough to prosecute it as if it were real. Real CSAM has very different victims and abuse, so it needs different sentencing.

      • forensic_potato@lemmy.world

        This mentality smells of “just say no” for drugs or “just don’t have sex” for abortions. This is not an ideal world, and we have to find actual plans/solutions to deal with the situation. We can’t just cover our ears and hope people will stop.

    • retrospectology@lemmy.world

      The arrest is only a positive. Allowing pedophiles to create AI CP is not a victimless crime. As others point out it muddies the water for CP of real children, but it also potentially would allow pedophiles easier ways to network in the open (if the images are legal they can easily be platformed and advertised), and networking between abusers absolutely emboldens them and results in more abuse.

      As a society we should never allow the normalization of sexualizing children.

      • nexguy@lemmy.world

        Interesting. What do you think about drawn images? Is there a limit to how good the artist can be at drawing/painting? Stick figures vs. lifelike paintings. Interesting line to consider.

        • retrospectology@lemmy.world

          If it was photoreal and difficult to distinguish from real photos? Yes, it’s exactly the same.

          And even if it’s not photo real, communities that form around drawn child porn are toxic and dangerous as well. Sexualizing children is something I am 100% against.

          • littlewonder@lemmy.world

            It feels like driving these people into the dark corners of the internet is worse than allowing them to collect in clearnet spaces where drawn csam is allowed.

    • Murvel@lemm.ee

      It feeds and evolves a disorder which in turn increases risks of real life abuse.

      But if AI generated content is to be considered illegal, so should all fictional content.

      • SigHunter@lemmy.kde.social

        Or, more likely, it feeds and satisfies a disorder which in turn decreases risk of real life abuse.

        Making it illegal has so far helped nothing, just like with drugs.

        • Murvel@lemm.ee

          That’s not how these addictive disorders work… they’re never satisfied and always need more.

  • Kedly@lemm.ee

    Ah yes, more bait articles rising to the top of Lemmy. The guy was arrested for grooming; he was sending these images to a minor. Outside of Digg, does anyone have any suggestions for an alternative to Lemmy and Reddit? Lemmy’s moderation quality is shit, and I think I’m starting to figure out where I lean on the success of my experimental stay with Lemmy.

    Edit: Oh god, I actually checked Digg out after posting this, and the site design makes it look like you’re scrolling through all of the ads at the bottom of a bullshit clickbait article.

    • deathbird@mander.xyz

      It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.

      AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

      • desktop_user@lemmy.blahaj.zone

        AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

        Local model go brrrrrr

  • TheObviousSolution@lemm.ee

    He then allegedly communicated with a 15-year-old boy, describing his process for creating the images, and sent him several of the AI generated images of minors through Instagram direct messages. In some of the messages, Anderegg told Instagram users that he uses Telegram to distribute AI-generated CSAM. “He actively cultivated an online community of like-minded offenders—through Instagram and Telegram—in which he could show off his obscene depictions of minors and discuss with these other offenders their shared sexual interest in children,” the court records allege. “Put differently, he used these GenAI images to attract other offenders who could normalize and validate his sexual interest in children while simultaneously fueling these offenders’ interest—and his own—in seeing minors being sexually abused.”

    I think the fact that he was promoting child sexual abuse and was communicating with children and creating communities with them to distribute the content is the most damning thing, regardless of people’s take on the matter.

    Umm … That AI-generated hentai on the page of the same article, though … Do the editors have any self-awareness? Reminds me of the time an admin decided the best course of action to call out CSAM was to directly link to the source.

    • Maggoty@lemmy.world

      Wait, do you think all hentai is CSAM?

      And sending the images to a 15 year old crosses the line no matter how he got the images.

    • Saledovil@sh.itjust.works

      Umm … That AI-generated hentai on the page of the same article, though … Do the editors have any self-awareness? Reminds me of the time an admin decided the best course of action to call out CSAM was to directly link to the source.

      The image depicts mature women, not children.

      • BangCrash@lemmy.world

        Correct. And OP’s not saying it is.

        But to place that sort of image on an article about CSAM is very poorly thought out

  • PirateJesus@lemmy.today

    OMG. Every other post is saying they’re disgusted by the images but that it’s a grey area, and that he’s definitely in trouble for contacting a minor.

    Cartoon CSAM is illegal in the United States. AI images of CSAM fall into that category. It was illegal for him to make the images in the first place BEFORE he started sending them to a minor.

    https://www.thefederalcriminalattorneys.com/possession-of-lolicon

    https://en.wikipedia.org/wiki/PROTECT_Act_of_2003

    • Madison420@lemmy.world

      Yeah, that’s toothless. They decided there is no reliable way to assign an age to a cartoon; the characters could be from another planet and simply seem younger while actually being older.

      It’s bunk. Let them draw or generate whatever they want; totally fictional events and people are fair game, and quite honestly I’d rather they stay active doing that than get active actually abusing children.

      Outlaw shibari and I guarantee you’d have multiple serial killers BTK-ing some unlucky souls.

      • ZILtoid1991@lemmy.world

        My main issue with generation is the ability of making it close enough to reality. Even with the more realistic art stuff, some outright referenced or even traced CSAM. The other issue is the lack of easy differentiation between reality and fiction, and it muddies the water. “I swear officer, I thought it was AI” would become the new “I swear officer, she said she was 18”.

        • Madison420@lemmy.world

          That’s not an end-user issue, that’s a dev issue. You can’t train on CSAM if it isn’t available, and doing so is a tacit admission of actual possession.

      • MDKAOD@lemmy.ml

        I think the challenge with Generative AI CSAM is the question of where did training data originate? There has to be some questionable data there.

        • erwan@lemmy.ml

          There is also the issue of determining whether a given image is real or AI. If AI were legal, prosecution would need to prove images are real and not AI, with the risk of letting real offenders go.

          The need to ban AI CSAM is even clearer than cartoon CSAM.

          • Madison420@lemmy.world

            And in the process force non-abusers to seek their thrill with actual abuse. Good job, I’m sure the next generation of children will appreciate your prudish, factually inept effort. We’ve tried this with so much shit; prohibition doesn’t stop anything, it just creates a black market and an abusive power system to go with it.