Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • BombOmOm@lemmy.world · 14 hours ago (edited)

    Taking secret nude pictures of someone is quite a bit different from…not taking nude pictures of them.

    It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.

    • atomicorange@lemmy.world · 14 hours ago

      How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

      • BombOmOm@lemmy.world · 14 hours ago (edited)

        It’s absolutely sexual harassment.

        But, to your question: you can’t just say something has underage nudity when the nudity is of an adult model. It’s not CSAM.

        • atomicorange@lemmy.world · 14 hours ago

          Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs. a fake one. The impact on the victim is the same. The impact on the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn,” people wouldn’t be so hesitant to call this what it is.