Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • FishFace@lemmy.world · 15 hours ago

    When someone makes child porn, they put a child in a sexual situation, which we have amassed a pile of evidence is extremely harmful to the child.

    For all you have said - “without the consent” - “being sexualised” - “commodifies their existence” - you haven’t told us what the harm is. If you think those things are in and of themselves harmful, then I need to know more about what you mean, because:

    1. if someone thinks of me sexually without my consent, I am not harmed
    2. if someone sexualises me in their mind, I am not harmed
    3. I don’t know what the “commodification of one’s existence” actually means. I can’t buy or sell “the existence of women” the way I can aluminium (does buying something’s existence mean the same as buying the thing, or something else?), and I don’t see how being able to (easily) make (realistic) nude images of someone changes this in any way

    “It is genuinely incredible to me that you could be so unempathetic,”

    I am not unempathetic, but I place the blame for what makes me feel bad about the situation on the fact that girls are being made to feel bad and ashamed, not on a particular technology now being used in one step of that.

    • LadyAutumn@lemmy.blahaj.zone · edited · 5 hours ago

      I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it’s extremely harmful. It’s not a matter of feeling ashamed; it’s a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn’t belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they’ve already decided that you’re a sexual experience for them.

      We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn’t new; it’s just a new, streamlined way to spread it. It should be illegal. It should be against the law to turn someone’s images into AI-generated pornography. It should also be illegal to share those images with others.

    • atomicorange@lemmy.world · 10 hours ago

      Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. Are you OK with that?

      The harm is:

      • Those photos now exist in the world and can lead to direct harm to the victim through their exposure.
      • It normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet the demand for more images.
      • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
      • Your body should not be used for the profit or gratification of others without your consent. In my mind, this includes taking or using your picture without your consent.