Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • LadyAutumn@lemmy.blahaj.zone · 19 hours ago

    It’s sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. The number of teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

    Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don’t understand the kinds of long-term psychological harm that is caused by being exploited in this way. It was also exploitative and also fucked up when it was in Photoshop; this is many orders of magnitude more sophisticated and accessible.

    You’re also wrong that this is about bullying. It’s an introduction to girls being tools for male sexual gratification. It’s LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It’s criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

    • rottingleaf@lemmy.world · 18 hours ago

      Can you stop trying to find a silver lining in the sexual exploitation of teenage girls?

      Can you please use words according to their meaning?

      Also, I’ll have to be blunt: every human has their own sexuality, with their own level of “drive”, so to speak, and their own dreams.

      And it’s absolutely normal to dream of other people. Including sexually. Including those who don’t like you. It’s not only men who do that, either. There are no thought crimes.

      So in talking about this being easier or harder, you are not making any argument at all.

      However. As I said elsewhere, the actions that really harm people should be legally classified and addressed. Like sharing such stuff. But not as making child pornography, because it isn’t, and not as sexual exploitation, because it isn’t.

      It’s just that the few posts of yours I’ve seen in this thread seem to say that certain kinds of thought should be illegal, and that’s absolute bullshit. And laws shouldn’t be made based on such emotions.

      • atomicorange@lemmy.world · 18 hours ago

        I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.

    • FishFace@lemmy.world · 15 hours ago

      If a boy fantasises sexually about a girl, is that harmful to her? If he tells his friends about it? No, this is not harmful - these actions do not affect her in any way. What affects the girl is how the boys might then treat her differently than they would someone they don’t find sexually attractive.

      The solution, in both cases, has to be to address the harmful behaviour. The only arguments for criminalising deepfakes themselves are also arguments for criminalising sexual fantasies. That is why people are talking about thought crime: once you criminalise things that are harmless on their own, but which might down the line lead to directly harmful behaviour, there is no other distinction.

      The consent of the individual has been entirely erased. Dehumanization in its most direct form.

      Both of these, for example, apply just as readily to discussing a shared sexual fantasy about someone who didn’t agree to it.

      No distinction, that is, other than this is new and icky. I don’t want government policy to be dictated by fear of the new and by what people find icky, though. I do lots of stuff people find icky.

      • LadyAutumn@lemmy.blahaj.zone · edited · 15 hours ago

        No, an image that is shared and distributed is not the same as a fantasy in someone’s head. That is deranged. Should CSAM also be legal because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.

        This isn’t fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.

        It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You have seemingly no idea what you’re talking about if you believe that pornography is the same thing as mental fantasies.

        And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as inherently some kind of sexual availability?

        • FishFace@lemmy.world · 15 hours ago

          When someone makes child porn they put a child in a sexual situation - which is something we have amassed a pile of evidence showing is extremely harmful to the child.

          For all you have said - “without the consent” - “being sexualised” - “commodifies their existence” - you haven’t told us what the harm is. If you think those things are in and of themselves harmful, then I need to know more about what you mean, because:

          1. if someone thinks of me sexually without my consent I am not harmed
          2. if someone sexualises me in their mind I am not harmed
          3. I don’t know what the “commodification of one’s existence” can actually mean - I can’t buy or sell “the existence of women” (does buying something’s existence mean the same as buying the thing, or something else?) the same way I can aluminium, and I don’t see how being able to (easily) make (realistic) nude images of someone changes this in any way

          It is genuinely incredible to me that you could be so unempathetic,

          I am not unempathetic, but I attribute what makes me feel bad about the situation to the fact that girls are being made to feel bad and ashamed, not to the fact that a particular technology is now being used in one step of that.

          • LadyAutumn@lemmy.blahaj.zone · edited · 5 hours ago

            I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it’s extremely harmful. It’s not a matter of feeling ashamed; it’s a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn’t belong to you, but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they’ve already decided that you’re a sexual experience for them.

            We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn’t new, just a new, streamlined way to spread it. It should be illegal. It should be against the law to turn someone’s images into AI-generated pornography. It should also be illegal to share those images with others.

            • atomicorange@lemmy.world · 10 hours ago

            Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You OK with that?

            The harm is:

            • Those photos now exist in the world and can lead to direct harm to the victim by their exposure.
            • It normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images.
            • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
            • Your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.