Was this AI trained on an unbalanced data set? (Only black folks?) Or has it only been used to identify photos of black people? I have so many questions: some technical, some about media sensationalism.

  • nieceandtows@programming.dev · 1 year ago

    Eh. I have a loved one with some undiagnosed mental health issues, and it’s a constant struggle because they always assume the worst of anyone and everyone. Watching/living with them, I’ve learned that it’s always better to assume good of people than to assume bad of them. Assuming bad things without proof only ever ruins your happiness and relationships. People can read my comments and understand what I’m saying like you do. If they don’t and assume I’m racist, it only proves my point.

      • nieceandtows@programming.dev · 1 year ago

        I’m not talking about systemic racism in general. I know there’s a lot of that. I’m talking about systemic racism causing this particular issue. I’m saying this because there have been cases of motion sensors not detecting black hands due to technical issues. I’m not apologizing for anyone, just pointing out that it has happened before because of technical deficiencies.

        • Fish Id Wardrobe@mastodon.me.uk · 1 year ago

          @nieceandtows The fact that there have been issues with sensors (which is true) does not disprove systemic racism (which exists). That’s like saying that because I put vinegar in the dressing, the lemon juice wasn’t sour. It doesn’t follow.

          • nieceandtows@programming.dev · 1 year ago

            Putting the same thing the other way around: the fact that there have been issues with systemic racism (which is true) does not disprove technical malfunction (which exists). That’s like saying that because the lemon juice is sour, it must have vinegar in it. It doesn’t follow. Lemon juice can be sour just because it has lemons in it, without any vinegar at all.

              • nieceandtows@programming.dev · 1 year ago

                Again, I’m not disputing systemic racism in the police at all. That is a big issue that needs to be solved. I’m just saying that this doesn’t have to be related to it, because the technology itself has issues like this. The vinegar is in the food, yes, but lemon is naturally sour; even if there is no vinegar, it’s gonna be sour. Attributing everything to vinegar wouldn’t make the food better. It would just make it harder to identify issues with the individual ingredients.

              • Gareth Kitchen@stroud.social · 1 year ago

                @fishidwardrobe As far as the UK is concerned (re facial recognition), I recall that the latest study found false-positive rates disproportionately higher for Black people, and that the difference was statistically significant.

                The UK police thought this acceptable and have continued the rollout of this tech: a judgement call that bakes a little bit more systemic racism into UK policing with little to no accountability.

                https://science.police.uk/site/assets/files/3396/frt-equitability-study_mar2023.pdf

                PS. I’m not academically qualified to comment on the paper, but I take an interest in these things.

                @nieceandtows
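
                To make “disproportionately higher and statistically significant” concrete, here is a minimal, illustrative Python sketch of how false-positive rates for two groups might be compared with a two-proportion z-test. The counts are hypothetical, and this is not the methodology of the study linked above.

                ```python
                # Illustrative sketch only: hypothetical counts, not figures from the linked study.
                from math import sqrt

                def false_positive_rate(false_positives, non_match_trials):
                    """FPR = wrongly flagged non-matches / total non-match trials."""
                    return false_positives / non_match_trials

                def two_proportion_z(fp_a, n_a, fp_b, n_b):
                    """z statistic for the difference between two false-positive rates."""
                    p_a, p_b = fp_a / n_a, fp_b / n_b
                    pooled = (fp_a + fp_b) / (n_a + n_b)
                    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
                    return (p_a - p_b) / se

                # Hypothetical example: group A is wrongly flagged 25 times in 5,000
                # non-match trials, group B 5 times in 5,000.
                z = two_proportion_z(25, 5000, 5, 5000)
                print(f"FPR A: {false_positive_rate(25, 5000):.4f}")  # 0.0050
                print(f"FPR B: {false_positive_rate(5, 5000):.4f}")   # 0.0010
                print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level
                ```

                In this made-up example, group A’s false-positive rate is five times group B’s and the gap clears the usual 5% significance threshold; the actual figures and methodology are in the PDF linked above.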