An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white, with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

  • Dojan@lemmy.world · 1 year ago

    You could get a more controlled result if you use inpainting. I resized the image in an image editor first, because the odd resolution of the image from the article gave me really strange results. I masked out her face and had the model go ham on the rest of the image, while leaving the face and the hairline untouched.

    > photograph of a professional asian woman, linkedin profile picture

    After that I removed the mask, changed the denoising strength to 0.2, applied the “face fix”, and this is the end result.
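    Conceptually, the masking step above boils down to a per-pixel choice between the untouched original and the model's output. Here's a minimal sketch of that compositing logic, with hypothetical flat pixel lists standing in for real image data (actual inpainting tools do this inside the diffusion pipeline, not as a post-process):

    ```python
    def composite_keep_face(original, generated, face_mask):
        """Merge the generated image back with the original: wherever
        the face mask is set, keep the original (protected) pixel;
        elsewhere take the freshly generated one."""
        return [o if masked else g
                for o, g, masked in zip(original, generated, face_mask)]

    # Toy 1x6 "image": the first three pixels are the protected face region.
    original  = ["face1", "face2", "face3", "bg1", "bg2", "bg3"]
    generated = ["gen1", "gen2", "gen3", "new1", "new2", "new3"]
    face_mask = [True, True, True, False, False, False]

    result = composite_keep_face(original, generated, face_mask)
    # → ['face1', 'face2', 'face3', 'new1', 'new2', 'new3']
    ```

    The low denoising strength in the second pass works toward the same goal from the other direction: at 0.2 the model only lightly perturbs the input, so even the unmasked regions stay close to the photo you fed it.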

    It’s usable, but I think it’s a weird use-case for these kinds of tools. Like yeah you can retouch photos if you’re careful enough, but at the same time, wouldn’t it be easier to just dress up and take some new photos? I dunno, the idea of using an AI generated profile image feels kind of sketchy to me, but then again we have had beautification filters available for ages - my work phone, a Google Pixel 6, comes with it built into the camera application. Every time the camera opens on my face I get this weird uncanny feeling.

    Anyway. The article does touch upon a problem that definitely worries me too:

    > She added that “racial bias is a recurring issue in AI tools” […] Wang told The Globe that she was worried about the consequences in a more serious situation, like if a company used AI to select the most “professional” candidate for the job and it picked white-looking people.

    I really hope no company would use an image model to analyse candidates’ profile photos, because as an ugly person that makes me want to scream. However, this has been a problem in the past: Amazon developed a tool for use by recruiters, which turned out to have a bias against women. I can easily see a “CV analysis tool” having a bias against people with names of non-European origin, for example.

    At this point I think it’s impossible to put the genie back into the bottle (given the chance, I definitely would), but all we can do now is try to mitigate the potential harm caused by these tools as much as possible.