I heard about C2PA and I don’t believe for a second that it’s not going to be used for surveillance and all that other fun stuff. What’s worse is that they’re apparently trying to make it legally required. It also really annoys me when I see headlines along the lines of “Is AI the end of creativity?!1!” or “AI will help artists, not hurt them!1!!” or something to that effect. So it got me thinking, and I tried to come up with some answers that actually benefit artists and their audience rather than just you know who.

Unfortunately my train of thought keeps barreling out of control to things like “AI should do the boring stuff, not the fun stuff” and “if people didn’t risk starvation in the first place…” So I thought I’d find out what other people think (search engines have become borderline useless, haven’t they?).

So what do you think would be the best way to satisfy everyone?

  • Pietson@kbin.social · 1 year ago

    If the answer is yes, then it simply will never take over. As long as we have some sort of law that requires art to be tagged as AI if it’s AI generated, I think that would be enough. No need to tag original (human) art with anything, no need for that kind of surveillance; just tag AI art, or make companies legally required to disclose when it is.

    I find it hard to imagine that such a law would ever be implemented, even harder to imagine it would be enforced well, and even if both of those happened, harder still to imagine that companies wouldn’t find ways around it, like oversaturating everything with “may contain AI generated imagery” until the tag becomes entirely meaningless.

    Adding a mandatory label doesn’t really feel like a good solution to me.