Players have been asking for the ability to filter out games made with Gen AI.
We've added an automatic tag on SteamDB based on the AI-generated content disclosures on the store pages.
Arguably the point of having machines do the work for us is that they’re NOT sentient.
Potentially. Since we don’t know how any of it works (because it doesn’t exist yet), it’s entirely possible that intelligence requires sentience in order to be recognizable as what we would mean by “intelligence”.
If the AI considered the work trivial, or if it could do the work faster or more precisely than a human, those would also be reasons to desire one.
Alternatively, we could design them to just enjoy doing what we need. Knowing they were built to like a thing wouldn’t make them not like it. Food is tasty in order to motivate me to get the energy I need to live, and knowing that doesn’t lessen my enjoyment.
Ah yes. We are but benevolent Masters. See? The slaves LIKE doing the work!
Is it? Or is it for companies to not have to pay out salaries, so they increase profits on AI-generated work, regardless of whether the AI is sentient or not?
This comment is licensed under CC BY-NC-SA 4.0
Clearly. Sentience would imply some sense of internal thought or self-awareness, an ability to feel something… so LLMs are better since they’re just machines. Though I’m sure they’d have no qualms about driving slaves.
I’m not talking about sentience per se, but about how any “AI” would think: lookups (LLMs) vs. synthesized on-the-fly thinking (mimicking the human brain’s processing).
Hrmm. I guess I don’t believe the idea that you can make a game that really connects on an empathic, emotional level without having those experiences as the author. Anything short of that and you’re just copying the motions of sentiment, which brings us back to the same plagiarism problem with LLMs and other “AI” models. It’s fine for CoD 57, but for it to have new ideas we need to give it one, because it is definitionally not creative. Even hallucinations are just bad calculations on the source. Though they could inspire someone to have a new idea, which I might argue is their only artistic purpose beyond simple tooling.
I thoroughly believe machines should be doing labor to improve the human condition so we can make art. Even making a “fun” game requires an understanding of experience. A simulacrum is the opposite, soulless at best. (In the artistic sense.)
If I did consider a machine sentient, my ethics would then develop an imperative to treat it as such. I’ll take a sledgehammer to a printer, but I’m going to show an animal care and respect.
Cells within cells.
Interlinked.
This post is unsettling. While LLMs definitely aren’t reasoning entities, the point is absolutely bang on…
But at the same time it feels like a comment from a bot.
Is this a bot?