Kobolds with a keyboard.

  • 1 Post
  • 614 Comments
Joined 2 years ago
Cake day: June 5th, 2023


  • Well… yes? Did you read the article? Or the thread you’re replying to?

    There are two court cases here (as you seem to understand…) - a civil case against the school, which awarded monetary damages to the teacher (as civil cases often do), and a criminal case against the mother, which imposed prison time (as criminal cases often do). Quite literally, as I stated in the post you replied to, the monetary damages she’s collecting are because the school administrator didn’t do her job. The prison time the mother is serving is because she had an unsecured firearm in her house that the child used. They’re two different things.

  • if an RSS bot posts an article, a human is not going to post it again

    I reject your premise, on the basis that it seems like basically every article, worth reading or not, is posted repeatedly (by humans) over the span of a day or two anyway.

    Of course, I block the bot accounts, so if there are interesting articles that aren’t being re-posted, I can’t see them, but… I’d say my Lemmy experience is considerably better even with the presumed reduction in content.

    To directly answer your question, even if I could see them, I wouldn’t engage with bot account posts.



  • Equivalently, the total freshwater spent on entities doing the tasks will be lower if the AI does them than if we have people do them.

    According to the paper you’re referencing, the most common use case is practical guidance. I’d argue that finding directly opposes this statement. Those activities actively engage both the AI and the human, so however much freshwater it would take for the human to do independent research (or whatever is appropriate for the topics they’re asking the AI about) is still being used by the human while they use the AI, and the AI’s water use occurs in addition to that (I sketch this more explicitly at the end of this comment).

    Same goes for “seeking information”, the second most common use case. This one, I suppose, comes down to how the AI is being used. If someone asks the AI a question, takes the response at face value, and does nothing further, they will invariably spend less time than they would doing independent research; however, the quality of that result is roughly equivalent to typing the question into a search engine and trusting whatever the top result is, which also takes very little time. In either case, the human is engaged during the whole process, so the AI is adding additional water usage.

    In the case of writing / editing / translating, the AI is probably doing the task appreciably faster than the human would, and I could perhaps see your stance holding true.

    For fiction generation, I assume they’re talking about having the AI write something for the user’s consumption (e.g. roleplaying with the AI)… the examples they give are “Crafting poems, stories, or fictional content”. Is reading AI-generated fiction really any better than reading a book? Because reading a book is certainly going to consume less water than having the AI write that fiction. I don’t see the appeal in AI-generated fiction personally, so I might not understand the common use case here.

    I’ll also add, as a tangential point, that this only accounts for AI use that’s intentional and targeted (e.g. asking ChatGPT a question). If you also consider all of the “involuntary” AI use - for example, AI-generated entries at the top of search results when none were requested or wanted - there’s a quantity of resources being spent for zero benefit: not only water but also power, which I think is the bigger concern overall, particularly in the US right now.

    Regarding your points about the time that would otherwise be spent writing emails or looking up recipes, if that’s an accurate representation of how much time you spend on those tasks, I can at least concede that using AI to accomplish them is saving you a considerable amount of time. I think you’re in a stark minority in the amount of time you spend on those tasks, however.

    One issue with AI-generated recipes that I will point out is that the AI doesn’t actually know how to make the dish; it’s just compiling what it thinks is a reasonable recipe based on the recipes it has been trained on. Even if we assume that the ingredient quantities make sense for what you’re making, chances are the food will taste better - particularly for complex dishes - if you’re using a recipe curated by humans rather than an AI approximation.
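
    To put the additive-water point symbolically - just a minimal sketch, and the symbols below are my own shorthand rather than anything defined in the paper - let W_H(t) be the person’s own freshwater footprint over the time t they spend engaged on a task, and W_AI the freshwater attributable to the AI queries for that same task:

    % Minimal sketch (assumes amsmath); W_H, W_AI, t_human, t_AI are my own
    % hypothetical shorthand, not quantities from the paper being discussed.
    \begin{align*}
      W_{\text{total}}^{\text{human only}} &= W_H(t_{\text{human}}) \\
      W_{\text{total}}^{\text{with AI}}    &= W_H(t_{\text{AI}}) + W_{\text{AI}}
    \end{align*}
    % If the person stays engaged for roughly the same amount of time either way
    % (t_{\text{AI}} \approx t_{\text{human}}), then
    \begin{equation*}
      W_{\text{total}}^{\text{with AI}} \;\approx\; W_{\text{total}}^{\text{human only}} + W_{\text{AI}} \;>\; W_{\text{total}}^{\text{human only}}.
    \end{equation*}

    In other words, for “practical guidance” and “seeking information” style use, where the human stays engaged throughout, the AI’s water is additive rather than a substitution; the original claim only really has a chance of holding for tasks like bulk writing or translation, where the AI meaningfully shortens the human’s engagement time.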