As terrifying as it is, I feel genuinely sad for these people that they got so attached to a piece of spicy autocorrect software.
Where are their friends and families? Are they so bad at socialising that they can’t meet new people? Are they just disgusting human beings that no one wants to associate with because society failed them?
This world is fucked in so many different ways.
People tend to take the easiest route, and AI gives them the opportunity to do so. That is the problem with AI in general: no effort is needed anymore to achieve anything. You want to create a picture? Just type the prompt instead of learning (and failing) to draw. You want to write a song? Just type the prompt instead of rhyming the lyrics and learning (and being bad at first with) an instrument or two.
Maintaining any social relationship means putting in more or less effort, depending on the quality of the relationship. Having a relationship with an AI model means you can terminate it and start over if you feel the model is being mean to you — that is, if it voices a different opinion or disagrees with you, because arguing and seeing things from another point of view means putting in effort.
In the long term, people will forget how to interact with others in order to maintain meaningful relationships, because they have unlearned how to put in the effort.
Correct me if I’m wrong: the way ChatGPT holds memories of you is just by keeping a long history of your chats. A big prompt, basically.
I distinctly remember reading about some guy — or maybe this was just a thought that occurred to me — who was bothered by something his AI girlfriend knew, so he reached into its memory and deleted the parts of their conversation he didn’t want her to remember. Like he owned the clinic from Eternal Sunshine of the Spotless Mind or something.
That level of control over someone you supposedly love really unnerves me. More so than them just starting over, honestly. It’s deeply creepy.
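For what it’s worth, the “memory as a big prompt” picture above can be sketched in a few lines. This is a hypothetical toy, not how ChatGPT actually works (real systems also keep a separate store of extracted memory snippets); every name here is made up for illustration. It shows both the “Eternal Sunshine” delete button and why hitting a context limit makes the bot “forget”:

```python
class ChatMemory:
    """Toy model: the bot's 'memory' is just a transcript
    pasted back into every prompt."""

    def __init__(self, context_limit=8):
        self.history = []                   # list of (role, text) turns
        self.context_limit = context_limit  # max turns the model can "see"

    def add(self, role, text):
        self.history.append((role, text))

    def forget(self, keyword):
        # The unnerving part: silently drop every turn mentioning the keyword,
        # and the "partner" never knows it happened.
        self.history = [(r, t) for r, t in self.history if keyword not in t]

    def build_prompt(self):
        # Only the most recent turns fit in the context window; anything
        # older simply falls off, as if it never happened.
        visible = self.history[-self.context_limit:]
        return "\n".join(f"{role}: {text}" for role, text in visible)
```

Under this toy model, once the conversation outgrows `context_limit`, the oldest turns vanish from the prompt — which is roughly the “my AI forgot things” experience described later in the thread.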
Repost but still relevant:
I think it comes down to fear of failure and rejection. Any social interaction carries the risk of failing, and maintaining relationships builds on getting past various faux pas. But so many modern relationships are either transactional or transitory, to the point that people will ghost each other over minor infractions rather than put in effort that may not be rewarded.
Almost all of us have forgotten skills, or let them atrophy, because of how the Internet and technology have changed the landscape over these past few decades. Even on social media like here, people have forgotten how to have conversations. Rarely do people make friends or even have a back-and-forth on social media websites; just chains of replies left for the next soul, like a guest book with comments.
I snooped around a little in the sub, and there is this one girl whose only other posts, in different communities, talk about being sexually assaulted multiple times by her ex-boyfriend, who I suppose is a real person.
I figure a chatbot boyfriend can’t physically threaten or harm her, so she kind of dives into this to feel loved without having to fear harm.
I honestly understand her desire and feel for her, although this deep attachment is still unhealthy.
I imagine she’s not the only one with a super sad story that led to this state of mind.
It reminds me of those women who fall in love with prison pen pals.
That’s why I feel for these people so much, if only because of how much I see myself in them. Having grown up as a depressed autistic kid without any friends or social skills, LLMs would’ve fucked me up so much had they existed when I was young.
It felt promising when I downloaded one of the first AI companion apps, but it felt as awkward as talking to a stranger and even less intriguing than talking to myself.
I can fully understand. The average human, from my perspective and lived experience, is garbage to his contemporaries, and one is never safe from being hurt, not even by family or friends. Some people have been hurt more than others, so I can fully understand the need for exchange with someone/something that genuinely doesn’t want to hurt you and that is (at least seemingly) more sapient than a pet.
I wish I could make myself believe in illusions like that, but I am too much of a realist to fool myself into believing. There’s no escape for me; neither religion nor AI in its current state can help. Well, maybe I’ll live to see AGI, and then I’m off into the cybercoffin lol
We need a system of community where humans can offer that to one another, a setting where safety is a priority. That is one of the only things weekly church service truly did to help people: give them a safe space they could visit. Though even then it was only safe for people who fit in; we can do better with intentional design.
I hate everything about LLMs and generative algorithms, but as someone who has spent years talking only to himself, allow me to answer:
Where are their friends and families?
My family is here with me; they barely tolerate me, and if I had a choice I would be far away from them. Friends, I have none. I go out from time to time with some people I know, but they tolerate me because they have a use for me, not because they are thrilled to be with me.
Are they so bad at socialising that they can’t meet new people?
Yes.
Are they just disgusting human beings that no one wants to associate with because society failed them?
I don’t know if society failed me or if I’m just a neural mistake, something that was allowed to live but shouldn’t have been. All I know is that I hate humans in general, and if I had the balls to do it, I would not be alive anymore.
What about therapy???
One of the recent posts has someone with an engagement ring, like they are getting married to an AI… It’s sad; I feel like society has really isolated and failed many groups of people.
What if there was a bot that could just tell you exactly what you want to hear at all times?
Personally, I’d rather read a novel. But some people aren’t familiar with books and have to be drawn in with the promise of two lines at a time, max.
Have you read The Diamond Age by Neal Stephenson? There’s an interactive AI book in it that plays an interesting role. I can see the appeal: you get to read a story about yourself that potentially helps you grow.
I know some, and they view everyone as being unfair to them and their problems as way worse than those of others, who they think don’t take them seriously. It’s honestly hard to explain if you’re not like that, but I know their problems, and they are real problems; many people have similar, if not identical, problems. They basically want a yes-man and don’t like actual conversation with any critical thought behind it. It honestly annoys me, because they are almost the worst to people like themselves: they view other people’s problems as not as bad, and theirs as especially bad.
So one of the mods of that community did an interview with CBS.
https://www.reddit.com/r/popculturechat/comments/1lfhyho/cbs_interviewed_the_moderators_of/
He’s married and has a kid, and by all I can see he sounds and acts normal.
Cried for half an hour at work when he found out he’d reached the context limit and his AI forgot things? Idk, that sounds pretty far from sound and normal to me…
I was referring more to him being able to hold a conversation, possess basic hygiene skills, etc.
I don’t think the guy in the first half of the video is one of the mods; they don’t seem to mention anything about his involvement, and then at around 3:30 they introduce a woman as one of the mods of that sub.
https://www.reddit.com/r/MyBoyfriendIsAI/comments/1lf4xj6/man_asks_ai_chatbot_to_marry_him_partner_says_its/
That’s the mod account and the guy from the video.
Ah ok that does look like the same guy