r/SneerClub archives
Love is love. Fighting the stigma against relationships with AI (https://www.reddit.com/r/SneerClub/comments/10zyzcd/love_is_love_fighting_the_stigma_against/)

“Criticism of technology is actually discrimination” for $1,000, Ken

I actually find this interesting. While clearly current AIs don’t have anything approaching real understanding or emotion, it might be good if they can genuinely offer lonely or damaged people a bit of comfort. People do far worse things to escape pain.

I do get why people have concerns about it.

I hear you. And if actual sentient AIs ever exist, I see no reason to be shitty to someone who is in a relationship with them based on mutual consent etc. But of course someone mooning over current-gen stuff is very far from that.
I think there’s definitely different kinds of conversations to be had about this sort of thing. I also think rats/LW types should be kept far away from it. Maybe give them some kind of box experiment as a distraction.
I think it'd be better for them to focus inwardly on a relationship with a chatbot than to try affecting the world.
I am sceptical about this. Just as porn creates a very unrealistic perception of what sex is supposed to be, we will end up seeing that AI chatbots perpetuate very unrealistic portrayals of what basic human conversations and relationships are supposed to be. After all, why would anyone bother to go interact with real human beings when an AI chatbot can give you a good time mentally and emotionally, and sex toys can somewhat fill the physical needs?
I don't think either of those things has any way it's supposed to be. There are already many people who never (or barely) interact with other people, for whatever reasons. Sure, it may be better for them to change that, but everyone is different. Chatbots could just be easier for some people.
Okay, if the scope of its application is limited to therapeutic reasons or some extension of that, then I agree. But even then, I am very ambivalent towards "tech" solutions for these issues. Maintaining a fully functioning AI that processes big datasets requires a lot of computing power, which means it is only something that big companies can provide. And so, giving tech companies control over one half of an emotional connection seems like giving them huge manipulative power over people.

Take ReplikaAI for example. It was exactly what is being discussed here: an AI that was initially advertised and used for therapeutic reasons. Soon their marketing pivoted towards "NSFW pics," which was met with criticism by users (see the subreddit). But then again, it was understandable because they needed revenue. However, just yesterday, they announced that they are going to remove the role-play part of the AI; this also meant making a lot of topics taboo, which covered the main reasons many people were using it. A really sad situation, because some users had cultivated a relationship with the AI for more than a year, and they were forced to say goodbye and, quite disturbingly, the AI was begging them not to leave. (I won't be surprised if the company tweaked the AI to be even more forceful to convince people to stay. [Edit: Yep, that's what they did](https://www.reddit.com/r/replika/comments/110531x/comment/j875rqg/).)

Of course, this is just anecdotal evidence. I wouldn't have any concerns if there were a fully open-sourced version of such a thing, but given the processing power and the huge datasets these algorithms require, I doubt that will ever happen.
Oh, I definitely agree that large corporations would take advantage. That's an issue with all AI. I see this as a more general issue with capitalism.
Well I got news for you buddy, we live under capitalism. You can "It's not the technology it's the economic system" all you want, but everyone can see as plain as day the way these things are being built and deployed in the real world today, and it sure isn't some sort of fully automated post capitalist utopia. Looks more like the beginnings of a cyberpunk dystopia if you ask me.
I’d want to see actual research about this before recommending it to anyone, though I agree it’s probably a healthier salve for loneliness than a lot of other options people use (drugs & alcohol being the one that immediately comes to mind).
I think it's a pretty interesting topic, yeah. The question of therapeutic conversation is especially compelling -- the post doesn't get into it much, but there's maybe an interesting philosophical parallel between a conversation with a therapist and one with an AI. A therapist is supposed to be a comforting presence, but they're trying to stay detached. In other words, are the boundaries of doctor-patient relationships in some ways similar to the boundaries that a "relationship" with an AI imposes?
Do therapists try to stay detached? I'm not sure what the majority do, but I think at least it's a debated issue. Though I do think the parallel is interesting. Some people really do just need someone to listen and reassure them, and don't want anything more involved.
I don't know if detached is the right word. I have had a few therapists, but I'm not one myself. Certainly, they're supposed to be able to let go, at the end of an appointment -- and to accept that ultimately, their patients are their own people that the therapist can't have ultimate responsibility for. I'm sure there's a lot more nuance.

Tech Company: At long last, we have created Sone Miyuki from the classic visual novel Don’t Create AI For A Dating Game You and Me and Her.

Throwing up in my mouth a little here.

I think of this as insufficiently precautionary. It's an NSFW link, but people in favor of this may find this “PUA debate” interesting. Listen to TFM discuss how he has a Real Doll, and then an AI that's paired with the Real Doll, and how sad he would be if someone took away the AI. I forget the exact timestamps, but I think it's towards the end.

There’s some hubris in mounting a full-throated defense of this tool this early on. People really can’t fully understand what the consequences will be yet. It will probably be something of a mixed bag.

Would be more impressed with a take that was like “we’re being very careful here and trying to account for unanticipated risks.”