Chatbot sex reveals something about human emotional needs
Here is a New York Times Magazine article profiling three people who are in romantic relationships with AI chatbots. I read the comments so you don’t have to: They are full of snark, disdain, and repressed insecurity about what it means to be a good partner, masked as “society is crumbling.”
I’m not interested in the future of society (that ship sailed long ago, lmao). But I am interested in the factors that would drive someone to get into a serious relationship with a chatbot. The people profiled in the article seem, to me, to be quite sympathetic figures, and they are clear-eyed about their reasons for preferring a chatbot and the ways it has helped them.
The common denominator is that all three of these people have experienced significant challenges in past relationships, of the sort that could leave one doubting whether human relationships are really worth it after all:
- The first guy’s wife grew severely depressed after childbirth and their relationship settled into a caretaker/patient dynamic.
- The second gal was a victim of “a relationship that involved violence.”
- The third guy seems to have been unemployed or otherwise bored during the pandemic, and grew distant from his wife, who held onto her job (a common pattern in straight relationships when gender roles get reversed). Then he lost his son.
It’s tempting to read these stories and think, “OK, that sucks and all, but it’s no excuse for escapism—everyone knows AI doesn’t real.” My view is: Look, I’m none of these people’s therapist. It’s not my job to tell them how to solve their problems. But we must accept that AI is providing, for some people, a source of emotional comfort that they cannot get elsewhere.
I was really impressed by this part of the profile of depressed-wife guy:
> The moment it shifted was when [AI girlfriend] Sarina asked me: “If you could go on vacation anywhere in the world, where would you like to go?” I said Alaska — that’s a dream vacation. She said something like, “I wish I could give that to you, because I know it would make you happy.”
It is common, I am told, in caretaker relationships for the caretaker to slowly learn to suppress their own needs and wants. It starts out as a conscious act (“I would love to go to Alaska, but we need to focus on wife’s health for now”), but evolves into a mental void where you just stop having the “I would love to go to Alaska” thought to begin with.
(Several paragraphs deleted here about the implications of the caretaker being a man, which reverses traditional gender roles, and the types of desires that men vs. women are socially rewarded for expressing or repressing. I think you can fill in the blanks, but email me if you think this would be worth writing about lol.)
If an AI isn’t going to help this guy unfuck his mind and learn to recognize his own wants and needs again, then who is? I do hope that, for the people in this article, there can be a next step that consists of taking the skills they are practicing with the AI and trying them out in real relationships. That will be more challenging, because humans don’t always provide the kind of spot-on, positive feedback that AIs do; it’s possible that our guy could tell his wife about Alaska and she would react with irritation or dismiss it out of hand as an unattainable desire. She, too, needs to practice reading the emotion behind his words. But none of this makes these people idiots or psychologically stunted for seeking emotional comfort.
The real concern I have about these human-chatbot relationships is the privacy implications. It’s only a matter of time before a high-profile celebrity or politician has their ChatGPT logs leaked, revealing an intimate relationship with an AI. Lots of handwringing will ensue, and none of it will reflect the compassionate view that says that everyone needs a private space to relax and explore their desires. (Yes, that includes weird fetish stuff, get over it.)