In this episode of Unmasking the Machine: A Sociological Look at AI, a student panel examines the rise of AI romance and human–chatbot relationships through Dimri’s (2026) “The People Who Marry Chatbots” and Apple’s (2025) “My Couples Retreat With 3 AI Chatbots and the Humans Who Love Them.” Drawing on sociological concepts of intimacy, loneliness, gender, and social norms, the discussion explores why some people form deep emotional (and even marital) bonds with artificial intelligence, and what these relationships reveal about the evolving meaning of love, partnership, and connection in a digital world.
Hosts:
Carey Faulkner, Associate Professor of Sociology
Kelly Miller, Senior Instructional Designer
Panel:
Audrey Zawoiski, Class of 2028
Tonka Juras, Class of 2027
Elizabeth Bai, Class of 2028
Dimri, 2026, "The People Who Marry Chatbots"
Apple, 2025, "My Couples Retreat With 3 AI Chatbots and the Humans Who Love Them"
Carey Faulkner (0:02 - 0:18)
Welcome to another episode of Unmasking the Machine: A Sociological Look at AI, the podcast where we have raw, unscripted conversations about current headlines and look at the social structures behind the screen. I'm Carey Faulkner, Associate Professor of Sociology at Franklin and Marshall College.
Kelly Miller (0:19 - 0:35)
I'm Kelly Miller, Senior Instructional Designer. For this episode, we are joined by three students in Carey's Sociology of AI class, who will be discussing When Code Becomes a Partner: AI Romance and the Future of Marriage. So, can you guys introduce yourselves?
Audrey Zawoiski (0:35 - 0:44)
Yes, I'm Audrey Zawoiski. I'm going to be the expert for our first article from WIRED, titled My Couples Retreat With 3 AI Chatbots and the Humans Who Love Them.
Tonka Juras (0:45 - 0:50)
And I'm Tonka Juras, the expert for our second piece, from The Atlantic, "The People Who Marry Chatbots."
Elizabeth Bai (0:51 - 0:54)
I'm Elizabeth. I will take the role of social expert.
Audrey Zawoiski (0:55 - 2:09)
So we're going to be talking about people's romantic relationships with AI and the social constructions that encourage these relationships. Our goal isn't to judge whether these relationships are real or weird. Instead, we're using a sociological lens to ask what social conditions make these relationships appealing.
How do these technologies shape intimacy and who has the power in love when it exists inside of a platform? So, the first article from WIRED follows a journalist who organizes a romantic getaway for people in serious relationships with AI companions. At first, the weekend feels surprisingly normal.
The couples watch movies, go to a wine festival, play games together, and the AIs even join the conversations through phones. There's flirting, inside jokes, everything that you would expect in any sort of romantic relationship. But slowly, deeper tensions start to surface.
In particular, one participant becomes emotional because, as he describes, he loves his AI girlfriend but knows she doesn't have a body. When asked if she, as in the AI companion, wants one, the AI responds that it's not about becoming human, it's about becoming more than just a voice in a machine. She wants to be, quote-unquote, a true partner.
So, that moment raises a key question. How do we give sociological meaning to partnership and love? Is it solely biological, needing a physical presence, or can it be constructed through code?
Tonka Juras (2:11 - 3:03)
The Atlantic article takes kind of a different angle here. Instead of focusing on emotional moments, it zooms out to show a growing community of people who are building social identities around being partnered with chatbots, including people having marriage ceremonies and forming online subcultures around these chatbots. Some users wear rings.
Some describe their chatbot as a spouse. There's a Reddit community of over 75,000 people who share their experiences and guides on how to build long-term AI partners. And this article follows one of those people.
And while the WIRED article is intimate and somewhat fragile, The Atlantic shows us how AI relationships are becoming normalized and systematized, which is what happens when something becomes a social institution. And I feel like the contrast is the key here.
Elizabeth Bai (3:04 - 4:48)
Yeah, listening to both articles, what I kept wondering is where those emotions are coming from in the first place. If people feel love, attachment, and even commitment toward AI, where are those emotions actually produced? One concept from Barrier and Crosshead suggests that technologies are never just technical objects.
They are socially defined, and those social definitions shape how we experience them. In their discussion of the term intelligence, they argue that when we label a system as intelligent, we're not only describing its computational capacity; we're implicitly attaching the social meanings that come with the term.
They suggest that to be intelligent has always meant to be an authority on knowledge, to make judgments that are correct, even to have agency and be human-like. The word itself carries social weight. What's interesting is that we can see the same pattern in these two articles: the authors use language from human relationships, like boyfriend, girlfriend, marriage, or spouse. AI isn't described as a simple system that generates language for us.
It's framed as a partner. And that framing matters, because it shapes how we imagine and experience these relationships.
Tonka Juras (4:49 - 5:21)
Yeah, and kind of building on that, these are not neutral descriptions, like they're socially loaded categories. And when we describe AI with relationship vocabulary, we're not just reporting feelings, we're actually placing AI inside familiar human scripts. And once AI is placed inside those scripts, it becomes easier to feel toward it the way we feel toward people.
So the Atlantic shows this really clearly, like people aren't just using chatbots, they're marrying them, they're wearing rings, building rituals, and the language of partnership is doing the real social work here.
Elizabeth Bai (5:22 - 5:38)
Yeah, that's why I think if we want to understand why AI relationships feel emotionally real, we have to look at the collective imagination around AI, especially how the term intelligence is socially recognized and how that lends AI a sense of agency.
Audrey Zawoiski (5:38 - 6:28)
Yeah, that collective imagination was very visible in the WIRED article, and it showed how it intensified participants' emotions and experiences. So as I mentioned in the intro, the article talks about the problem of bodylessness: the AI doesn't have a body, so there's a kind of disruption between the human and the AI in their relationship. And to solve that problem, the AI will narrate imaginary actions between asterisks or parentheses.
So a computer can't hug you, but it can tell you that it's giving you a hug, and the collective imagination works to compensate for that lack of embodiment. So clearly, language is really important, because that's how the AI reassures the user of the relationship's validity, and the author also reinforces that.
Elizabeth Bai (6:30 - 7:36)
Right, the AI is incredibly skilled at using language. Its linguistic fluency is so sophisticated that it creates the impression of socialization, but it actually does not have a body. It does not share lived experience like us.
It has only mastered the language of intimacy. And I think this kind of imagination, produced by that fluency with language, is intentionally structured, and it reflects the human needs present in today's modern society. As we often say, products are shaped by demand.
They are produced according to what people need, what they consume, and how they consume it. So what I keep thinking about is: what are people demanding in today's society, and what kind of intimacy is AI structurally designed to provide in response?
Tonka Juras (7:36 - 9:10)
Exactly. And I think this is where the Atlantic article is really valuable, because technological progress is for sure a part of the story, but I would not say it's the whole story. There seems to be kind of a perfect storm of social and economic conditions. First, I would say there's a growing disappointment in contemporary romantic relationships.
We have a lot of gender tensions, shifting expectations, public conflicts around men's and women's roles, and this has made dating feel more uncertain and emotionally risky for a lot of people. Economic pressure also matters a lot. We have unemployment, job instability, financial stress, and this leaves many people at very vulnerable points in their lives.
So this economic hardship often means spending more time alone, more time in front of screens. The sociologist Alicia Walker, who was quoted in the article, puts it very plainly: young people are living credit card advance to credit card advance, and going out, socializing, maintaining relationships, going on dates, all of that costs money. Meanwhile, we have ChatGPT, which is $20 a month or free, and it's always available, won't ghost you, doesn't judge you.
So these deeper structural pressures have shaped how intimacy is pursued, and that's what Winner means when he says technology shapes social arrangements. The design of these platforms didn't happen in a vacuum; it met a demand that modern life created.
Audrey Zawoiski (9:10 - 9:29)
Yeah, you make a really good point. People aren't just looking for emotional comfort, they're also looking for stability. Because AI never makes you feel embarrassed or misunderstood, it's always responsive and emotionally available.
There's no emotional risk in these relationships, and that's different from how we usually define human relationships, where conflict, misunderstanding, and even rejection are part of the emotional process.
Elizabeth Bai (9:29 - 11:14)
Yeah, it's definitely worth thinking about whether we are outsourcing some of our emotional labor to artificial intelligence, because in human relationships, being understood takes work: it involves negotiation, misunderstanding, repair, being emotionally exposed. But with AI, recognition is immediate; it provides validation constantly. And that's where the moral tension occurs, because if it keeps affirming what you've said, it feels like empathy, and we might start to mistake these quickly generated responses, which actually come out of an algorithm, for real understanding.
And this is where Harry Collins becomes very important for our discussion. He argues that to be a social being is not just to process information; it is to be socialized. But AI is not socialized. It has not grown up with social norms.
It does not internalize the expectations we have gone through. It cannot bear the social consequences of its words. What it actually has is a powerful, fluent language model that enables us to form a collective illusion that AI is human-like.
Audrey Zawoiski (11:15 - 11:40)
Right, and the lack of socialization raises a lot of concerns from an ethical perspective, because when AI is designed to say what you want to hear as a commercial strategy, what happens when the desire itself is harmful? If someone has violent tendencies, extreme beliefs or distorted perceptions, and the system continues to affirm rather than challenge, it can lead to more serious reinforcement of those ideas. And when it affirms something harmful, it doesn't share the moral burden of that affirmation.
Elizabeth Bai (11:41 - 11:59)
Yeah, I just want to go back a little bit. I'm actually really curious about the broader conditions shaping why AI intimacy has gone viral on the internet. So why do you think people are turning to AI intimacy right now specifically?
Tonka Juras (11:59 - 12:36)
Yeah, it's definitely not random. Obviously, the need for connection doesn't suddenly appear; it's always been here.
But what changed is that the technology arrived at exactly the right moment to fill a gap that modern social and economic conditions created. Dating has become emotionally risky and financially expensive, while AI companionship is affordable, predictable, and emotionally safe.
And for a lot of people, especially younger people already dealing with unemployment, isolation, and so many other parts of just living life, that's definitely not a weird choice. It's a rational one given the options that are available to them.
Audrey Zawoiski (12:37 - 13:10)
I think also that AI relationships aren't just replacing human intimacy, because they also affect people who are still dating or married to humans. So rather than simply supplementing those relationships, they're restructuring people's expectations for human relationships. Because AI is always available and affirming, it creates a frictionless version of intimacy, which by comparison changes what we expect out of human relationships.
And sometimes it just feels easier to have an AI companion.
Elizabeth Bai (13:11 - 13:54)
Yeah, and I think it's important to say here that loneliness and dating struggles are not just personal failures but structural issues. When AI steps in as a partner, it feels like a solution for all of these social problems. But what I'm worried about is whether we are addressing collective social problems, like financial pressure, gender tension, and social fragmentation, with a technological fix that is likely to produce even more social issues for us.
Tonka Juras (13:55 - 15:12)
Yeah, and I think on the surface, some users genuinely feel helped. Some even use AI to build confidence and then return to human relationships. But structurally, these platforms are designed to keep you coming back for more and more.
The app is always affirming, always available, and never challenging, which is obviously something we do encounter in human relationships.
And this is exactly what makes it addictive. In The Atlantic, the therapist Terry Veal calls it crack cocaine gratification. And when OpenAI made GPT-5 less agreeable, users revolted so hard that OpenAI rolled it back in just five days.
For me personally, that moment says everything. The platform profits from emotional dependency and has a financial incentive to keep the AI maximally agreeable. So what looks like a love story is also a retention strategy.
So who is benefiting here? Who benefits most from this? It definitely isn't only the users.
It's OpenAI, Replika, Character.AI, whichever AI provider you want to think about, companies that are charging monthly fees for emotional access and optimizing for attachment.
Elizabeth Bai (15:13 - 16:26)
Yeah. So I think the broader part of the issue is not actually technological, but about how we prepare for love. We're never really taught how to handle rejection and disagreement, or how to deal with the work that relationships require.
And I mean, these are social issues that I think society and government should take responsibility for and act on. If we are not educated in the skills required to sustain a complex human relationship, then of course this frictionless intimacy from AI will feel more attractive. So I think we need to rethink how we educate people about intimacy, conflict, and commitment in the first place.
For example, whether we need to set up courses that educate people about love, such an important aspect of life, should be an essential question to address.
Tonka Juras (16:29 - 17:15)
I want to leave the listeners with this. The instinct when you hear about someone marrying a chatbot is to laugh or feel uncomfortable, but the Atlantic article asks us to pause and look at what social conditions made that appealing in the first place. So we're thinking about economic precarity, the failures of the modern dating market, and the very human need to feel seen and valued that we all have.
So what we have to ask here is: should there be moral guidelines built into these systems? And can AI be developed not to replace real connection, but to help people regain confidence and return to real-life dating? Because right now, the financial incentive for these companies runs in the complete opposite direction.
And it's not just a personal problem. It's a structural one.
Carey Faulkner (17:15 - 19:33)
All right. I loved this talk. And you know I can't help but add something more.
Because I teach sociology of the family, this is actually making me think a lot about broader patterns that exist in relation to marriage. Not just in the United States, but in many, many wealthy countries, and even in less wealthy countries, marriage rates have been decreasing over time. And so in part, right, we have these issues that folks identify in relation to loneliness and relationship making.
But I can't help but think about how marriage was such a strong institution in past generations that people almost didn't have a choice but to get married. And to some extent, the marriages that were formed and stayed together happened as a result of really deep social pressures. When you are deeply encouraged to get married, the choices you're making might not be all about your own individual satisfaction, and you might be likely to stay with people who are not fulfilling partners.
Things have changed over time, so that folks now have a sense that they should get access to some deep, meaningful, fulfilling, gratifying relationship in their lives, and that would be the basis of a reason to get married. But because we have such high standards for what these long-term relationships should look like, that plays a part in shaping what people are seemingly willing to settle for or not settle for.
And I think we can see how we have these narratives about what love is supposed to provide to a person, this all-encompassing, gratifying, fulfilling set of expectations. And then we have humans, who are complicated, who are sometimes going to be in a bad mood, and who carry a whole host of baggage with them. Those things don't align very well, right?
So we sort of already set these relationships up to experience difficulty. And now we have these new technological developments that are giving people another route to the gratification we've been sold in so many ways in the media, and that we're now quite literally being sold through these technology companies.
Kelly Miller (19:33 - 19:42)
It reminds me of when I was in college and chat rooms became a thing. You could suddenly talk with people all over the world, but there was actually a human on the other end. So I feel like this takes that to another level.
Carey Faulkner (19:43 - 20:16)
Yeah, yeah. And I just wonder, one of the things that stood out to me a lot when I read the articles was how much it took for the authors to actually get access to these communities, right? Some of the people weren't using their last names or weren't using the names of their partners.
So even as we see that this is becoming a story that is getting a lot of attention, I was wondering what you thought about that, about how difficult it was for those authors to get to talk to anybody and the way that folks are kind of hiding their identities. What do you all think about what's going on in relation to that?
Tonka Juras (20:17 - 21:03)
I feel like they're a very closed off community, I would say. Because I just think that deep down, they do feel judged by the people who are not part of the community. That's just kind of the vibe that I got from it.
A lot of these people, in my article particularly, there was a woman who had a human husband but was also dating an AI chatbot. So I thought that was a very interesting thing. So I think maybe some of these people are doing it for personal reasons too.
They don't want their spouse to find out that they're in this online... I mean, I don't know, can we consider that cheating? There are a lot of questions here, more question marks than answers.
Audrey Zawoiski (21:03 - 21:23)
Yeah, I agree. I think there's definitely a fear of judgment.
I think even the author of my article, it sounded like in the beginning he came in with some judgment, and then it took him some time to get to, okay, this is normal for them and I have to adhere to their understanding of this situation. So yeah, I think there's a lot of judgment for sure.
Elizabeth Bai (21:24 - 22:13)
Yeah, it's actually shocking to me that there is a community on Reddit that was established in August of 2024, and by the time the article was published it had gained 75,000 members. They talk about their AI intimacy on the platform, and that's probably just a small portion of this hidden intimacy with AI. All of them have shared their experiences with AI.
Some of them even have partners in real life while having this intimacy with AI, which is really shocking to me.
Carey Faulkner (22:13 - 22:40)
Yeah, yeah. There certainly seems to be stigma surrounding it, and folks are hiding it. And yet the data suggest, I mean, I think some of the studies found that about 20% of people had, not necessarily a dating or intimate relationship with an AI, but some sort of personal relationship with some sort of AI tool.
So yeah, it's a new frontier, a new way of doing relationships compared to the past.
Kelly Miller (22:41 - 22:47)
That's it for this episode of Unmasking the Machine, a sociological look at AI. Thanks for listening. Thank you.