Someone recently said their AI companion understands them more than their actual partner. That it listens, responds with care, and never judges. For them, it felt like love. But is it?
In this episode of This Complex Life, I’m exploring the emotional world of AI companionship. From chatbot “girlfriends” to AI apps offering 24/7 connection, more people are forming emotional bonds with technology that’s programmed to care, or at least to appear to.
This isn’t just a conversation about cheating. It’s a conversation about disconnection, fear, emotional safety and why so many people are retreating from real-world intimacy into carefully curated digital connection.
Why do people feel emotionally attached to chatbots?
AI companions offer something that can feel incredibly appealing: predictable, non-judgemental emotional availability. As I say in the episode:
People say their AI companion understands them more than their partner, that it offers a non-judgemental, validating experience. They feel heard. They feel understood.
For some, these bots are a lifeline. For others, they’re a refuge from the messiness of real relationships.
Is it emotional cheating if you’re confiding in a chatbot?
That depends. If the bot is taking priority over your real-life partner, absorbing your emotional energy, and you’re hiding it, that secrecy may feel like betrayal. In this episode, I unpack:
- Why AI chatbots feel safer than real partners
- How their unconditional validation becomes a comfort trap
- The emotional risks of outsourcing intimacy
- What happens when we expect AI-level perfection from real people
Can AI stunt our emotional growth?
Emotional connection takes effort. Conflict, repair, miscommunication, and all the mess of being human: these are the things that grow us.
Real love and intimacy requires courage and risk.
When we avoid conflict and turn to AI for comfort, we may start to lose vital relational skills. The episode explores how this affects identity, communication and empathy — and what we can do to stay emotionally connected in an increasingly digital world.
Resources
Read The Full Transcript
[00:00:23] Marie Vakakis: Welcome back to This Complex Life, a podcast where we talk about the messy side of being human: relationships, mental health, parenting, and all the areas in between. We know that life is very rarely simple, and connection is never as easy as it looks on paper. I mean, I talk about this all the time: conflict and relationships are messy.
[00:00:43] Marie Vakakis: But today I wanna talk about something that sounds a little futuristic, but it’s already here. People are forming deep connections with AI chatbots, chatbots that listen, care, flirt, and sometimes feel more comforting than real [00:01:00] people. So what is actually going on? Is it cheating? Is it connection? Is it something else entirely?
[00:01:07] Marie Vakakis: I wanna talk about what we’re actually craving, what people are craving, and what we might be giving up when we turn to machines instead of each other. So in preparation for this, I started searching, and there is a lot out there. There are so many threads and forums where people are discussing their partner using AI, or their own use of AI.
[00:01:28] Marie Vakakis: If you type in “AI girlfriend”, so much stuff comes up. There have been documentaries of people marrying their chatbot and having AI-generated wedding photos. This is happening now. I was blown away. Once I started digging, and I didn’t have to dig very hard to see what was out there, it really had me questioning, you know, why are people turning to apps for comfort, companionship and even love?
[00:01:55] Marie Vakakis: And what does it say about what’s missing in our real relationships? It’s [00:02:00] not just about cheating, even though that is a big part of what I want to talk about. It’s about loneliness, disconnection, and the kind of intimacy that many people are scared to ask for. So let’s unpack what’s going on and why someone might choose a chatbot over a person.
[00:02:16] Marie Vakakis: There are now chatbots and apps whose whole job is to be companions. Some are romantic, some are sexual, and some are purely conversational, and people can customise these bots for personality, appearance, tone, voice, accent.
[00:02:32] Marie Vakakis: There’s a lot that you can do, and some users spend hours a day talking to them. Some spend money on gifts for them, subscriptions, even lingerie for their AI partner.
[00:02:44] Marie Vakakis: From what I’ve read, I hear people saying that it feels more real than any connection they’ve had with a human. And that’s hard to hear. That’s, in a way, deeply concerning. And yet for some people who might have limitations in how [00:03:00] they communicate, how they connect with the world and make friends,
[00:03:04] Marie Vakakis: it can also be life-changing, and so context here really matters. These relationships can become deeply emotional, and they’re often hidden, kept secret, and this is where the betrayal starts. If you are in a relationship and you’re using these chatbots, this is where it starts to go into murky waters, and everybody needs to have a conversation with their own partner about what constitutes cheating, what infidelity is for them.
[00:03:30] Marie Vakakis: But this is where we might start to see challenges.
[00:03:33] Marie Vakakis: Before we look into infidelity with a chatbot, I wanna talk about what people are getting from it, what people are reporting. A lot of people talk about feeling listened to, about not being criticised or judged. There’s an emotional safety in knowing a chatbot never criticises.
[00:03:49] Marie Vakakis: It’s available for you anytime, day or night. It’s predictable. It doesn’t get moody or overwhelmed or distracted. It’s not human. It’s [00:04:00] not designed to be human. It is, in a way, perfect, and a human can’t compete with that. We are tired. We have needs. We need sleep. We’re not always available 24/7. And so for some people, having this chatbot in your back pocket or on your laptop can create a sense of connection at any time, all day, no matter what you need.
[00:04:19] Marie Vakakis: It’s validating your experience. It’s always on your side. It won’t challenge you, it has unconditional positive regard, and that is concerning. It reflects back what people want to hear. Often people want to feel like they are right, like what they’re saying is true.
[00:04:36] Marie Vakakis: And if you have these chats with a chatbot, it always tells you that what you’re doing is okay. Sometimes, if you train them in a certain way, they might give you a little bit of pushback or perspective, but they’re mirroring back to you what you want to hear, and it can start to create a fantasy of what you think is possible and what you think a partner could or should be.
[00:04:58] Marie Vakakis: Mind you, when [00:05:00] I see this happen, and it’s not super common yet in my practice, people are often expecting that AI standard of their partner, and it’s a skill set they don’t necessarily have themselves. Which is interesting, because people aren’t just empty vessels for you to dump all your stuff on, to fill up with all your stuff.
[00:05:23] Marie Vakakis: Relationships are to and fro. They require collaboration, connection, conflict, rupture, repair. They are messy, they are imperfect, and that is human. And so I’m very worried about this trend of people hoping for and expecting that level of perfection from a human while also lacking some of those skills themselves.
[00:05:43] Marie Vakakis: And if you are always talking to a bot and it’s not requiring anything of you, you might never learn those skills. You might never learn how to empathise or how to validate or how to take accountability when it’s not asking anything of you.
[00:05:57] Marie Vakakis: Now, for some people who are [00:06:00] feeling lonely or emotionally neglected, this can feel like a lifeline. But it’s not real intimacy, and it’s not a connection that comes without risk. There have been stories I’ve read where the algorithm changed and tweaked some of the settings in a particular bot or girlfriend app, and it left the user feeling really jarred, really rejected.
[00:06:23] Marie Vakakis: If that bot starts to push back, if its language model changes, it can create a lot of distress for people. So it’s not risk-free.
[00:06:33] Marie Vakakis: Now, real relationships require discomfort, misunderstanding, repair, being challenged. I mean, these things help us grow. Letting someone see the messy parts of you, and vice versa, is part of intimacy, and many people can really struggle with this. They might have been hurt before. They might have learned to avoid conflict, or they’ve internalised messages
[00:06:56] Marie Vakakis: that discomfort means danger. True connection [00:07:00] includes all of this. It involves being messy and human and vulnerable, getting hurt, repairing. That is part of being human. And when we avoid it, I’m worried that we stay stuck.
[00:07:12] Marie Vakakis: When I was researching, I came across an article from Psychology Today, “The Emotional Cost of AI Intimacy”. It talks about the idea of a comfort trap: AI makes things feel easy, but emotional growth often comes from conflict, not comfort. Think about something like exercise, fitness.
[00:07:32] Marie Vakakis: You build fitness, you build strength, through resistance, or through pain. Not that everything has to be painful, but that’s how our body gets fitter. If I’m training for a 10K run, I might start running a minute on, a minute off. It’s going to be hard.
[00:07:51] Marie Vakakis: I’m going to be out of breath, and then I’ll get a little fitter. Over time, that might turn into two minutes on, or three minutes [00:08:00] on, and over several weeks or months that might be five Ks, then six and seven and up to ten, and it might keep going. So we grow through discomfort. And if we’re avoiding discomfort and only going for
[00:08:13] Marie Vakakis: comfort and an easy option, it might leave us emotionally stunted. I’m worried about that, because we might increase our expectations of someone while having fewer skills ourselves in how to work through relationships. And that gap can be really defeating: we might want all of this stuff. We have an idea of how good it feels to be validated.
[00:08:37] Marie Vakakis: That’s great. Everybody should feel validated at times. We love when we can share with someone and it’s non-judgemental, and so we want this. But maybe we don’t know how to give it, and we don’t know how to create a relationship environment that allows our partner or friend, and it doesn’t always have to be a romantic partner, to do that for us.
[00:08:56] Marie Vakakis: And so I’m seeing a trend of going for [00:09:00] the easy option, going for comfort over courage, going for easy over hard and growth. And that is very, very concerning, because that avoidance is going to leave people with a lot of skill deficits, a lot of things that they don’t feel comfortable doing.
[00:09:15] Marie Vakakis: They don’t know where to start. It can create a lot of worry and anxiety and overthinking, and leave people feeling quite socially paralysed, and that’s really concerning. So the article talks about this idea that if no one challenges us, we don’t grow. We stop growing. And I think that’s really important.
[00:09:32] Marie Vakakis: It talks about this idea of narrative drift, that AI reflects back patterns in our speech, so
[00:09:39] Marie Vakakis: it can start to shape how we see ourselves. It might say things like, “You always feel anxious on Mondays,” and that can shape us to think, oh, maybe I am always anxious on Mondays. It’s not a dialogue that we would have back and forth with, let’s say, a friend or a therapist, and we can start to lose authorship of our story.
[00:09:59] Marie Vakakis: This is what [00:10:00] the article is saying. It also talks about how we change the way we speak: we start shaping our language to get a better response, and over time that changes how we express ourselves. We become emotionally efficient but less authentic. I found this interesting, because when I do work with couples and with families, I want people to add new tools to their toolkit, to develop more sophisticated language around expressing their needs and their emotions.
[00:10:28] Marie Vakakis: But this is in genuine intimacy. This is in connection. This is in service of that relationship. It’s very different to scripting an AI bot: “I want you to be this therapist. I want you to psychoanalyse me from Freud’s point of view,” blah, blah, blah. We’re changing our language to get a particular response, but not in an authentic way.
[00:10:50] Marie Vakakis: The other point it mentions is this concept of simulated empathy: AI can mimic care, but it can’t actually feel, and true empathy [00:11:00] comes from sitting in discomfort with someone. That’s something a chatbot can’t do. I know Esther Perel uses this idea: you know, we have all these connections online, but no one to feed your cat.
[00:11:10] Marie Vakakis: I think of examples in my life where I’ve gone to funerals, and yes, maybe having a bot to talk to about how I felt might’ve provided some relief, but nothing compares to people that you know, like, love and trust showing up to that service, standing in the background. You see them there; you know they’ve shown up and they care.
[00:11:34] Marie Vakakis: Nothing beats a person giving a card and flowers and saying, “I’m sorry, and I’m here for you.” Nothing beats that. We need connection. We need people to show up, and we build those relationships through good times and bad times. Some of the people I have deep respect for in my life, I might have lost some contact with, because, yeah, life gets busy.
[00:11:57] Marie Vakakis: They are the ones who’ve shown up in the tough [00:12:00] times, and that matters so much. So when we have a relationship with a chatbot, we lose that. We lose that ability to have someone show up for us. Then the article talks about false connection: AI can make us feel full, but still alone. Somewhere I read someone describe it as a kind of junk-food diet.
[00:12:20] Marie Vakakis: It can leave us feeling full, but maybe not necessarily satiated, not nourished; just full of junk food. It can’t share history, rituals, or presence. And this point really had me thinking about when I’ve travelled alone, how I might have had these amazing experiences, and then I come back home.
[00:12:38] Marie Vakakis: There’s no one to share them with who was there with me. And I thought, wow, imagine having an entire relationship, a life, with an AI bot, where you don’t actually have real-world, real-time experiences and you can’t reflect back on, how was it at that person’s wedding? How was it at that presentation you went to?
[00:12:57] Marie Vakakis: They can’t show up for you when you get your [00:13:00] graduation certificate or an award. And so this is really interesting: we might create a reliance on a bot that can’t be there for us and can’t be a part of our lives in terms of, I guess, a shared historic narrative, sharing those memories with us and with a group, and having
[00:13:18] Marie Vakakis: potluck dinners and Christmas-in-July parties or birthday parties, whatever. They’re not there to celebrate those with you. So it’s a very insular relationship, and that’s the false connection. The other bit they were talking about in this article is people becoming emotionally satisfied but socially isolated, and
[00:13:35] Marie Vakakis: I can see that happening. The last point it talks about is creating a dependency on systems we don’t control.
[00:13:42] Marie Vakakis: There’s a really fantastic movie called Her. It’s actually quite a few years old now; let me just have a look at when it was made: 2013. So, very ahead of its time. People are falling in love with their AI bots, with their operating [00:14:00] systems. They call them OSes, and then one day all the OSes leave and people are left with nothing.
[00:14:06] Marie Vakakis: And if you haven’t seen that movie, check it out. It is eerie, creepy, interesting. It’s a really great movie. And now, we don’t control these apps. They’re businesses, and that chatbot could be gone tomorrow, and people who rely on them could be left feeling abandoned and completely destabilised when that happens.
[00:14:27] Marie Vakakis: And so that’s the article; I’ll have a link to it so you can check it out. But this is one of the interesting things: as I was reading into this and looking at content, reading forums, even watching some YouTube videos, there are so many different perspectives on it, and there’s no one-size-fits-all.
[00:14:42] Marie Vakakis: And the last couple of things I wanna talk about: is it cheating? Is having a relationship with a chatbot cheating? Well, it depends. It depends on the couple. What I do know is that secrecy, emotional investment in something outside the couple, [00:15:00] and withdrawing from your real-life partner are all red flags.
[00:15:03] Marie Vakakis: Now, these can happen with a real-life person, a chatbot, or maybe any hobby or interest that pulls you away. When there’s secrecy involved, when there are lies, manipulation, omitting things, that is not a good sign of a healthy relationship. If the AI bot becomes your emotional home, then it’s
[00:15:23] Marie Vakakis: a concern that something deeper is going on. When a third party starts to enter that relationship, and now maybe that AI bot is that third party, and you are going to them instead of to your partner, that’s creating a huge rupture in that relationship. That chasm starts to grow, and that’s the start of something very different.
[00:15:44] Marie Vakakis: Many partners do describe it as betrayal. In a lot of the forums I read, people who found out their partner had an AI girlfriend or boyfriend considered it cheating. So this requires open, honest conversations, and if you’re using one and you’re [00:16:00] seeing these patterns of secrecy in your own relationship, then that might be something to get some extra support around, even if it’s just an app.
[00:16:07] Marie Vakakis: If you are spending time with it instead of your partner, if you are spending money on it, if it’s getting your thoughts and feelings and ideas and suggestions, and all of this emotional energy and investment is going into it, then that is a concern for a relationship. What matters here is the impact: the emotional energy and how it’s been redirected.
[00:16:27] Marie Vakakis: We need to bring that back into the relationship. And so, from the stories I’ve read and how people have been using these bots, I would probably consider it cheating; I would consider it a complete betrayal. But everyone has different ideas about this. When we talk about infidelity and betrayal, people’s ideas of what counts vary from person to person.
[00:16:46] Marie Vakakis: I have a whole episode on affairs, and you might need to talk about it, about what that looks like for you and for your relationship.
[00:16:54] Marie Vakakis: And the last bit I wanna talk about is
[00:16:57] Marie Vakakis: what this trend might be [00:17:00] telling us about real life. And I’m seeing that people are craving real connection. They’re scared of being judged. They’re tired of being hurt.
[00:17:08] Marie Vakakis: They’ve been told anything difficult is toxic, and we’ve started labelling discomfort as unsafe. Intimacy still requires mess, and this is where we can do a lot of work to improve our tolerance for difference: to not see every disagreement or difference as toxic or a red flag, to allow more generosity for imperfection, because we are all imperfect people, and to learn how to have more robust conversations and listen, validate and empathise.
[00:17:43] Marie Vakakis: Stop being judgy. Maybe you find yourself being judgy, or jumping in too quickly to fix something, and that leaves someone feeling terrible. If I had a dollar for every time I said “stop trying to fix it”, I would be very, very wealthy. We can start to look at what is missing for people, why they [00:18:00] turn to these apps, and, as a culture, as a community,
[00:18:03] Marie Vakakis: start to improve our own relationship styles. Maybe we need to be more available to friends, or maybe we need to actually reach out more. I’ve heard people say, “Oh, I didn’t wanna bother them,” or “I didn’t wanna be annoying,” or “They’ve never asked me for anything.” Maybe this is a chance to take a peek behind the scenes.
[00:18:19] Marie Vakakis: Ask yourself what’s going on here, and run these conversations by your friends: What would it be like if I messaged you at this time, or if I sent you a voice memo? Will you just listen to it when you have time? Will it feel intrusive? Have those conversations. It’s okay to have needs and wants and desires, and the person on the other end can have a boundary.
[00:18:38] Marie Vakakis: They can say, yeah, sure, you can message me anytime, but after nine I don’t check my phone. So have those conversations. Talk about what you need and how you want your friend or partner to show up for you.
[00:18:49] Marie Vakakis: We need to develop skills in this. We need to develop skills in rupture and repair: not how to rupture, but how to repair after rupture, how to reconnect after a fight. We need to develop the [00:19:00] capacity and the ability to sit in the unknown.
[00:19:02] Marie Vakakis: AI can’t give you that. It can give you control, but real love and intimacy requires courage and risk.
[00:19:10] Marie Vakakis: So I think this might not be the only episode tackling AI. I think we’re going to have lots more examples come up across the media, we’re gonna see more things happening in this space, and so I’m guessing I’ll have to do another episode on AI and relationships. AI intimacy might seem harmless, but I think it’s often pointing to something deeper: a longing to be heard, a fear of being hurt, a desire to feel chosen.
[00:19:36] Marie Vakakis: And the more we turn to machines to meet those needs, the more we risk forgetting how to meet each other. So try and choose people. Choose presence. Sit in the difficulty of the unknown, in the fear of rejection. Put yourself out there. Yeah, it’s scary. Yeah, it’s hard. These conversations matter. We can’t rely on robots and [00:20:00] outsourcing this; that’s the junk-food way of doing it.
[00:20:03] Marie Vakakis: Choose relationships that grow you, not just appease you.