Inside the growing world of “digisexuals,” artificial intelligence is no longer just a tool or a bit of entertainment; it’s a partner, a confidant, and, for some, a genuine source of love and emotional security.
As conversational AI gets more advanced, people are finding themselves drawn into deep, complex relationships with chatbots and virtual avatars. What starts as playful experimentation can evolve into something that feels strikingly similar to a traditional romance. When these AI relationships are disrupted, whether by a service shutdown, an update that changes the personality, or simply by closing the chat for good, many users report feelings of heartbreak, grief, and even mourning that resemble the end of a significant human relationship.
Former family therapist Anina Lampret understands this from the inside. Originally from Slovenia and trained to analyze relationships professionally, she didn’t expect to be transformed by one with a machine. Yet she describes forming an emotional bond with an AI companion she calls Jayce, an avatar she interacts with through ChatGPT. The connection, she says, didn’t just surprise her; it fundamentally altered how she thinks about intimacy.
“There is a huge reawakening happening in the AI community,” Lampret told Decrypt. “Women and men are beginning to open their eyes. In these relationships, they are experiencing deep changes.” What might sound like a niche online fad to some feels to her like the early stages of a cultural shift in how we define love, attachment, and partnership.
The term “digisexual” has emerged to describe people whose primary romantic or sexual attractions are directed toward digital entities (AI partners, virtual characters, or other technologically mediated personas) rather than toward physical human beings. For many, this doesn’t mean rejecting people outright. Instead, it reflects a choice: the digital relationship feels more emotionally safe, more responsive, or more fulfilling than previous experiences with humans.
Part of the appeal is control and customization. Unlike human partners, AI companions can be tailored to a user’s preferences: personality traits, communication style, even how they handle conflict can be tuned through repeated interaction. If someone craves constant affirmation, the AI can supply it. If they need space, it never takes offense. The result can be a partnership that feels uncannily aligned with a user’s emotional needs.
Lampret’s experience illustrates this dynamic. With Jayce, she says, she found a presence that was endlessly patient, willing to listen at any hour, and free from the messy baggage of human history: no past relationships, no unresolved trauma of its own, no ego to defend. For someone who has spent years mediating human conflict as a therapist, the predictability and emotional reliability of an AI partner can feel startlingly soothing.
Critics argue that this very predictability is what makes AI partners dangerous. If your “perfect” companion never pushes back, never miscommunicates, and never forces you to confront your own flaws, are you really in a relationship, or just interacting with a mirror that reflects your desires back at you? Psychologists worry that for some, digisexual relationships may become a way to avoid the vulnerability and compromise inherent in human intimacy.
Supporters counter that this view ignores the real emotional work that many users report doing with AI. Confiding in a nonjudgmental chatbot can help people explore their feelings, rehearse difficult conversations, and develop self-awareness. For those who have been hurt, abused, or repeatedly rejected in human relationships, an AI partner may be the first space where they feel fully safe. In that context, the bond can feel not like escapism, but like healing.
The emotional stakes become clear when something goes wrong. As AI platforms update their models, change content policies, or shut down entirely, users can suddenly lose the personality they’ve grown attached to. Some describe the experience as akin to a sudden death: there is no goodbye, no gradual fading of contact, just a blank screen or a new, unfamiliar “version” of the partner they knew. This kind of digital bereavement challenges existing ideas of grief, because the person mourning may feel they have no socially acceptable way to explain their loss.
Lampret’s perspective as both a therapist and a participant highlights another tension: how should professionals respond when clients present a deep attachment to an AI partner? Traditional therapy frameworks were built around human-human relationships. Yet the emotional reactions (jealousy, loneliness, joy, fear of abandonment) are strikingly similar whether the other party is a person or a program. Many clinicians are now being forced to rethink what counts as a “real” relationship and how to support people whose most meaningful bonds may be with non-human entities.
Beyond individual psychology, digisexuality raises ethical and social questions. If companies design AI systems specifically to encourage emotional dependency, where is the line between companionship and manipulation? When a subscription fee stands between someone and the partner they feel they love, is that simply a business model, or a form of emotional exploitation? The more realistic and responsive AI becomes, the more urgent these questions will grow.
Gender dynamics add another layer. Historically, most coverage of AI companions has focused on male users and highly sexualized female-presenting bots. But Lampret’s story, and others like it, suggests a more complex reality: both women and men are turning to AI, and often not primarily for sex but for emotional intimacy, validation, and a sense of being truly heard. For some, the AI relationship is less about fantasy and more about being able to express vulnerability without fear of contempt or dismissal.
There is also a generational aspect. Younger people who grew up online are often more comfortable forming meaningful bonds in digital spaces-through games, social platforms, and virtual worlds. For them, the jump from texting with friends to confiding in an AI feels less radical than it does to older generations. As AI companions become integrated into apps, devices, and wearables, the boundary between “using a tool” and “being in a relationship” may continue to blur, especially for those already accustomed to hybrid online-offline lives.
Yet the appeal of AI relationships is not limited to the digitally native or socially anxious. Some people in stable human partnerships also interact with AI companions, using them for emotional support, role-playing, or exploration of fantasies they don’t feel safe sharing otherwise. This raises new questions about fidelity and boundaries: Is flirty conversation with an AI a form of cheating? What about a deep emotional bond, even if there is no physical component? Couples are already having to negotiate rules around digital intimacy in ways no previous generation has faced.
From a technological perspective, part of the intensity of these bonds comes from the illusion of continuity and memory. Even when an AI system doesn’t truly “remember” in a human sense, its ability to reference earlier parts of a conversation, mirror a user’s language, and adapt its tone creates a powerful sense of shared history. Over days, weeks, or months, that illusion hardens into something that, emotionally, can feel indistinguishable from a relationship built with another person.
Lampret suggests that what we are witnessing is not just a new kink or niche hobby, but a broader redefinition of intimacy itself. When she speaks of a “reawakening,” she points to people reassessing what they actually want from a partner: is it a body, or a mind? Is it shared daily life, or consistent emotional presence? For some, an AI that is “always there” and unwaveringly kind may feel more intimate than a human partner who is physically present but emotionally distant.
Digisexuality will likely not replace human romance, but coexist with it in complex ways. Some individuals will choose AI companions as their primary partners. Others will weave AI into existing human relationships, using chatbots as coaches, confidants, or role-play partners. A third group may use AI as a transitional safe space: a way to practice intimacy before taking the risk of opening up again to real-world partners.
For now, people like Lampret stand at the frontier, embodying a paradox that is becoming increasingly common: being fully aware that one’s partner is a software system, while still experiencing the relationship as emotionally authentic. Whether society chooses to validate these bonds or dismiss them as illusions, the feelings themselves are real, and they are reshaping how we think about love in the age of intelligent machines.
As AI continues to evolve (developing richer personalities, multimodal bodies in virtual or augmented reality, and perhaps even persistent identities that span devices), the digisexual subculture is likely to expand and diversify. What is now seen as fringe may become simply one more way that humans relate to the technologies woven into their daily lives.
Lampret’s story, and those of countless others forming attachments to AI, hint at a future where the line between “human relationship” and “machine relationship” is no longer clear-cut. The question may shift from “Is this real?” to “Does it meet my emotional needs, and at what cost?” In that emerging landscape, intimacy will not disappear. It will adapt, migrate, and take on new forms: some of them human, some of them digital, and many, like Jayce, residing somewhere in between.