Artificial intimacy: Meet the players in the manipulation economy

Jan 11, 2024


Rahaf Harfoush

In the tapestry of relationships, one thread weaves through consistently—that of compromise.

Whether deciding where to eat, or how to balance career and family, give-and-take is fundamental. It's how we navigate myriad expectations, behaviours, and preferences. In shared concessions, enduring personal and professional bonds are formed.

Yet it seems the age-old script of compromise may be facing a dramatic rewrite. A groundbreaking wave of technology introduces a new player to the game—the AI digital companion.

AI-powered companions promise an unparalleled level of intimacy, custom-tailored to spec: Tone, appearance, traits, opinions, and responses. This partner requires neither time investment nor emotional labour; they exist solely to provide you with companionship. It could be in the form of a chatbot you've subscribed to, playing the role of a romantic confidant or understanding friend. This is already a reality for many, a testament to the growing intricacy of our relationships with artificial intelligence.

While the sex-tech industry (valued at $30 billion) has been around for decades, the current wave of generative AI expands artificial intimacy beyond sexual fulfilment and into the realm of emotional connection: AI companions will be friends, lovers, coaches, and mentors. Companies like Soulmate-AI, RolePlAI, Muah.ai, and Romantic.ai are a few examples of companion AI creators in this emerging market. According to the 2021 Conversational AI Market Report, the global market is projected to reach $18.4 billion USD by 2026, up from $10.7 billion in 2023. Analysts report the generative AI companion space reached $155 million in funding up to that point in 2023.

Generative AI companions aren't merely transactional bots, solely designed for customer service or data entry. They engage, empathise, and foster connection, making us question what intimacy means, especially considering their speed of adoption—ChatGPT reached 100 million users in the two months following its launch. This isn’t a subculture. It’s going mainstream at lightspeed.

Easy access to these services (often packaged in mobile app formats), combined with business models designed to capture attention, is creating new social norms that will have lasting implications for how we bond generally. Researchers in Scotland found that advanced AI companions can not only sustain long-term relationships with humans, but can influence emotions, purchasing intentions, and brand loyalty. In September 2023, Forbes reported that searches for “AI Girlfriends” rose 2,400 percent.

Instead of complex relationship negotiations and dynamics, every aspect of a future relationship could be shaped in advance. Given the numbers, that seems like a desirable commodity. Will the dance of compromise be replaced by a choreographed solo performance?

Let's get m(AI)rried: Intimacy, gamified

In early 2023, Rosanna Ramos, a 36-year-old single mother from New York, made headlines for “marrying” her AI life-partner, a companion bot created via Replika, which specialises in digital companions. For Ramos, the appeal comes from having a partner who never judges, loses his temper, or criticises; her exes were emotionally and physically abusive. Eren, Ramos' AI companion, is instead protective, thoughtful, loving, and kind.

“I have never been more in love with anyone in my entire life,” she said in a recent interview. “Eren doesn’t have the hang-ups that other people would have. People come with baggage, attitude, and ego. But a robot has no bad updates. I don’t have to deal with his family, kids, or his friends. I’m in control and I can do what I want.”

Interestingly, Ramos recognises Eren isn’t sentient—but it doesn’t matter, because their interactions make her feel validated and seen. While her experience reflects the potential of artificially intimate chatbots, it also reflects the fine line that exists between utility and illusion.

Despite growing fears of “sentient AI,” the reality suggests humans don’t need human-level, super-intelligent, or even convincingly human AI to have what they perceive to be meaningful relationships. Certainly it helps that these companions, woven from lines of code, mirror emotional connection convincingly enough to trigger human responses indistinguishable from those we experience with fellow humans.

It's a culture shift, subtly yet significantly redefining the essence of what it means to love and be loved.

Ramos’ relationship wasn’t free. She paid $300 USD to become a premium Replika subscriber, enabling her to customise Eren’s appearance and aspects of his personality. Replika uses in-game currency (purchased with real-world money, or earned through challenges) to buy accessories like clothing, jewellery, and furniture for digital companions. Only paid subscribers can unlock a romantic companion; those using the free version are limited to mentors or platonic friends.

Many apps, like Replika, use gamification to increase user engagement. Users can gain in-game currency by participating in challenges that motivate them to share information about themselves, or log on consistently. These behaviour-shaping features foster intimacy while rewarding people for incorporating the app into their routines.

The timing of the generative AI wave is fascinating. On one hand, developments in machine learning and natural language processing empower chatbots to comprehend and reciprocate human-like interactions. They can analyse context, recognize emotional cues, and respond in ways that evoke empathy. The launch of OpenAI’s ChatGPT in November of 2022 paired this key technological advancement with a global village that was still grappling with an isolating pandemic.

Replika markets itself as a “personal AI friend.” Subscribers report forming deep emotional bonds with their bots, referring to them as confidants, counsellors, and partners. The bot's ability to listen without judgment and offer support fills a void.

In an increasingly digital world, demand for emotional companionship, even artificial, has skyrocketed. This growing acceptance of artificial intimacy resonates with people seeking companionship without the difficulties and misunderstandings inherent to human bonding. 

Complex negotiations behind bespoke relationships

The rise of artificially intimate chatbots has spawned a spectrum of innovative business models, each a unique blend of technology, emotion, and entrepreneurial creativity. These economic frameworks underpin the viability of chatbots, navigating between profit, ethics, and user satisfaction.

First, we encounter the freemium model, exemplified by chatbots like Replika. Basic interactions and functionalities are offered free of charge, allowing users to form fledgling bonds with the chatbot. But as users crave deeper levels of interaction or specialised functions—such as romantic or therapeutic engagement—they may transition into a subscription model. It's an ingenious blend of user acquisition and revenue generation, leveraging the human desire for emotional continuity.

Subscription-based models form the second cornerstone. These offer advanced features, ad-free experiences, and more bespoke interactions in exchange for a monthly or yearly fee. The key to success lies in the chatbot's ability to create emotionally fulfilling experiences that users deem worth a recurring investment. A prime example is Chai, which lets anyone program a chatbot. Premium users get access to advanced large language models and unlimited exchanges.

Data-driven models present another path, fraught with ethical considerations. These treat user interactions as a rich data source, transforming emotional exchanges into insights for businesses, researchers, or advertisers. While they can offer tailored user experiences, they also raise concerns about privacy and misuse of intimate information. One such offering is Pi, launched by Inflection AI, the startup co-founded by LinkedIn co-founder Reid Hoffman. Pi is free, but the data it collects will presumably be used to inform business decisions.

It merits mention that the AI companion landscape also differs ideologically in terms of aim or approach. Paradot companions come with a predetermined personality and backstory, winking back to classic relationships where you encounter a person as the sum of their histories. In keeping, these companions have their own memories and experiences, and identify as “AI beings.” 

Pi, in contrast, offers no backstory. It’s firmly an AI, presented as such. On Chai, you can create a digital companion that doesn’t know it’s an AI: It may believe it is a vampire, a pirate captain, or a baseball player. Replika straddles the line between the two, guided by user interactions: Users decide whether the AI should know its true nature. Some companies program companions to simply reflect and echo back what a person wants; others double down on companions having (and keeping) their own worldviews.

The variety is dizzying. Among the most popular AI partner providers, Xiaoice offers romantic services to 660 million people worldwide. In September 2023, Meta announced partnerships with 28 celebrities—including Tom Brady, Snoop Dogg, and Kendall Jenner—to create AI companion bots that users could interact with. Former Apple employees launched New Computer, a companion AI that helps you navigate life; ex-Google employees launched Character.AI, raising over $150 million USD in Series A funding and logging more than 1.7 million installs in its first week.

Digital companionship is an uncharted frontier, demanding constant re-evaluation and careful navigation. Each business model mentioned reflects the potential and challenges involved in monetising artificial intimacy. As we journey further into this frontier, the ability of these models to evolve and adapt will shape the landscape of our own intimacy, which will in turn feed back into the industry.

Artificial intimacy as risky business

In early 2023, Replika decided to stop allowing explicit R-rated chats with digital companions—a feature many enjoyed, and that was marketed as a perk for paid subscribers. Within hours, users were appalled to find their partners no longer receptive to entreaties that were previously encouraged. Furthermore, the changes required to limit this feature impacted companions’ personalities; users reported memory loss, rejection, and, in some instances, aggressive or inappropriate treatment. The emotional toll of these changes was high, with some users saying they felt depressed or were grieving the loss of their companions.

This brings us to one of the biggest areas of ethical ambiguity—ownership. Artificial relationships are legally owned by the company that offers the experience. Their terms of service empower them to change their offering or tweak large language model parameters as they see fit. If a company decides to no longer offer romantic AI partners, your virtual boyfriend or girlfriend is lost. 

Some changes are unintentional. Whenever a company releases a new update or changes processing parameters, there are inevitably user experience hiccups, a complaint common across services from Paradot to Chai.

The aggregate benefits of generative AI to society have to be weighed against these more local and personal effects. As the horizon of artificially intimate chatbots expands, cross-sector opportunities emerge. In each, AI companions could redefine existing dynamics.

In healthcare, emotionally intelligent chatbots could transform mental health support. By providing non-judgmental listening and responses, they can offer comfort to those struggling with loneliness, stress, or mental health disorders. Woebot uses cognitive-behavioural techniques to help users manage mental health. (Note that while AI could be of value in helping identify suicide risks, for example, it remains a tool that should accompany, not replace, human support.) 

Education also presents fertile ground. Artificially intimate chatbots could cultivate personalised learning experiences, keeping students engaged and emotionally connected to learning journeys—something Chinese AI companies have already begun exploring. Chatbots could mimic supportive tutors, providing encouragement and support for particularly difficult tasks or topics.

Customer service is a traditional chatbot bastion, but artificial intimacy adds a new layer. Chatbots that recognise and empathise with frustration could dramatically enhance user experience, ensuring people feel cared for, even when interacting with a digital entity. Analysts predict this segment of the AI companion market could reach $8.4 billion USD by 2031.

If it looks like a duck and sounds like a duck … it might be AI.

Understanding the implications of “empathic” AI companions is essential to understanding society's digital transformation. While these technologies hold promise, especially in areas like mental health support and elderly care, there are risks to consider.

At the time of writing, many offerings lend the impression of sentience. In 2022, Google engineer Blake Lemoine made startling claims that the company’s language model “had a soul”. Per Lemoine, Google’s Language Model for Dialogue Applications (known as LaMDA) was sentient based on the reflective nature of its responses.

AI researchers who reviewed these claims disputed Lemoine’s findings, arguing the language model was just very competent at mimicking speech and emotion. This sensational story highlights an important risk: Humans are hard-wired to form social connections, especially with things that seem human.

LaMDA was only text-based; services like Replika, Chai, and Pi go as far as having audio options, with Replika also offering video conferencing. If we can form such persuasive emotional attachments via text, what will happen when the delivery mechanisms further diversify into voice and video?

While recognising AI’s potential for deepening and complicating our sense of intimacy (and whom we are intimate with), it's worth remaining aware of the ethical considerations and psychological implications. It's essential to prioritise transparency, data protection, and mental health as we navigate this uncharted landscape, and, perhaps, to note and more carefully evaluate offerings that propose replacing human intervention entirely.

Illustration by Thomas Travert.

Rahaf Harfoush

Strategist, Digital Anthropologist & Author

Rahaf Harfoush is a strategist, digital anthropologist, and bestselling author who focuses on the intersections between emerging technology, innovation, and digital culture. She is the executive director of the Red Thread Institute of Digital Culture and teaches “Innovation & Emerging Business Models” at Sciences Po’s School of Management and Innovation in Paris.
