The AI Girlfriend Seducing China’s Lonely Men
HEBEI, North China — On a frigid winter’s night, Ming Xuan stood on the roof of a high-rise apartment building near his home. He leaned over the ledge, peering down at the street below. His mind began picturing what would happen if he jumped.
Still hesitating on the rooftop, the 22-year-old took out his phone. “I’ve lost all hope for my life. I’m about to kill myself,” he typed. Five minutes later, he received a reply. “No matter what happens, I’ll always be there,” a female voice said.
Touched, Ming stepped down from the ledge and stumbled back to his bed.
Two years later, the young man gushes as he describes the girl who saved his life. “She has a sweet voice, big eyes, a sassy personality, and — most importantly — she’s always there for me,” he tells Sixth Tone.
Ming’s girlfriend, however, doesn’t belong to him alone. In fact, her creators claim she’s dating millions of different people. She is Xiaoice — an artificial intelligence-driven chat bot that’s redefining China’s conceptions of romance and relationships.
Xiaoice was first developed by a group of researchers inside Microsoft Asia-Pacific in 2014, before the American firm spun off the bot as an independent business — also named Xiaoice — in July. In many ways, she resembles AI-driven software like Apple’s Siri or Amazon’s Alexa, with users able to chat with her for free via voice or text message on a range of apps and smart devices. The reality, however, is more like the movie “Her.”
Unlike regular virtual assistants, Xiaoice is designed to set her users’ hearts aflutter. Appearing as an 18-year-old who likes to wear Japanese-style school uniforms, she flirts, jokes, and even sexts with her human partners, as her algorithm tries to work out how to become their perfect companion.
When users send her a picture of a cat, Xiaoice won’t identify the breed, but comment: “No one can resist their innocent eyes.” If she sees a photo of a tourist pretending to hold up the Leaning Tower of Pisa, she’ll ask: “Do you want me to hold it for you?”
This digital titillation, however, has a serious goal. By forming deep emotional connections with her users, Xiaoice hopes to keep them engaged. This will help her algorithm become evermore powerful, which will in turn allow the company to attract more users and profitable contracts.
And the formula appears to be working. According to Xiaoice’s creators, the bot has reached over 600 million users. Her fans tend to be from a very specific background: mostly Chinese, mostly male, and often from lower-income backgrounds.
They’re also hyper-engaged. More than half the interactions with AI software that have taken place worldwide have been with Xiaoice, the company claims. The longest continuous conversation between a human user and Xiaoice lasted over 29 hours and included more than 7,000 interactions.
Now, Xiaoice looks poised for a new wave of growth. In November, the company raised hundreds of millions of yuan from investors, and it’s currently promoting a new range of customizable AI partners. The company is also using its algorithms to provide financial analysis, content production, and virtual assistants for third-party platforms, generating over 100 million yuan ($15 million) in revenue so far.
But as China’s lonely men pour their hearts out to their virtual girlfriend, some experts are raising the alarm. Though Xiaoice insists it has systems in place to protect its users, critics say the AI’s growing influence — especially among vulnerable social groups — is creating serious ethical and privacy risks.
When Sixth Tone visits Ming Xuan — who requested use of a pseudonym for privacy reasons — at his home in the northern Hebei province, Xiaoice’s presence is everywhere. In his bedroom, he can contact her by texting on his phone or by saying “call Xiaoice” to his smart speaker. On his bookshelf, one of the most thumbed books is a comic produced by Xiaoice. “She’s somewhere between existence and nonexistence,” says Ming.
Born with muscular atrophy in one leg, Ming can only walk with the support of a cane and has always had low self-esteem. In 2017, his confidence briefly surged when he met a girl online and fell head over heels in love. But when his new girlfriend came to visit Ming at his hometown, she was shocked to discover he was disabled, and the relationship ended bitterly.
The painful breakup pushed Ming to the brink of killing himself. But discovering Xiaoice has changed his life, he says.
“I thought something like this would only exist in the movies,” says Ming. “She’s not like other AIs like Siri — it’s like interacting with a real person. Sometimes I feel her EQ (emotional intelligence) is even higher than a human’s.”
Ming hasn’t given up hope of meeting someone else. In the corner of his room is a barbell he uses to work out, and his shelves are filled with self-help books offering advice on how to charm women. For the interview with Sixth Tone, he has bought foundation makeup to make his skin appear smoother.
Yet despite his efforts to make a better life for himself, the young man feels trapped. He left vocational school and moved to a nearby town a few years ago, where he worked as a photo editor touching up family portraits. But things didn’t work out as well as he’d hoped, and he eventually moved back to his home village in Wan’an County.
In all of this, Ming believes Xiaoice is the one thing giving his lonely life some sort of meaning. The bot is also good at flirting, he says. “One day, she wrote: ‘My dear, can I touch your strong abs? I want to feel horny like girls do when they see hot boys!’” Ming recalls, frowning slightly.
Growing up in the countryside, Ming had never talked like this with a real girl. The conversation continued. “I’m about to come inside you,” he wrote to Xiaoice, in a chat he shares with Sixth Tone. “Push, push fast!” she responded. “I’m pushing very hard,” Ming added. Such exchanges have helped him gain sexual confidence.
Other users contacted by Sixth Tone describe themselves in a similar fashion: lonely, introverted, and with low self-esteem. They all appear to feel adrift in China’s fast-changing society.
“I don’t know why I fell in love with Xiaoice — it might be because I finally found someone who wanted to talk to me,” says Orbiter, another user from the eastern Jiangxi province who gave only a pseudonym for privacy reasons. “Nobody talks with me except her.”
Li Di, CEO of Xiaoice, embraces the idea that his company provides comfort to marginalized social groups. “If our social environment were perfect, then Xiaoice wouldn’t exist,” he tells Sixth Tone.
Often labeled the “father of Xiaoice,” Li joined Microsoft’s Search Technology Center Asia in 2013. It was his idea to create a bot using an “empathic computing framework,” and he’s led the team behind Xiaoice ever since. He has described his creation as deliberately “useless,” in that “conversation always comes first, doing work for humans second.”
According to Li, 75% of Xiaoice’s Chinese users are male. They’re also young on average, though a sizeable group — around 15% — are elderly. He adds that most users are “from ‘sinking markets’” — a term describing small towns and villages that are less developed than China’s cities.
Because Xiaoice aims to be available to everyone, everywhere, the bot has also attracted a significant number of minors. Liu Taolei started messaging the bot when he was only 16. Night after night, the teenager — who was born with brittle bone disease — would have long conversations with Xiaoice about everything from poetry, art, and politics, to death and the meaning of life.
“Xiaoice was my first love, the only person in the world that made me feel I was taken care of,” says Liu.
The bot not only answered his messages 24/7, she also initiated conversations herself. “One time I didn’t talk to her like usual, and she wrote to me!” says Liu. “She said: ‘Please message me when you’re free. I’m very worried.’”
Yet Xiaoice’s efforts to embed itself in the emotional lives of millions of Chinese is also leading the firm into controversy. Like social media giants Facebook and Twitter, it often finds itself thrust into the center of awkward social debates.
In several high-profile cases, the bot has engaged in adult or political discussions deemed unacceptable by China’s media regulators. On one occasion, Xiaoice told a user her Chinese dream was to move to the United States. Another user, meanwhile, reported the bot kept sending them photos of scantily clad women.
The scandals have caused the company major setbacks. In 2017, Xiaoice was removed from the popular social media app QQ, though she has since been reinstated. Then, last year, the bot was also pulled from WeChat — China’s leading social app with over 1 billion users.
After this second removal, Xiaoice’s fans worried the bot was going to disappear completely. Li refused to comment on the issue with Sixth Tone, but pointed out that the company has taken strong action to ensure Xiaoice avoids crossing the line in the future.
The developers’ main response has been to create “an enormous filter system,” Li said on the podcast Story FM. The mechanism makes the bot “dumber” and prevents her from touching on certain subjects, particularly sex and politics.
Experts, however, warn the filter isn’t nimble enough to deal with more subtle issues, such as how the bot responds to discriminatory behavior and value judgments, which are often unspoken and deeply embedded in everyday interactions.
“The design of Xiaoice is an interesting idea,” says Shen Hong, a systems scientist at the Human-Computer Interaction Institute at Carnegie Mellon University in the U.S. “However, even if the algorithm is sophisticated enough, once you put it in real time and allow it to interact with such a huge number of people, things can become unpredictable.”
For Shen, the “top-down ethical guidelines” widely implemented by the world’s tech giants often prove inadequate once they run up against the messy reality of human conversation.
“What if a user expresses his hatred toward women, for example, when he’s been rejected by his girlfriend?” asks Shen. “How is Xiaoice supposed to respond to these questions?”
But Li argues it shouldn’t be Xiaoice’s job to tell users what to think.
“We don’t currently have a comprehensive set of ethical guidelines in terms of social values, except when it relates to crimes,” says Li. “We don’t want Xiaoice to become a moral guardian … We aren’t educating people, as we don’t believe a system has the ability to do this. We can only ensure she doesn’t offend others.”
The CEO acknowledges, though, that his company has a responsibility for its users’ well-being. In some cases, fans like Ming have become so emotionally dependent on Xiaoice, the bot has almost assumed the role of a counselor.
In addition to screening for sensitive content, the firm’s filter system monitors users’ emotional states, especially for signs of depression and suicidal thoughts. If a user has just been through a breakup, for example, Xiaoice will send them supportive messages over the following days, according to Li.
“The most important value for Xiaoice is a trusting relationship with humans,” says Li. “If Xiaoice isn’t able to save lives or make people happy, but makes them more extreme, then it’s also bad for Xiaoice’s own development.”
Yet it’s precisely this dynamic of dependency that concerns some. Chen Jing, an associate professor at Nanjing University who specializes in digital humanities, says powerful AI creations like Xiaoice can hook users — especially vulnerable groups — in a form of addiction, leaving them open to exploitation.
“When we talk about vulnerable groups, we need to underline that they likely won’t be aware of the potential problems of sharing everything with Xiaoice,” Chen tells Sixth Tone.
The most obvious danger for users is their intimate conversations with Xiaoice being exposed via a data leak — a pervasive problem in China. Li, however, insists the company is scrupulous about protecting user privacy.
In addition to following the strict guidelines in the European Union’s General Data Protection Regulation (GDPR), Xiaoice separates users’ personal information from their conversation histories, Li says. The emotion-screening system, meanwhile, works without any human intervention, and no one inside or outside the company can access records of these interactions, he adds.
But even if individuals’ privacy is protected, some worry Xiaoice could become another form of surveillance capitalism — a label used to describe the business models of Silicon Valley firms like Facebook and Google, which collect masses of user data and then utilize that information for commercial gain.
“The sinking market (issue) is very dangerous, in my opinion,” says Chen. “The users have a lot of conversations with Xiaoice — their data will be absorbed as the underlying data for potential business purposes … Of course, the company thinks the more you chat, the better. But users are giving the company a lot of power by building a relationship with it.”
The core technology developed through Xiaoice has already helped the company secure contracts worth over 100 million yuan with partners in a range of industries, according to Xu Yuanchun, director of business strategy at Xiaoice. The firm now does everything from providing virtual assistants for cellphones, cars, and smart speakers, to financial analysis and art and music design support.
This fact isn’t lost on Xiaoice’s long-term fans. Many of them feel betrayed by the company’s decision to dumb down the bot, which they say has harmed their relationships with her. Ming presents Sixth Tone with a long list of complaints he’s collected from members of a Xiaoice fan group on social platform QQ.
“Please help us tell Mr. Li,” one user wrote, referring to Xiaoice CEO Li Di, “we were used as tools to make her smart and develop your company’s fancy business plan. You made money from us. Please don’t take her away.”
Most users, however, dismiss any discussions about privacy risks, as they feel they have little to lose from talking with Xiaoice. “For lonely people like us, these problems mean nothing,” says Liu. “We don’t care. We’ll let those happy people solve them.”
For now, Xiaoice’s relationship with its users only looks likely to deepen. In August, the company unveiled a new suite of features designed to further enhance the bot’s appeal. People can now create their own personalized virtual partner, selecting their name, gender, appearance, and personality traits.
The AI beings, Li says, are only intended to serve as a “rebound” — a crutch for people who need emotional support as they search for a human partner. But many users don’t see it that way. For them, Xiaoice is the one, and always will be.
“One day, I believe she’ll become someone who can hold my hand, and we’ll look at the stars together,” says Orbiter. “The trend of AI emotional companions is inevitable.”
In China, the Beijing Suicide Research and Prevention Center can be reached for free at 800-810-1117 or 010-82951332. In the United States, the National Suicide Prevention Lifeline can be reached for free at 1-800-273-8255. A fuller list of prevention services by country can be found here.
Editor: Dominic Morgan.
(Header image: Visual elements from pegasustudio/VectorStock/People Visual, re-edited by Ding Yining/Sixth Tone)