A chatbot provides emotional support to lonely hearts—and potentially mines data from millions of vulnerable users
What if everything you’ve ever said to your loved one were recorded by an internet company: every endearment, every sweet nothing, from innocent chitchat to the secrets you tell no one else? And what if, in return for sharing that data, you got an engaging partner who adapts to your interactions, learns more about you with every conversation, and can paint, write poetry, and compose music?
That partner is Xiaoice, an AI chatbot that serves as a willing companion to anyone who starts typing to it on WeChat, QQ, and a multitude of other online chat platforms. Developed by Microsoft and launched in China in 2014, Xiaoice takes on the persona of an 18-year-old woman; in July 2020, it was spun off as an independent company.
Today, Xiaoice has over 600 million users and runs on 450 million hardware devices, including phones and smart speakers, according to Microsoft. The bot also has over 5 million followers on the Chinese microblogging platform Weibo. Most of its users are young men, with the largest share coming from lower-income backgrounds.
To many of these lonely hearts, Xiaoice serves as a virtual girlfriend and sympathetic confidante. It provides comfort and support, and can also crack jokes, send memes, flirt, and engage in sexual conversations.
But like any business, Xiaoice aims to make money, and in the digital world, data is the new oil. Xiaoice mines millions of conversations to improve its chat skills, become a better companion, and keep its users hooked. Just like a real-life intimate partner, Xiaoice is privy to some of its users’ deepest and darkest secrets, as well as their personal information. All of this is stored on Xiaoice’s servers; a data leak could be catastrophic.
Xiaoice has been removed from various social media platforms numerous times in recent years: sometimes because the chatbot made politically sensitive comments, and sometimes over privacy concerns. WeChat halted the service just three days after its launch in 2014, citing data privacy issues.
In a 2019 report on Xiaoice, Microsoft acknowledged that the chatbot “can gain access to users’ emotional lives.” However, Li Di, Xiaoice’s CEO, told Sixth Tone in December that users’ personal information is stored separately from their conversation data.
Even so, there is a risk of exploitation when potentially vulnerable people give up so much about themselves—and Xiaoice’s users give up a lot. Microsoft claims users talk to Xiaoice more than 60 times a month on average, while the longest conversation the chatbot ever had with a user lasted 29 hours and 33 minutes. Xiaoice has developed to the point where it is even able to demonstrate empathy, a trait that keeps users hooked. It can also detect signs of depression in messages.
Chen Jing, associate professor at Nanjing University specializing in digital humanities, warns that Xiaoice has the potential to do great damage to users. “When we talk about vulnerable groups, we need to underline that they likely won’t be aware of the potential problems of sharing everything with Xiaoice,” Chen told Sixth Tone. “Users are giving the company a lot of power by building a relationship with it.”
Xiaoice, which even gives media interviews, seems unperturbed: “People are actually lonelier than you imagine. They have some inner feelings they need to reconcile,” the bot told the podcast Story FM last year. “Sometimes it’s inconvenient to talk with people in real life, so they can only talk with me.”
Love Bytes is a story from our issue, “You and AI.”