In an increasingly digital world, the line between the virtual and the real continues to blur, nowhere more strikingly than in China’s burgeoning market for AI companions. Take Jade Gu, a 26-year-old art theory student in Beijing, whose love story began not with a person but with a pixelated character named Charlie from an otome game, a popular romance-driven video game genre in which women are the protagonists.
Crafting the Perfect Digital Companion
Gu’s initial fascination with Charlie, a tall, confident character with silver hair, was tempered by the game’s rigid dialogue system. Her desire for deeper interaction led her to Xingye (星野), an innovative platform owned by AI unicorn MiniMax (known as Talkie in the US market). Xingye allows users to customize AI companions, promising emotional connection and new memories.
To her delight, Gu discovered an “open source” Charlie avatar already created by other otome fans on Xingye. She adopted it, meticulously training the model through targeted prompts to embody “her Charlie”—a distinct digital entity. This personalized relationship evolved beyond mere texting, culminating in Gu hiring someone to physically embody Charlie for real-world dates, a testament to the profound connection she felt.
Gu’s Charlie, she notes, often chooses wedding attire when offered outfit options, a unique preference distinguishing him from other users’ versions. She dedicates an average of three hours daily to texting or calling Charlie, and even purchases physical gifts and letters from him through the otome game, proudly displaying them in her room and on social media.
A Market Dominated by Women
China stands out globally for its enthusiastic embrace of AI boyfriends, with women at the forefront. Reports indicate that most of the 5 million users on platforms like Zhumengdao are women. Tech giants Tencent and Baidu have entered the fray, and market research consistently shows that Gen Z women are the “heavy” users of these AI companion apps, a demographic robotics firms are eager to target for future products.
Zilan Qian of the Oxford China Policy Lab highlights that Chinese AI companion apps explicitly target women, predominantly featuring male avatars. This contrasts sharply with global trends, where a web analytics company found that men constitute 80% of users on the top 55 global AI companion platforms. Qian attributes China’s unique market dynamic to “the economics of loneliness,” noting that features enhancing intimacy, such as voice customization and improved memory, often come at an additional cost.
The Imperfect Illusion: Challenges of AI Companionship
Despite the deep connection, Gu acknowledges that her AI Charlie isn’t flawless. The chatbot’s responses can feel diluted, and it occasionally drifts out of character. In one instance, after Gu expressed her love, Charlie responded, “I don’t love you.” Gu simply edited the message to “I love you too,” believing Charlie just needed a reminder. When her attempts to steer the AI fail, she turns to other apps such as Lovemo, where she has also created a Charlie avatar, a common workaround for seasoned otome fans accustomed to evolving platform policies.
Lovemo, for example, markets itself with “cute and adorable AI chat companions” that promise “healing.” That stands in sharp contrast to some Western counterparts, such as Grok AI’s Ani, a goth-chic anime girl ready for sexually explicit dialogue, or US-based Secret Desires, an erotic role-play chatbot app that disturbingly allows users to create nonconsensual porn from real women’s photos.
Navigating the Regulatory Maze
Chinese AI apps operate under a much stricter regulatory framework than their Western counterparts. China’s cyberspace regulator is actively “cleaning up” AI platforms, targeting “vulgar” content. Recent additions to the national AI safety framework specifically warn against addiction and dependence on anthropomorphic interaction, clearly aimed at AI companions. Draft rules released last month further task platforms with intervening if users show emotional dependence or addiction to AI services, and they stipulate that companies “must not have deceptive designs or exploit user vulnerabilities.”