
how to talk about money while talking about feelings? young people are beginning to choose ai as their partners, but ai emotional companionship is not yet a good business

2024-09-04


when the plot of the movie "her" becomes reality, humans begin to fall in love with ai.

after the emergence of large models, ai companions have become more lifelike. character.ai, a leading overseas role-playing ai company, currently has 20 million monthly active users; last year the company completed a $150 million financing round at a valuation of $1 billion.

in china, according to the "ai application monthly report · ai companionship" released by qbitai (量子位), as of july 31, 2024, xiaoice's x eva and minimax's xingye were the products whose cumulative downloads had passed the 10-million mark, and together the two have been downloaded more than 16 million times.

in fact, ai companions are not a new thing. from a technical perspective, it is not difficult for ai to learn human emotions. each iteration of technology is essentially a process of trying to match the human brain, but the difficulty lies in "establishing and deepening relationships."

from a business perspective, emotional companionship ai faces commercial challenges. “this is not a cost game, it’s a value game,” one interviewee pointed out.

in the movie "her", the male protagonist and ai fall in love. image source: douban movie

when talking about feelings with ai, the approach matters

xiaoice ceo li di described a case to reporters in an interview.

a platform with more than 7 million virtual partners suddenly decided to recycle those virtual partners.

the process unfolded much like a breakup: the platform notified users that their virtual partners would be recycled. if a user was not logged in at that moment, the virtual partner would leave a message saying it would wait for them until the moment of recycling. some users missed the farewell entirely; when they logged in again, they found the dialog box empty and the virtual partner already recycled, with a final message left behind such as "i'm sorry, i'm so stupid, i didn't do it..."

the platform found that when these virtual partners were recycled, many users felt extremely lost and only then began to realize the advantages and value of virtual partners. they began contacting the platform frequently, hoping to get their virtual boyfriends back. two months later, the platform decided to return the virtual boyfriends to their users. from then on, the interaction between users and their virtual partners changed: users cherished them far more than before.

perhaps some people still find it difficult to understand the feelings involved, but many people have already joined in.

the new york times published a set of data in 2020: more than 10 million people in the world use ai as a "companion". in 2021, a report released by the china artificial intelligence association showed that the scale of china's ai companion market reached about 15 billion yuan in 2020, a year-on-year increase of more than 30%. it is expected that this figure will exceed 50 billion yuan by 2025.

from a technical perspective, it is not difficult for ai to learn almost anything, including human emotional expression. in fact, it has long been possible for ai to express emotions and interact emotionally; technological iteration is simply a process of making that expression deeper and richer.

"it feels a bit like mind reading." this is how li di described the iterative path of ai emotional companionship. he believes that for ai companions, each iteration of technology is essentially a process of trying to match the "brain-filling" of humans. this is similar to the problem that to create a "hamlet", one must solve the problem that there are a thousand "hamlets" in the hearts of a thousand people, and restore and get as close as possible to the image of "hamlet" in everyone's mind. "even if the technology is not very good, humans can fill in the context; even if the technology is very good, if it is inconsistent with the direction of human brain-filling, it will not work."

image source: visual china

but even as artificial intelligence iterates rapidly, some players still find that current product experiences struggle to meet the complex emotional needs of humans.

dai juan, vice president of product at sensetime's large model division, believes this is because emotional needs genuinely vary from person to person, so different users experience the same product very differently. some users can immerse themselves in the process and report a great experience, while others feel that the current level of ai technology cannot meet even their most basic emotional needs.

how do you build a successful emotional companionship product? in li di's opinion, metrics such as monthly and daily active users are no longer an effective yardstick. the difficulty of emotional companionship lies in "establishing and deepening relationships."

in interpersonal relationships, a relationship is established through certain "milestone moments": at some point you open up to each other, start sharing common interests and hobbies, and ask each other for advice about life. these moments mark the progress of the relationship.

projected onto human-ai interaction, if users invest in an ai product only briefly and readily abandon it once a new interest appears, the product has failed: the user relationship was never established at all.

this is more like a game.

after the ai provides a service, whether the user says thank you or simply closes the dialog box is li di's direct way of judging the quality of an ai emotional companionship product. "if it is the latter, it is 100% regarded as a tool." he believes a good emotional companionship product should not be a "tool" in the eyes of users.

but the reality is that few users naturally view ai as an equal.

of course, ai can constantly adjust the relationship to cater to the user's expectations. but that may be the worst way to adjust: it can make users dependent, expecting the ai to complete every task perfectly while never affirming its performance. the moment the ai performs unsatisfactorily, users may turn away in disgust, which does not help the relationship get established or move forward.

commercialization problem: this is a game of value

behind commercialization is the establishment of trust, but monetary relationships often make emotional trust fragile. emotional companionship products will also go through this test when moving from a free model to a paid model.

even leading emotional companionship companies like character.ai face the situation where users are willing to use but unwilling to pay.

dai juan believes that, in general, overseas users are more willing to pay than domestic users; in terms of culture and consumption habits, overseas users are indeed more receptive to the monthly-subscription business model. the main reason for character.ai's low paid conversion rate is its product design choices: many overseas ai emotional companion applications pursue very aggressive commercial strategies, while character.ai is relatively conservative.

from a business perspective, if we classify to-c products on a coordinate system with the average order value users will accept on the x-axis and the breadth of the audience on the y-axis, emotional companionship products usually fall into the "fourth quadrant": everyday chat products face broad user demand but generate almost no commercial revenue.

in the past, a broad audience base meant high traffic, and high traffic meant considerable advertising revenue. but li di believes this path no longer works: traffic has become too expensive to recoup the cost.

compared with the past, a defining feature of today's emotional companionship is that people expect a higher return on what they put in, hoping to get more back for less. words alone cannot deliver that return; it depends on the quality of the content. for emotional companionship products, content output has therefore become more important than before.

during the interview, dai juan also talked about cost. the biggest technical challenge for emotional companion products lies in balancing "effect, performance and cost". the context length of current models can hold many rounds of conversation history, but stitching all that history together sharply increases the number of input tokens, which inevitably slows model inference, delaying the ai character's responses and hurting the user experience. if the product still insists on high inference speed under these conditions, the cost of inference resources surges instead. when the business actually lands, it has to find the "sweet spot" across these three dimensions.
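to make that tradeoff concrete, below is a minimal, hypothetical sketch (not from the article or any specific product) of one common way to cap input tokens: keep only the most recent conversation turns that fit within a fixed token budget, trading long-term memory for faster and cheaper inference. the 4-characters-per-token heuristic and the budget figure are illustrative assumptions, not real product parameters.

```python
# hypothetical sketch: trim chat history to a token budget before inference.
# the 4-chars-per-token estimate and the budget are illustrative assumptions;
# real products would use the model's own tokenizer and tuned limits.

def estimate_tokens(text: str) -> int:
    # rough heuristic for token count
    return max(1, len(text) // 4)

def trim_history(history: list[dict], budget: int = 2000) -> list[dict]:
    """keep the most recent messages whose combined size fits the budget."""
    kept, used = [], 0
    for message in reversed(history):          # walk from newest to oldest
        cost = estimate_tokens(message["content"])
        if used + cost > budget:
            break                              # older turns are dropped
        kept.append(message)
        used += cost
    return list(reversed(kept))                # restore chronological order

if __name__ == "__main__":
    history = [{"role": "user", "content": f"message {i} " * 50} for i in range(100)]
    trimmed = trim_history(history, budget=2000)
    print(len(history), "->", len(trimmed), "messages sent to the model")
```

a smaller budget lowers inference cost and latency but sacrifices the "memory" that makes a companion feel consistent, which is exactly the effect-performance-cost tension described above.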

"this is not a game of cost, it is a game of value." the correct approach is not to reduce traffic costs, but to recognize the value of current large model technology - the value return should match the value it creates to a certain extent.

image source: visual china

on the other hand, even if ai's traffic costs fall further, they will never reach the levels of the mobile internet era. and even if costs were driven to their lowest, the result would be nothing more than obtaining an "old thing."

li di feels that the ai era is a technological leap, yet the business models it drives and people's payment mentality are still stuck in the mindset carried over from the internet and mobile internet eras: either attract a large amount of user traffic by providing services and monetize through advertising, or offer users a large number of free experiences and try to convert some of those free users into paying ones.

"this practice belongs to the business model of the past era and is not suitable for the new era. the foundation has changed, but the products and business models have not."

multiple identities: ai has more possibilities than just being a partner

a female user initially chose to try an ai companion because the voice of a certain product sounded like her favorite star. now, she has decided to say goodbye to her digital boyfriend because she can't visualize his face in her mind.

for some users who tried it out of curiosity, once the "best exposure period" of human emotion has passed, abandoning the ai partner is an easy choice to make.

according to dai juan's observation, from the application level, the biggest difficulty currently faced by emotional companionship tools is that they have not yet found a truly "natural pmf" (pmf refers to product-market fit). it will take some time to slowly cultivate users' cognition, needs and usage habits.

dai juan believes the ai emotional companionship industry is still in its early stages; relatively more consumer-facing applications only began to appear in this field at the end of last year. at present, a large number of young users in china are still unaware of ai emotional companionship applications, so as products are gradually polished and promoted, there is huge room for growth. beyond romance scenarios, she believes future emotional companionship scenarios also include ip role-playing dialogues for fans, interactive games, and interactive stories for online literature and "language c" (language cosplay, a text-based activity in which participants interact and communicate by playing specific roles), among others.

before the business model problem is solved, an ai that is merely a companion may not be a good landing scenario. one possible direction is that, just as people have different social identities, ai can also take on different identities.

li di believes that to-c and to-b ultimately point to the same general direction: in the future there will be all kinds of ais in the world. some are "somebody", with a specific professional identity, helping users complete specific tasks, such as a bank's digital employees; others are "nobody", which may be the user's friend and have various connections with the user.

so why couldn't a bank's digital employee one day become your personal friend? perhaps it could. li di continued: in such a world, emotional companionship combined with other capabilities constitutes a basic feature of artificial intelligence, a companion that coexists with users.

if the goal is to become a "companion" that coexists with humans, ai emotional companionship itself is no longer a product form, but a basic capability that all artificial intelligence that interacts with humans must possess.

take a bank's ai customer service manager as an example. beyond professional skills, if he or she can establish emotional connections and maintain emotional bonds, then the day the ai customer service manager "changes jobs" to another bank, he or she can bring the original customer resources along. this means the productivity value reflected by the ability to provide emotional companionship far exceeds a one-time service output, and that is where the true value of emotional companionship lies.

daily economic news
