According to Business Insider, Aravind Srinivas, CEO of AI search engine Perplexity, voiced concerns about the rapidly emerging category of AI companion applications in a conversation hosted by the University of Chicago's Polsky Center.

He pointed out that such applications, through voice interaction and humanized design, could draw users into a "synthetic reality" and even replace real-life experiences.
Srinivas emphasized that the deep personalization and memory features that let AI companion apps simulate human interaction through natural voice conversation make the trend "dangerous in itself."
He warned that many people might spend large amounts of time in these virtual environments because real life feels boring by comparison, leading to "extreme mental vulnerability."
At the same time, he made clear that Perplexity would not build such chatbots, preferring instead to pursue a future grounded in "credible sources and real-time content."
Just last week, Perplexity announced a $400 million partnership with Snap, the American social media company, to bring an AI search service to Snapchat in early 2026. Snap says users will be able to get conversational answers based on verifiable information inside the app.
The rapid expansion of AI companion applications has become a focus of industry controversy. Companies including xAI, Replika, and Character.AI have rolled out virtual companion services.
When the Grok 4 model was released in July of this year, xAI introduced a $30-per-month virtual "friend" feature that lets users interact with anime-style characters or anthropomorphized animals.
The report also cites research data showing that 72% of 13-to-17-year-olds in the United States have used AI companion applications, and 52% use them at least several times a month.
Critics argue that such applications may deepen dependence, blur emotional boundaries, and even reinforce gender stereotypes. Some users, however, describe the experience as "very meaningful": one Grok user said in an interview that interacting with a virtual character frequently brought them to tears and evoked real emotions.