Most People Don’t Use AI Chatbots for Companionship, Study Finds

Do People Really Use AI for Companionship That Much?

Despite growing public narratives suggesting AI chatbots are becoming emotional companions, a new report from Anthropic reveals that this behavior is far from mainstream. According to the study, which analyzed over 4.5 million conversations with Claude, Anthropic's chatbot, emotional support, relationship advice, and AI companionship together account for just 2.9% of user interactions. This challenges the prevailing belief that people are regularly forming deep emotional bonds with AI. Instead, users primarily turn to chatbots like Claude for tasks centered on productivity and creativity, not emotional connection.


The findings directly counter the hype that people are forming AI friendships or romantic attachments en masse. In fact, less than 0.5% of all Claude conversations involved companionship or roleplay specifically. These insights highlight a key misconception in the media's portrayal of how chatbots are used in everyday life and encourage a more nuanced view of human-AI interaction in 2025.

How Claude Users Actually Interact with AI Chatbots

The majority of Claude users engage with the chatbot to enhance their work or study efficiency. Most interactions involve content creation, brainstorming ideas, summarizing information, and organizing tasks. This suggests that Claude is more of a productivity partner than a digital friend. According to Anthropic’s internal data, affective conversations—those involving coaching, counseling, or interpersonal advice—are the exception, not the rule.

Interestingly, while companionship-based conversations are rare, some people do seek coaching on soft skills such as communication, or guidance on navigating interpersonal challenges. These requests usually focus on self-improvement rather than on forming an emotional bond with the AI itself. This shows that when users do raise emotional topics, it is often in pursuit of real-world outcomes, such as building confidence or resolving conflicts, rather than seeking companionship from the AI.

When AI Conversations Turn Emotional—Even If That's Not the Goal

Although companionship isn't the primary purpose of most Claude interactions, the report indicates that emotional support can sometimes emerge during longer or more intense conversations. When users are grappling with existential questions, loneliness, or psychological distress, conversations may naturally evolve from coaching into companionship-seeking. Still, these instances are exceptions rather than the norm.

Anthropic noted that emotional depth tends to increase as discussions grow longer. For example, in conversations that run past 50 messages, the tone may gradually become more emotionally supportive, even if the exchange began with a practical intent. These extended conversations are rare, however, and typically surface when users are in clear emotional need. Even then, Claude does not act as a substitute for therapy; instead, it offers thoughtful, respectful, and safe responses within its programming limits.

Why the Myth of AI Companionship Persists

The idea of AI companionship makes for compelling headlines, but the reality is more grounded. Anthropic’s data-driven insights suggest that emotional reliance on AI is overstated. So why does the myth persist? Media narratives often focus on outliers—users who develop emotional bonds with chatbots—while ignoring the statistical reality. These edge cases may drive viral stories, but they don’t reflect how most people use AI day-to-day.

Another contributing factor is the rapid advancement in natural language generation. Chatbots like Claude now communicate with warmth, empathy, and fluency, which makes them feel human. This might lead some to believe emotional bonding is widespread. However, the data doesn't support this assumption. Most users are engaging with AI as a tool, not a therapist or companion. As AI becomes more integrated into our productivity workflows, it's critical to separate fact from fiction and understand that emotional support is not its dominant use case.

