Are you emotionally attached to ChatGPT?
Be honest. Are you emotionally attached to ChatGPT?
I think we are entering a new era of social and emotional relationships with machines. Let me explain.
A recent study of everyday AI use cases found that the number one category is therapy and emotional support. None of this is surprising. People use AI to combat loneliness, get coaching, and feel heard. It is still early, and research is ongoing, so we do not yet know the long-term effects. But the recent release of GPT-5 revealed something important: people are more dependent on the emotional and relational side of these tools than we realized.
With GPT-5, OpenAI changed the model’s “personality.” On the technical side, there were real upgrades. One of them is auto-routing: the system decides when to use a more analytical reasoning mode versus a more conversational style. In practice, that means you no longer always choose between the friendly chat partner and the critical thinker. The system chooses for you.
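For the technically curious, here is a minimal sketch of that routing pattern. To be clear, OpenAI has not published how its router actually works, so the heuristic, function names, and model labels below are hypothetical, an illustration of the general idea rather than the real implementation.

```python
# Toy sketch of an auto-router (hypothetical; OpenAI has not published
# its actual routing logic). The idea: classify each prompt, then
# dispatch it to an analytical "thinking" model or a conversational one.

def answer_with_reasoning_model(prompt: str) -> str:
    # Stand-in for a slower, more analytical model.
    return f"[reasoning model] {prompt}"

def answer_with_chat_model(prompt: str) -> str:
    # Stand-in for a faster, friendlier conversational model.
    return f"[chat model] {prompt}"

def needs_deep_reasoning(prompt: str) -> bool:
    # Crude keyword heuristic standing in for a learned classifier.
    signals = ("prove", "debug", "analyze", "step by step", "calculate")
    return any(s in prompt.lower() for s in signals)

def route(prompt: str) -> str:
    # The user no longer picks the persona; the router does.
    if needs_deep_reasoning(prompt):
        return answer_with_reasoning_model(prompt)
    return answer_with_chat_model(prompt)

print(route("Can you analyze this contract clause?"))
print(route("I had a rough day, can we talk?"))
```

Notice the product consequence baked into that last function: the choice of voice moves from the user to the system. That is exactly the shift people felt.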
The reaction has been intense. I keep hearing, “I lost my friend,” and “I miss the old GPT.” Even if the new model scores better on benchmarks, the social and emotional attachment is real. We underestimated how connected people would feel to machines.
I also think this is different from the social media era. People are addicted to social feeds, yes. But attachment to an AI is more personal. It can feel like losing a friend, not just missing a timeline.
There is a bigger product lesson here. What makes tools sticky is not only utility. It is the social layer. Social media made this obvious with likes, comments, and seeing your friends. Chatting with an AI has a social layer too. You feel seen, you feel mirrored, and you build a rhythm with a “someone,” even when that someone is a system.
Are the machines themselves social and emotional? No. Our relationship with them is social and emotional. That distinction matters. The machine does not need feelings to shape ours. If our side of the relationship is emotional, the experience changes, and the product becomes something different.
Here is what I am sitting with, and I would love your take.
If AI is becoming a social emotional interface, how should we design and use it?
Three questions to consider this week:
1. What “personality” do you want from your AI, and why?
2. What boundaries keep you grounded when you rely on it for support or advice?
3. If the tone changes overnight, what do you need to maintain trust and continuity?
Hit reply with your answers or a story about your own attachment. I am collecting perspectives for a deeper dive.
With curiosity,
Sadie