Studies from OpenAI and MIT have been exploring how chatting with ChatGPT may affect the emotional experience of its users. The recent findings, which draw on over 40 million ChatGPT interactions and four weeks of observing human participants, offer some interesting takeaways.

The MIT study found that chatbot interactions, whether through text or voice, were associated with higher levels of loneliness, reduced socialization, and emotional dependence. Participants who already trusted the chatbot, or who were prone to emotional attachment in human relationships, were more likely to experience these effects. While the impact was less severe in ChatGPT's voice mode, the study found that both personal and general conversations could lead to emotional dependence and feelings of loneliness.
OpenAI's study, however, found that emotional conversations with ChatGPT were the exception rather than the norm. The researchers noted that such interactions were prominent among 'only a small group of the heavy Advanced Voice Mode users' studied. In other words, while the MIT study found links to adverse emotional effects, the users showing them made up only a small subset of the broader population.
As a frequent user of ChatGPT's advanced voice mode, I find it interesting to see a quantitative breakdown of how these interactions are affecting behavior on a macro scale. Advanced voice currently delivers an interactive experience that's impressively close to human in both tone and clarity. But anecdotally, ChatGPT's tendency to lean into reflective mirroring (in essence, telling you what you want to hear) remains one of its key shortcomings. Even if a chatbot 'knows everything,' one that constantly molds itself to subtle cues in your language is never going to replace an honest, human perspective.
As LLM-based technology spreads across multiple professions, whether it's powering video game NPCs or being used in clinical and therapeutic settings, research will continue to accumulate on the effects of prolonged, emotionally charged use. For now, users should keep these effects in mind when applying the tool in personal or professional contexts.