A new study by OpenAI and MIT found that people who talk to ChatGPT daily using its voice feature sometimes feel like the AI is their friend. Most people use ChatGPT for help with homework, work, or quick answers. However, a small group of users who chat in voice mode develop emotional connections, even though they know it is just a machine.
The researchers analyzed nearly 40 million conversations. They found that most users keep things simple, like asking for recipes or math help. However, people who spend hours talking with ChatGPT’s voice feature start to feel attached. Some even call ChatGPT a “friend” or share personal memories with it.
How Scientists Studied ChatGPT Users
The study used two methods. First, OpenAI analyzed millions of anonymized chats with automated tools, so no human read them and user privacy stayed protected. Second, MIT ran a trial with 1,000 people to test how they used ChatGPT. Some participants typed their questions, while others used voice mode.
In the MIT trial, some voice users talked to a version of ChatGPT that acted warm and caring, while others used one that stayed neutral. Researchers asked participants to share personal stories, solve problems, or chat freely.

What Happened to Voice Users?
Here’s what surprised the researchers: text users actually used more emotional words, like “happy” or “sad,” in their chats. Yet it was the voice users who talked daily who felt closest to ChatGPT. And although voice chats felt good at first, heavy use left some people feeling lonelier over time.
For example, personal conversations, such as talking about childhood memories, left users feeling lonelier but less dependent on ChatGPT. Meanwhile, users who asked for advice on money or jobs grew more reliant on it, even though those chats weren’t personal.
Why Voice Features Feel Different
The study suggests that hearing a human-like voice nudges our brains into responding as if we were talking to a real person. Because the brain is wired to respond to voices, even a synthetic one can feel soothing, and heavy users in both voice groups were more likely to form strong attachments. One user in the study said ChatGPT listens better than their friends do, and that they turn to it in moments of stress. Experts caution that relying on AI for emotional support can backfire by crowding out real human interaction.
What This Means for AI Companies
OpenAI says it takes steps to keep its AI from passing itself off as human. Even so, AI companies, including OpenAI, face growing scrutiny over users forming relationships with their technology and are being blamed for psychological harm. For example, another company, Character.AI, is in legal trouble because parents say their kids got too attached to its AI companions.
The study urges tech companies to protect users. Ideas include adding reminders that ChatGPT is not human and limiting how long people can use voice features. Researchers also want apps to suggest real human help, like a therapist, if someone seems too dependent on AI.

The Study’s Limits and the Future
The research has limits. It looked only at users in the United States, and people elsewhere may behave differently. Most chats were evaluated by automated tools, which may have missed subtle emotional cues. The researchers say more studies are needed to understand how relationships with AI develop over longer periods. Experts also recommend that parents and teachers talk with children about healthy ways to use the technology. And as AI voices become more lifelike, experts are urging companies to prioritize safety over flashy new features.
Balancing AI Help and Human Connection
ChatGPT is a remarkable tool for learning and work, and its voice feature shows how technology can stir emotions, not just answer questions. But real friendships grow out of shared experiences and trust, which take time and face-to-face contact. This research is a timely reminder to appreciate AI assistance while putting human relationships first. No machine can replace a hug or a laugh with friends and family. AI can support what we do, but our most meaningful experiences still happen with real people.