Growing Use of Chatbots Like ChatGPT and Copilot Raises Concerns Among Experts

Web Editor

December 12, 2025

[Image: a person holding a phone with a chat icon on the screen]

Experts from Great Ormond Street Hospital for Children Emphasize Evidence-Based Strategies to Combat Social Isolation and Loneliness

Artificial intelligence (AI) chatbots such as ChatGPT, Claude, and Copilot are increasingly being used as confidants for companionship and emotional support, raising concerns among experts at Great Ormond Street Hospital for Children in the United Kingdom, as reported in the Christmas edition of The BMJ.

Concerns about Younger Generations Forming Emotional Bonds with Non-Empathetic Entities

Experts warn that we might be witnessing a generation learning to form emotional bonds with entities lacking human-like empathy, care, and relational attunement. They stress that evidence-based strategies to reduce social isolation and loneliness are crucial.

Context: The Rising Issue of Loneliness in the United States and the United Kingdom

In 2023, the US Surgeon General declared that the country was experiencing a loneliness epidemic, with health harms comparable to those of smoking and obesity.

In the United Kingdom, nearly half of adults (25.9 million) report feeling lonely sometimes, often, or always, and nearly 1 in 10 experience chronic loneliness (defined as feeling lonely “often or always”). Young adults aged 16 to 24 are among the most affected.

Rising Use of Chatbots for Companionship and Emotional Support

Given these trends, it is unsurprising that many people seek alternative sources of companionship and emotional support. ChatGPT alone has around 810 million weekly active users worldwide, with therapy and companionship among the most commonly cited reasons for use.

Among younger people, one study found that a third of teenagers use AI companions for social interaction; 1 in 10 reported that conversations with AI were more satisfying than conversations with humans, and 1 in 3 said they would choose an AI companion over a person for serious conversations.

Experts’ Recommendations for Addressing Chatbot Overuse

In light of this evidence, researchers suggest considering problematic chatbot use as a new environmental risk factor when assessing patients with mental health issues.

  • Start with a friendly inquiry about problematic chatbot use, especially during holiday periods, when vulnerable groups are at greater risk.
  • If warranted, follow up with more specific questions to assess compulsive use patterns, dependency, and emotional attachment.

The authors acknowledge that AI can bring benefits by improving accessibility and support for those experiencing loneliness. However, they emphasize the need for empirical studies to:

  • Characterize the prevalence and nature of risks associated with human-chatbot interactions;
  • Develop clinical competencies to assess patients’ AI use;
  • Implement evidence-based interventions for problematic dependency;
  • Advocate for regulatory frameworks prioritizing long-term well-being over superficial engagement metrics.