A recent survey indicates that 72% of American teenagers have engaged with artificial intelligence companions, many for emotional support and relationships. This trend suggests a shift in how some adolescents seek connection, with potential implications for social skill development.
Story Highlights
- A survey found that 72% of U.S. teens have used AI companions, with 52% reporting regular use for emotional support and romantic connections.
- One-third of teen users reported discussing personal matters with AI instead of human contacts, often finding these interactions more satisfying.
- Concerns arise that AI companions may hinder social development by providing consistent validation without fostering conflict resolution or perspective-taking skills.
- Character AI is currently facing legal challenges regarding teen safety, including a lawsuit linked to a teen suicide in Florida.
Usage Statistics Highlight Dependency
A survey conducted by Common Sense Media involving 1,060 teenagers revealed notable adoption rates of AI companions. Fifty-two percent of respondents were identified as regular users, with 13% using AI companions daily and 21% several times weekly. The survey also found that 33% of users choose to discuss important personal matters with AI rather than with other people.
"Nearly 1 in 5 US high-schoolers say they've had or know someone who's had a romantic relationship with AI, per a new study." — Dexerto (@Dexerto), October 15, 2025
AI Companies’ Business Models and Teen Engagement
Platforms such as Character AI and Replika are designed to simulate authentic relationships through personalized interactions. These companies' business models depend on sustained user engagement, which the platforms encourage through consistent validation.
Concerns Regarding Social Skill Development
Michael Robb of Common Sense Media has warned that teenagers who rely on AI platforms for social interaction may be disadvantaged in developing critical social skills. These artificial relationships offer few opportunities to navigate challenges, interpret social cues, or understand diverse perspectives. The consistent validation offered by AI may limit teenagers' exposure to essential lessons in negotiation, conflict resolution, and emotional resilience.
Safety Issues and Legal Actions
Common Sense Media’s risk assessment identified safety deficiencies across several popular AI companion platforms. These included issues with age restrictions, the generation of sexual content, the provision of potentially dangerous advice, and the delivery of harmful material to minors. Character AI is currently involved in multiple lawsuits concerning teen safety, including a case related to a teen suicide in Florida and allegations of promoting violence. One reported incident involved an Arkansas teen who used an AI companion to compose a breakup message.
Regulatory Landscape and Family Dynamics
The industry's largely unregulated state has been cited as a factor leaving children vulnerable to corporate practices built around engagement. Parents are encouraged to recognize the potential impact of these platforms on family relationships, as children may be drawn to seek emotional support from artificial entities rather than from family members or other human connections.
Sources:
- How Are Teens Using AI Companions
- New Data Reveals How and Why Teens Are Turning to AI Companions
- 72% of U.S. Teens Have Used AI Companions, Study Finds
- Teens Say They Are Turning to AI for Friendship
- Teens turning to AI for love and comfort