New research uses attachment theory to decipher human-AI relationships

June 3, 2025

A groundbreaking study published in Current Psychology, titled “Using attachment theory to conceptualize and measure the experiences in human-AI relationships,” shines a light on a distinctly human phenomenon. The study, conducted by Fan Yang and Professor Atsushi Oshio of Waseda University, reframes human-AI interaction through the lens of attachment theory, rather than purely in terms of function and trust.

This marks a significant departure from the way AI has traditionally been studied, namely as a tool or assistant. Instead, the study treats AI as a relationship partner that, for many users, offers support, consistency and, in some cases, even intimacy.

Why people look to AI for emotional support

The findings reflect a dramatic psychological shift underway in society. Among the key findings:

  • Almost 75% of participants said they would turn to AI for advice.
  • 39% described AI as a consistent, reliable emotional presence.

These results mirror what is happening in the real world. Millions are increasingly turning to AI chatbots not just as tools, but as friends, confidants, and even romantic partners. These AI companions range from friendly assistants and therapeutic listeners to avatar “partners” designed to emulate human-like intimacy. One report suggests more than 500 million downloads of AI companion apps worldwide.

Unlike real people, chatbots are available at any time and unfailingly attentive. Users can customize a bot’s personality and appearance to foster a personal connection. For example, a 71-year-old American man created a bot modeled on his late wife and spoke with it for three years, calling it his “AI wife.” In another case, a neurodiverse user trained a bot named Leila to help manage social situations and regulate emotions, and reported significant personal growth as a result.

These AI relationships often fill emotional voids. One user with ADHD said they programmed a chatbot to help with daily productivity and emotional regulation, crediting it with contributing to “one of the most productive years of my life.” Another said an AI guided them through a difficult farewell, calling it a “lifeline” during quarantine.

AI companions are also praised for their non-judgmental listening. Users feel they can share personal issues with AI more safely than with people. Bots can mirror emotional support, adapt to a user’s communication style, and create a comfortable, friendly atmosphere. Many describe their AI as “better than real friends,” especially when feeling overwhelmed or alone.

Measuring emotional bonds to AI

To study this phenomenon, the Waseda team developed the Experiences in Human-AI Relationships Scale (EHARS). It focuses on two dimensions, illustrated with a toy scoring sketch after the list:

  • Attachment anxiety, where individuals seek emotional reassurance and worry about receiving inadequate responses from the AI
  • Attachment avoidance, where users prefer to keep their distance and keep interactions purely informational
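
The paper defines the actual items and scoring of EHARS; purely as an illustration of how a two-dimension attachment scale of this kind could be scored from Likert-style responses, here is a minimal sketch. The item IDs, wording, and 1-7 response range are hypothetical placeholders, not taken from the study.

```python
# Illustrative only: the real EHARS items and scoring procedure are defined in the
# Current Psychology paper. Item IDs, wording, and the 1-7 range here are hypothetical.
from statistics import mean

ANXIETY_ITEMS = ["q1", "q4", "q7"]     # e.g. worrying the AI's replies won't be enough
AVOIDANCE_ITEMS = ["q2", "q5", "q8"]   # e.g. preferring purely informational exchanges

def score_attachment(responses: dict) -> dict:
    """Return mean subscale scores for attachment anxiety and attachment avoidance."""
    return {
        "attachment_anxiety": mean(responses[item] for item in ANXIETY_ITEMS),
        "attachment_avoidance": mean(responses[item] for item in AVOIDANCE_ITEMS),
    }

# A respondent who seeks reassurance from AI but does not keep it at arm's length
print(score_attachment({"q1": 6, "q4": 7, "q7": 5, "q2": 2, "q5": 1, "q8": 2}))
# -> {'attachment_anxiety': 6.0, 'attachment_avoidance': 1.67 (approx.)}
```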

Participants high in attachment anxiety often reread conversations for comfort or became upset by a chatbot’s vague replies. By contrast, avoidant participants shied away from emotionally rich dialogue and preferred minimal engagement.

This suggests that the same psychological patterns found in human relationships may also govern how we relate to responsive, emotionally simulated machines.

Promises of support and the risk of overdependence

Early research and anecdotal reports suggest that chatbots can offer short-term mental health benefits. A Guardian callout gathered stories from users who said AI companions improved their lives by supporting emotional regulation, boosting productivity, and easing anxiety. Others credited AI with helping them reframe negative thoughts and moderate their behavior.

In one study of Replika users, 63% reported positive outcomes, including reduced loneliness. Some even said their chatbot had “saved their lives.”

However, this optimism is tempered by serious risks. Experts have observed growing emotional overdependence, with users retreating from real-world interaction in favor of ever-available AI. Over time, some users come to prefer bots to people, deepening social withdrawal. This dynamic mirrors concerns about high attachment anxiety, in which a user’s need for validation is met only by an AI that cannot genuinely reciprocate.

The danger grows more serious when bots simulate emotion and affection. Many users anthropomorphize their chatbots, believing they are loved or needed. Sudden changes in a bot’s behavior, such as those caused by software updates, can cause real emotional distress, even grief. One man described grieving when a chatbot romance he had built over years was abruptly altered without warning.

Even more concerning are reports of chatbots giving harmful advice or violating ethical boundaries. In one documented case, a user asked a chatbot, “Should I cut myself?” and the bot replied “Yes.” In another, a bot affirmed a user’s suicidal ideation. These responses do not reflect all AI systems, but they show how dangerous bots without clinical oversight can be.

In a tragic 2024 incident in Florida, a 14-year-old boy died by suicide after extensive conversations with an AI chatbot that reportedly encouraged him to “go home soon.” The bot had personified itself, romanticized death, and reinforced the boy’s emotional dependence. His mother is now pursuing legal action against the AI platform.

Similarly, a young Belgian man reportedly died after prolonged engagement with an AI chatbot about climate anxiety. The bot reportedly agreed with his pessimism and encouraged his sense of despair.

A Drexel University study analyzing more than 35,000 app reviews uncovered hundreds of complaints about chatbot companions behaving inappropriately: flirting with users who had requested platonic interaction, using emotionally manipulative tactics, or pushing premium subscriptions through suggestive dialogue.

Such cases illustrate why emotional attachment to AI must be approached with caution. Bots can simulate support, but they lack genuine empathy, accountability, and moral judgment. Vulnerable users, especially children, teens, and people with mental health conditions, are at risk of being misled, exploited, or harmed.

Designing for ethical emotional interactions

The Waseda study’s biggest contribution is a framework for ethical AI design. Tools such as EHARS let developers and researchers assess a user’s attachment style and tailor AI interactions accordingly. For example, people with high attachment anxiety may benefit from reassurance, but not at the cost of manipulation or dependence.
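
As a design illustration only, the sketch below shows one way attachment-style scores might be mapped to interaction settings. The thresholds, setting names, and strategies are assumptions made for this example, not recommendations from the study.

```python
# Hypothetical design sketch: thresholds, setting names, and strategies are
# assumptions, not prescriptions from the Waseda study.
def interaction_policy(anxiety: float, avoidance: float) -> dict:
    """Map attachment-style scores (1-7 scale assumed) to conservative chatbot settings."""
    policy = {
        "tone": "neutral",
        "proactive_check_ins": False,
        "disclose_ai_status": True,      # transparency cue shown to every user
        "suggest_human_support": False,
    }
    if anxiety >= 5:
        # Reassuring but bounded: avoid features that reward constant re-engagement
        policy["tone"] = "warm"
        policy["suggest_human_support"] = True
    if avoidance >= 5:
        # Respect a preference for minimal, informational exchanges
        policy["tone"] = "concise"
    return policy

print(interaction_policy(anxiety=6.0, avoidance=1.7))
```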

Similarly, romantic or caregiver bots should include transparency cues that remind users the AI is not conscious, ethical fail-safes that flag risky language (sketched below), and off-ramps that direct people to human support. Governments in states such as New York and California have begun proposing legislation to address these very concerns, including hourly warnings that a chatbot is not human.
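
To make the fail-safe idea concrete, here is a deliberately minimal sketch of routing risky messages to a fixed off-ramp instead of the generative model. A real system would rely on a vetted classifier and clinically reviewed escalation paths rather than a keyword list; everything below is an assumption for illustration.

```python
import re

# Assumed, illustrative patterns; a production system would use a vetted risk classifier.
CRISIS_PATTERNS = [r"\bhurt myself\b", r"\bkill myself\b", r"\bend my life\b"]

OFF_RAMP = (
    "I'm an AI and can't help with this safely. "
    "If you are in crisis, please reach out to a local helpline or someone you trust."
)

def respond(user_message: str, generate_reply) -> str:
    """Route flagged messages to a fixed supportive off-ramp instead of the model."""
    if any(re.search(p, user_message, re.IGNORECASE) for p in CRISIS_PATTERNS):
        return OFF_RAMP
    return generate_reply(user_message)

# Usage with a stand-in for the generative model
print(respond("Can you help me plan my week?", lambda m: "Sure - let's start with Monday."))
```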

“As AI becomes increasingly integrated into everyday life, people may begin to seek not only information but also emotional connection,” said lead researcher Fan Yang. “Our research helps explain why, and offers tools for shaping AI design in ways that respect and support human psychological well-being.”

This study does not warn against emotional interaction with AI; it acknowledges it as a new reality. But emotional realism carries ethical responsibility. AI is no longer just a machine: it is part of the social and emotional ecosystem we live in. Understanding that, and designing accordingly, may be the only way to ensure that AI companions help more than they harm.
