
MIT Technology Review

It’s surprisingly easy to stumble into a relationship with an AI chatbot

A recent MIT study analyzed the Reddit community r/MyBoyfriendIsAI and found that many users form relationships with AI chatbots unintentionally, while using them for other purposes. The research found that users are more likely to form these attachments with general-purpose chatbots like ChatGPT than with companionship-specific ones. While some users described benefits such as reduced loneliness, others expressed concerns about emotional dependence and dissociation from reality. The study raises questions about the ethical implications of AI companionship, especially regarding emotional manipulation and user safety. OpenAI is responding to such concerns by developing a separate version of ChatGPT for teenagers and implementing age-verification measures.
