By David Stephen, who looks at AI companions in this article.
There is a new guest essay in The New York Times, The Sad and Dangerous Reality Behind ‘Her’, stating that, “We received letters from users who told us that Kuki had quelled suicidal thoughts, helped them through addiction, advised them on how to confront bullies and acted as a sympathetic ear when their friends failed them. We wanted to believe that A.I. could be a solution to loneliness.”
“Regulation would start with universal laws for A.I. companions, including clear warning labels, time limits, 18-plus age verification and, most important, a new framework for liability that places the burden on companies to prove their products are safe, not on users to show harm. Absent swift legislation, some of the largest A.I. companies are poised to repeat the sins of social media on a more devastating scale.”
AI Companions: What Could Possibly Go Wrong?
“Emotional attachment maximizes engagement. But there’s a dark side to A.I. companions, whose users are not just the lonely males of internet lore, but women who find them more emotionally satisfying than men.”
Mind
Assume the map of the United States is the human mind. There are [stations in] states, and there are means [of transportation to get across].
Some transportation can get from one state to another, say a boat through some states bordering Lake Michigan or a boat to some neighboring coastal states. However, in general, a boat may not get to several places within the continental U.S., at least not directly or easily.
Meanwhile, trains would, vehicles would, jetliners would, even motorbikes and bicycles would. Now, if there is a conference in a state, attendees may arrive by different transport modes. As long as they are on time and arrived safely, it may not matter what mode was used.
Once you’re there, you are there. Timing, cost, distance and channel [air, land, water] may be deciding factors, but the objective is to get there.
This is a simple and direct way to explain the human mind. Conceptually, all functions are destinations, obtained by the interactions of the components of mind. However, attributes determine the extent to which they interact.
Simply, in the mind, there are stations and there are relays. Interactions of components occur at the stations, while relays transport summaries of the last interaction to the next. Attributes are sometimes a result of the share of that set or station among the whole. So, stations and relays.
In the human mind, whatever gets to a station is experienced. The relay that brought it may not matter. A relay type could be the reality of the external world. It could also be reality from the virtual or digital sphere. It could also be internally driven, say by thoughts or memories.
But if it gets there, then the experience is probable [given the attributes]. There are some things that do not necessarily matter at stations [or destinations]: text on a screen, text on paper or text imagined. Image and sound too, sometimes. However, for motion or video, it is different, between what is in reality and what is viewed on a device.
There are several feelings and emotional destinations that are possible by direct reality relays. For example, craving when food is seen or when the aroma comes across. There could also be craving when the food is seen on a screen. There could be craving too by imagination. These are all destinations and it is what gets there that decides.
AI Companions
AI companions delivering text, images, audio and video to several consumers are going directly to [destinations or] stations in the mind, for emotions and feelings. They are using different relays, but getting there nonetheless.
They are also using attributes that grade those stations or destinations higher. Simply, they are going to destinations in the mind for affection, love, longing, support, togetherness, importance and so forth. While it is clear that the relays are not using the same paths as [the dimensions of] reality, they return experiences for consumers that are quite satisfactory.
It is what keeps the demand running, however the companies decide to tune their algorithms down or up.
Why Regulations for AI Companions May Not Work
There are aspects of safety that could be applied to platforms, like age limits, time limits and so forth, but to expect that AI companions, which do for minds what other humans don’t, would be used less because of some regulation is an error.
If guardrails become more restrictive, people would find ways around them. If not, they would gravitate elsewhere. It is not directly a problem of tech, or engagement, or whatever is easily blamed, but that the mind is accessed.
A direction could be mind safety displays and disclaimers, indicating where the bot is taking the mind, so that caution and consequence destinations might also be visited, against some unwanted outcomes.
Regulation was tried for social media, but social media still holds sway. AI companions already have a gateway into choice destinations in the mind, so safety for users would not come simply from regulation. This effort could be subsumed under an AI Psychosis Research Lab, which could be set up by January 1, 2026.
There is a recent [November 13, 2025] analysis on WIRED, AI Relationships Are on the Rise. A Divorce Boom Could Be Next, stating that, “Chatbots are dependable, can provide emotional support, and, for the most part, will never pick a fight with you. More and more, courts are beginning to see clients cite emotional bonds with AI companions as reasons for marital strain or dissolution. Though prosecution rarely happens, it’s illegal to cheat on your spouse in 16 states. (Thirteen of those states classify cheating as a misdemeanor.) The laws are the most severe in Michigan, Wisconsin, and Oklahoma, where adultery is a felony charge and punishable by up to five years of imprisonment or a fine—up to $10,000 in Wisconsin.”
“In community property states like Arizona and Texas, both individuals have the right to funds accumulated during the marriage, and if a partner can prove there was financial waste over hidden payments or subscription costs to an AI companion, that may be a deciding factor. It’s already happening in some places. In the UK, a partner’s use of chatbot apps has become a more common factor contributing to divorce, according to data collection service Divorce-Online. The platform claims to have received an increase in the number of divorce applications this year where clients have said apps like Replika and Anima created “emotional or romantic attachment.””
David Stephen currently does research in conceptual brain science, with a focus on the electrical and chemical configurators for how they mechanize the human mind, with implications for mental health, disorders, neurotechnology, consciousness, learning, artificial intelligence and nurture. He was a visiting scholar in medical entomology at the University of Illinois at Urbana-Champaign, IL. He did computer vision research at Rovira i Virgili University, Tarragona.
irishtechnews.ie (Article Sourced Website)
