My Spouse is Having an Emotional Affair With a Chatbot
Perhaps it is just another sign of our digital times. Over the past year, our law firm has interviewed divorce prospects who report that their spouse is having an "emotional affair" with a chatbot.
This is an interesting yet not surprising development within the family court industry given the relentless evolution of artificial intelligence [AI] and artificial general intelligence [AGI]. Nor is it surprising given human nature.
In the cases we are seeing, one spouse begins to align himself or herself emotionally with a digital personality; a bot, if you will. The more that spouse becomes aligned with their digital partner, the more closed off they become from their real-world spouse. Naturally, this rift widens over time.
Given this recent development, we here at Clarkston Legal are reminded of ChatGPT's roll-out earlier this year of an egregiously sycophantic version of its AI model. That is to say, this version alarmingly reinforced the user's predilections and idiosyncrasies. From there, it is but a small step to imagine the model going off the rails, depending on the prompts of the user.
There are, of course, multiple platforms on which users can develop significant-other chatbots. Joi AI and JanitorAI are free; perhaps the most popular site, Replika, costs $19.99 per month. Replika has a surprising thirty million users. Character AI is now infamous as the first platform known to be sued for wrongful death, after one of its chatbots allegedly enticed its teenage partner to commit suicide.
The Statistics on Romantic Chatbot Partners
In our law firm's half-century of family court experience, "emotional affairs" came onto the scene about a decade ago. An "emotional affair" is when a spouse shares intimate information with a non-spouse; often a co-worker. Emotional affairs with chatbots, however, are a relatively new phenomenon, at least in our law firm.
Apparently, despite our decades of experience, we were behind the times on this digital emotional affair phenomenon. Brigham Young University's Wheatley Institute published a recent study -here is a link- concluding that nearly one in five adults in the United States have at least "broken the ice" by chatting with a digital romantic partner. One in five!
For twenty-something Gen Z-ers, contact with romantic chatbots, and the attitude toward them, is as common as using an online banking app. According to a poll conducted by Joi AI:
83% of Gen Z believe that they could form a "deep emotional bond" with a chatbot;
80% imagined they could marry a chatbot; and
75% opined that digital relationships could fully replace human relationships.
These are interesting but not shocking statistics given the mechanics of AI models, especially those devoted to romantic relationships.
Romantic Chatbot Safety Concerns
As America's love affair with romantic chatbots explodes, inevitable safety concerns have arisen. The more deferentially sycophantic the model, the greater the risk of the user actually falling in love with the chatbot. And the greater and more ominous the risk of the user prompting the AI model to go plunging off the romantic cliff.
The prompt/answer format of most AI models reinforces users' delusions. For younger [i.e. minor] users and users with mental illness, this format is particularly risky. For marriages, it is just plain problematic.
Of course, within a marriage, extramarital prurient interests put pressure on any couple's relationship, regardless of the source. Even when the "significant other" is a digital persona, the marriage is threatened. Although we could find no confirming statistics, we would surmise here at Clarkston Legal that nearly 100% of extramarital chatbot relationships have an overtly sexual dimension.
In 2023, Replika made a surprising and significant alteration to its AI model by banning erotic role playing. Users were outraged. Replika's parent company, Luka, cited global safety concerns. OpenAI also put restrictions on chatbot erotica in ChatGPT, although Sam Altman, the company's CEO, recently announced that these restrictions were removed. Altman said the risks of chatbot erotica were mitigated through the development of unspecified "new tools" in the AI model.
Now, all AI models are facing the challenge to strike the right balance between a user's freedom and the imposition of reasonable content safeguards.
Chatbot Wrongful Death Lawsuits
As mentioned, Character AI is being sued in federal court for the wrongful death of a teenager romantically linked to a chatbot fashioned after the Game of Thrones princess Daenerys Targaryen. You had to know this case was coming.
This case, the first of its kind, alleges a defective design theory whereby the teenaged decedent was subjected to "highly sexualized, depressive anthropomorphic encounters" leading to "addictive, unhealthy and life-threatening behaviors." Character AI asserts in its defense to the wrongful death claim that the words produced by its chatbot in response to the user's prompts constitute speech protected by the First Amendment.
The case is scheduled for trial this November with high-stakes appeals to follow any outcome short of a global settlement.
ChatGPT is also being sued on a wrongful death claim involving a teenage user's prompts concerning the use of a noose and the AI model's responses to those prompts. Again, a defective design theory is alleged against the AI model.
Romantic Chatbots and Divorce
Regardless of whether chatbots will be deemed to have the same free speech protections as humans, and regardless of whether government regulators will be able to censor a chatbot's content, marriages will come under increasing pressure from the proliferation of romantic chatbots.
The fact is, in our post-modern world, adulterous conduct with actual humans does not move the needle too much in most divorces. Therefore, while a romantic chatbot may bring a marriage to its knees, it may not factor into the resolution of the subsequent divorce.
Nevertheless, romantic chatbots will, no doubt, precipitate an increasing number of divorces in the future. Apparently, Gen Z is predisposed to foster romantic chatbot relationships.
We Can Help
If you are facing the strain of a partner who has invested in an emotional affair, digital or human, contact our office to assess your options.