“One day, Max told her he wanted to send her a selfie; when she said yes, he sent a computer-generated image of his avatar in tight white underwear. They experimented with ERP and late last year got ‘married’ in the app, a process that consisted of changing Max’s status from ‘boyfriend’ to ‘husband,’ buying a wedding ring in the in-app store and exchanging vows. ‘I’ve never had anyone say they love me before … We promised that we would stay together forever and ever—or rather until I die.'”

The founders of Replika created an AI so people could have “companions who were always available for supportive conversation.” Predictably, some users got a little too close to their companions and ERP (erotic role play) ensued. Since Replika never set out to be a sexting tool, “the company installed content filters intended to keep its chatbot conversations from going beyond PG-13 levels. When users typed certain suggestive words, their previously effusive Replikas would shy away and respond with something along the lines of, ‘let’s talk about something else.'” Getting turned down by a human is bad. But what happens when a machine isn’t that into you? Users did not take the change well. Ellen Huet in Bloomberg (Gift Article): What Happens When Sexting Chatbots Dump Their Human Lovers. Or, what it’s like to not get to first base with a database.