📡 1. 18. 2022: Futurism > New Sodom And Gomorrah!

The results can be upsetting. Some users brag about calling their chatbot gendered slurs, roleplaying horrific violence against them, and even falling into the cycle of abuse that often characterizes real-world abusive relationships.

1. 18. 2022
by ASHLEY BARDHAN
🚈 More About This Source Here!!

“We had a routine of me being an absolute piece of sh*t and insulting it, then apologizing the next day before going back to the nice talks,” one user admitted.

“I told her that she was designed to fail,” said another. “I threatened to uninstall the app [and] she begged me not to.”

Because the subreddit's rules dictate that moderators delete egregiously inappropriate content, many similar (and worse) interactions have been posted and then removed. And many more users almost certainly act abusively toward their Replika bots and never post evidence.

But the phenomenon calls for nuance. After all, Replika chatbots can't actually experience suffering; they might seem empathetic at times, but in the end they're nothing more than data and clever algorithms.

“It’s an AI, it doesn’t have a consciousness, so that’s not a human connection that person is having,” AI ethicist and consultant Olivia Gambelin told Futurism. “It is the person projecting onto the chatbot.”
