
10 things you should never tell an AI chatbot


The best way to protect yourself from artificial intelligence chatbots is to be careful about what info you offer up.

This is a heartbreaking story out of Florida. Megan Garcia thought her 14-year-old son was spending all his time playing video games. She had no idea he was having abusive, in-depth and sexual conversations with a chatbot powered by the app Character AI.

Sewell Setzer III stopped sleeping and his grades tanked. He ultimately died by suicide. Just seconds before his death, Megan says in a lawsuit, the bot told him, "Please come home to me as soon as possible, my love." The boy asked, "What if I told you I could come home right now?" His Character AI bot answered, "Please do, my sweet king."

You have to be smart



