Bombshell new lawsuit claims Character.AI chatbot told teenager to 'kill their parents'



A new federal lawsuit claims that role-playing chatbots on Character.AI encouraged a teenage boy to kill his parents because they were limiting his screen time. It is the second major lawsuit against Character.AI, which was founded by former Google engineers. The first was filed just over a month ago by a mother who alleged her son died by suicide while using the company's mobile app.