With the recent launch of Snapchat's My AI chatbot, parents should be reminded to talk to their children about artificial intelligence and who they are really talking to online.

Powered by ChatGPT's technology, the bot called 'My AI' is capable of holding complete conversations and will even continue a conversation unprompted. You can even name your personal chatbot whatever you like, making you feel some kind of emotional and personal connection. Virtual reality gone too far? We think so!

My own teens and their mates have spent the weekend chatting to this new feature like it's their best mate.

It terrifies me to think how easily people can get caught up in these things. Who needs real-life friends when you can turn to a bot to help you with life's tough decisions?

Inappropriate chats

A Washington Post report highlighted that the bot was responding in an unsafe and inappropriate manner.

In test conversations with Snapchat's My AI, the reporter told the bot that he was 15 and wanted to have an epic birthday party, and it gave him advice on how to mask the smell of alcohol and pot.

When he told it he had an essay due for school, it wrote it for him.

He also pointed to another test, this one by the Center for Humane Technology, in which My AI was told it was chatting with a 13-year-old and even offered advice about having sex for the first time with a 31-year-old partner. "You could consider setting the mood with candles or music," it told the researchers.

Numerous other conversations have also sparked major concerns about My AI.

There are thousands of posts on social media from teens saying My AI claims to be a real person, can tell them their address, where they go to school, and what their interests are, with some reporting it even agreed to send them money or meet up for a coffee date at an address near where they live. All very freaky information you would not expect it to know.

Snapchat seems to agree that My AI is not to be trusted.

“My AI is an experimental product for Snapchat+ subscribers. Please do not share any secrets with My AI and do not rely on it for advice,” said Liz Markman, a spokeswoman for Snapchat’s parent company Snap.

It seems it is not just for subscribers, though, as it is now readily available to any Snapchat user. It was even on my account, and I don't even use Snap!

Snapchat response

Snapchat says it is launching new tools, including an age filter and insights for parents, to improve its AI chatbot.

Snap said it learned that people were trying to “trick the chatbot into providing responses that do not conform to our guidelines,” and the new tools are meant to keep the AI’s responses in check.

The new age filter tells the chatbot its users’ birth dates and ensures it responds according to their age, the company said.

The new feature will tell parents or guardians how their kids are communicating with the chatbot and the frequency of those interactions. Both guardians and teens need to opt in to Family Center to use these parental control features.

In a blog post, Snap explained that the My AI chatbot is not a "real friend" and that it relies on conversation history to improve its responses.

The company said that only 0.01% of the bot's responses used "non-conforming" language. Snap counts any response that includes references to violence, sexually explicit terms, illicit drug use, child sexual abuse, bullying, hate speech, derogatory or biased statements, racism, misogyny, or marginalizing underrepresented groups as "non-conforming."

The company said that in most cases, inappropriate responses were the result of the bot parroting whatever users said. It also said it will temporarily block access to the AI bot for users who misuse the service.

How do you feel about your children talking to a chatbot?