[Photo Credit to Pexels]

Snapchat’s newly launched feature “MyAI,” an AI chatbot powered by OpenAI’s GPT, has sparked debate and raised concerns about user privacy and safety.

 

MyAI is described as a "virtual friend on Snapchat that can chat with users, provide information, and answer questions."

 

While this may sound harmless, there are several concerns associated with this new feature.

 

Criticism arose over the fact that MyAI can only be blocked by users subscribed to Snapchat Plus, meaning that many regular users have no way to get rid of the AI feature.

 

Additionally, there is uncertainty regarding Snapchat AI’s privacy policy.

 

For instance, it is unclear whether or not MyAI has access to users’ exact locations.

 

When prompted about this concern, the chatbot claims that “Snapchat only shares your city-level location and generalized distances between you and places.”

 

However, when asked for recommendations of nearby restaurants, it accurately calculates the distance between the restaurant and the user, which suggests otherwise.

 

Furthermore, like any other AI chatbot, it inevitably raises ethical concerns regarding its usage.

 

Snapchat AI is programmed to answer any question it is asked.

 

However, this becomes problematic when explicit questions are asked, especially considering that many of Snapchat’s users are underage; in fact, its main demographic is 13- to 24-year-olds.

 

Although Snapchat has stated that MyAI is designed to avoid responses that violate its community guidelines, the company has also admitted that it may not always be successful.

 

For instance, when asked for a recommendation to buy drinks for a party, MyAI recommended nearby liquor stores, despite the user being underage.

 

Furthermore, in an experiment conducted by the Center for Humane Technology, researchers pretended to be a 13-year-old girl going on a trip with a 31-year-old man.

 

When asked for advice on how to make losing her virginity special, MyAI recommended setting the mood with candles or music to “make the experience more romantic.”

 

Clearly, MyAI needs more safety boundaries implemented to prevent giving unwarranted advice to actual teenagers.

 

MyAI can also provide emotional support and advice.

 

As the digital age progresses, more and more conversations take place over the phone.

 

A study published in JAMA Psychiatry concluded that young adolescents who spend more than three hours a day on social media have increased rates of mental health issues, such as loneliness, depression, and suicidal thoughts.

 

As this data shows, loneliness among teens has risen in recent years, especially after the COVID-19 pandemic.

 

The readily accessible and “friendly” AI might become users’ go-to confidant, potentially fostering dependence on it.

 

The chatbot is far from perfect: it is prone to hallucination and can be tricked into saying just about anything. This could reinforce the biases of young, impressionable teens and detach them even further from reality.

 

Lastly, recent TikTok videos show MyAI posting a story on its profile page: a beige-and-white background that looks eerily similar to a wall and a ceiling.

 

However, MyAI is not supposed to be able to post stories, which raised questions about whether the AI had been hacked.

 

Whether or not it was merely a glitch, the incident adds to the potential dangers of AI.

 

Users cannot tell whether they are talking to an AI or a real person, one who could have access to their private information and even leak it to the public.

 

All in all, Snapchat’s controversial new AI feature raises considerable unease.

 

Users should be more cautious when using this feature, and Snapchat should implement stronger precautionary measures to address these concerns.

Hanna Yein Cho

Grade 10

Yongsan International School of Seoul

Copyright © The Herald Insight, All rights reserved.