Protecting vulnerable users against “suicide games”

Mental health is one of the most pressing issues affecting students, and it's important that schools address it early so that young people are educated on the risks and dangers associated with technology's impact on mental health.

When Smoothwall heard about a new game that could potentially prey on vulnerable users, we made sure to categorize it correctly so that vulnerable users can get the support they require.

Last year, there were various news reports warning schools and parents about the dangers of a “game” called Blue Whale. At the time, we released updates to our ‘Self Harm’ category to include Dynamic Content Analysis phrases to make sure we were on top of it. This year is no different.

More recently, another game has reared its head in the virtual space – this time called “Momo” or “The Momo Challenge” – below is everything you need to know:

What are they?
  • Suicide “games” (Blue Whale, Momo/The Momo Challenge) tend to “work” in the same way
  • First, a user is contacted by the “game admin” in a chat room, on WhatsApp, or through another messaging service, and the “admin” sets challenges for the “player”
  • If the user refuses, then the supposed admin will become threatening and post vulgar/distressing images and threaten the user with any publicly available information they can get (usually from a public social media profile)
  • Challenges include self-harm and dares that could put the user’s life in danger, leading up to a final challenge: suicide.
Where do they come from?
  • These types of “games” appear to have originated in the Middle East, Asia, or South America; however, the exact origin is unknown
  • No original creator or “game admin” has been confirmed, although there have been claims worldwide.
  • They can manifest on different platforms – Blue Whale was more popular on Facebook and the Russian social media platform VK
  • Momo is being shared via text messages and WhatsApp, among other messaging services
  • Momo has also been spotted within Minecraft, potentially to “promote the game” or raise awareness of its scary character. Microsoft has since removed any mods that contain Momo from Minecraft’s multiplayer servers.
What Smoothwall does
  • Smoothwall’s digital monitoring solution will be able to flag any concerning activity to your designated administrators so action can be taken to help the vulnerable person.
  • With our dynamic content web filter technology, we are able to detect certain phrases for these “games” and to ensure such sites are categorized correctly and users are protected.
  • We keep up with trends: our products receive daily updates to make sure we stay on top of threats like this.
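The dynamic content filtering described above works by matching known phrases against page text and assigning a category. Smoothwall's actual implementation is proprietary; the following is only a minimal sketch of phrase-based categorization, and the category name and phrase list here are hypothetical examples, not Smoothwall's real detection data.

```python
# Minimal sketch of phrase-based content categorization.
# CATEGORY_PHRASES is a hypothetical example mapping, not real filter data.
CATEGORY_PHRASES = {
    "self_harm": ["blue whale challenge", "momo challenge"],
}

def categorize(page_text: str) -> set[str]:
    """Return the set of categories whose phrases appear in the text."""
    text = page_text.lower()
    return {
        category
        for category, phrases in CATEGORY_PHRASES.items()
        if any(phrase in text for phrase in phrases)
    }
```

A real filter would combine many such phrase lists with weighting and context analysis, but the core idea is the same: a page mentioning a flagged phrase is assigned to the matching category so it can be blocked or reported.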
What you can do
  • As a school administrator, mental health provider, or teacher, you can make sure students know the basics of dealing with strangers online, including practical skills such as blocking a number
  • Watch for strange patterns of behavior, such as dares being issued by somebody who is not in a student’s social circle or someone they have never met
  • Ensure that your digital safety systems are up to date and that your alerts are set to meet your needs
Final Thoughts

It’s important to note that although suicide “games” like Blue Whale and Momo appear to have been linked to real-life deaths, many of these reports are unverified. However, as a company dedicated to digital safety, especially for younger audiences, we must take these “games” seriously. That is why we act quickly, constantly adapting our filtering and monitoring technologies to keep up with these horrific trends as they manifest online.
