Social media algorithms promote dangerous behavior to teens

by Jacob Fuller

Lauren Dempsey, MS in Biomedicine and Law, RN, FISM News 

 

New research has highlighted that social media algorithms are promoting dangerous online challenges to teens that put them at risk for serious injury or even death.

Fairplay, a nonprofit advocacy group, released a report last month evaluating content on the social media platforms Instagram, TikTok, and YouTube. The report, titled “Dared by the Algorithm: Dangerous challenges are just a click away,” looked at two “risky challenges” being promoted to teens on social media platforms despite violating each company’s community guidelines.

Fairplay created a social media profile posing as a 14-year-old boy and searched for the terms “car surfing” and “train surfing,” just two of the many dangerous challenges that the platforms encourage teens to take part in.

TikTok and other social media platforms are well known for viral challenges, in which users film themselves performing a dare or recreating an activity that has gone viral on the platform. Challenges went mainstream when millions of people participated in the Ice Bucket Challenge, filming themselves having a bucket of ice water poured over them to raise awareness and money for ALS research.

Many challenges are not so innocuous, however; children have died participating in video trends that put them in harm’s way.

In the trends analyzed by Fairplay, people are encouraged to stand or ride on top of a moving car, or to ride on the outside of a moving train. While the report focused on these two dangerous challenges, others on the platforms include the “Benadryl Challenge,” in which people are encouraged to take enough of the medication to induce hallucinations and then post videos of themselves online.

Experts are also warning that the “Skull-Breaker Challenge” could be fatal. Videos show three friends jumping next to each other as the two on the outside kick the middle person’s feet out from under them, causing that person to fall backward, land on their back, and hit their head in the process. The challenge has led to numerous injuries across the United States. In Daytona Beach, Florida, police charged two high school teens with misdemeanor battery and cyberbullying following an incident involving a learning-disabled student.

Rather than carrying a safety warning, each of these challenges was suggested and promoted to the user by the social media outlets’ algorithms.

These algorithms work by scoring and sorting social media posts to prioritize what content a user sees in their feed. They allow social media platforms to predict what kind of content a user wants to see based on videos or pictures the user has previously watched or “liked.”
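The scoring-and-sorting idea can be illustrated with a toy sketch. This is a deliberately simplified, hypothetical example of engagement-based ranking (the tag names and data structures are invented for illustration); real platform recommendation systems are far more complex and proprietary.

```python
from collections import Counter

def recommend(candidates, watch_history, top_n=3):
    """Rank candidate posts by how much their topic tags overlap with
    topics the user has already engaged with. A toy stand-in for
    engagement-based ranking, not any platform's actual algorithm."""
    # Count how often each topic tag appears in the user's history.
    interest = Counter(tag for post in watch_history for tag in post["tags"])
    # Score each candidate by summing the user's interest in its tags.
    def score(post):
        return sum(interest[tag] for tag in post["tags"])
    return sorted(candidates, key=score, reverse=True)[:top_n]

# Hypothetical user who has watched two stunt videos.
history = [
    {"id": "v1", "tags": ["stunts", "cars"]},
    {"id": "v2", "tags": ["stunts"]},
]
candidates = [
    {"id": "a", "tags": ["cooking"]},
    {"id": "b", "tags": ["stunts", "cars"]},  # matches past engagement
    {"id": "c", "tags": ["music"]},
]
print([p["id"] for p in recommend(candidates, history)])
```

Under this kind of scoring, the stunt video ranks first precisely because it resembles what the user already watched, which is why risky content, once engaged with, tends to be recommended again.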

Across all three of the platforms evaluated in the study, these types of risky behaviors or challenges were recommended, even though they violated the platforms’ terms of service and guidelines.

TikTok’s community guidelines explicitly state that the platform does not “permit users to share content depicting, promoting, normalizing, or glorifying dangerous acts that may lead to serious injury or death.” The platform defines dangerous behavior as “activities conducted in a non-professional context or without the necessary skills and safety precautions that may lead to serious injury or death for the user or the public. This includes amateur stunts or dangerous challenges.”

Yet, this content is not restricted or removed and can easily be found through a quick search.

Platform creators know that the dangers teens face online are real: cyberbullying, exposure to pornography, and sexting, as well as the increased depression, anxiety, and suicidal thoughts linked to social media use.

The Kids Online Safety Act (KOSA) is aimed at trying to mitigate some of this harm, especially when it comes to risky behavior and challenges. If this legislation were to become law, it would require social media companies to “act in children’s best interest,” as well as make “dangerous challenges easier to avoid by allowing minors to opt out of algorithms that recommend them.”

The bill’s co-authors, U.S. Senators Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.), said that they grew tired of hearing “countless stories of physical and emotional damage affecting young users, and Big Tech’s unwillingness to change.” Blackburn added that the legislation “will address those harms by setting necessary safety guide rails for online platforms to follow that will require transparency and give parents more peace of mind.”

Currently, only 12 senators have signed on to support the bipartisan bill.