Will Banning TikTok Make Kids Safer Online? It’s More Complicated Than That

Ask just about anyone what’s behind the downward spiral of youth mental health today, and chances are that social media will be on their list of causes.

While it’s true that young people are increasingly struggling with mental health issues at the same time social media usage is ballooning, today’s available research simply hasn’t found one of those to be the driving force behind the other — in sum, correlation does not equal causation.

That’s one of the findings of a committee tasked by the National Academies of Sciences, Engineering, and Medicine with examining social media and its impact on children’s health and well-being. The committee’s roughly 250-page report also made recommendations for governmental policies and future research on the topic.

The relationship between social media and mental health is nuanced and different for each person, says Stephanie M. Reich, a professor of education at the University of California, Irvine School of Education. Current research is largely limited to estimates of how many children and adolescents use various social media platforms and for how long.

The amount of screen time kids get is a common concern, Reich explains, but she argues it isn’t necessarily a bad thing, since some kids reach for a device to find social support — as many LGBTQ+ teens do — or to escape conflict at home.

“I’m not saying that screen time is not important, but it’s not nuanced enough to really understand mechanisms of change, benefit, or harm,” Reich says. “And so what we found in synthesizing all the research out there is that there are not really great metrics of what kids are doing, with whom, and why.”

While the U.S. House of Representatives recently passed a bill that would ban the popular social media platform TikTok — albeit out of concern over China’s access to data — states like Oklahoma and Florida are considering laws that would tighten age restrictions for social media users.

But the committee report says that keeping kids off social media isn’t going to solve any problems.

“The unique vulnerability of young people to toxic content or misinformation is clear, but, in the committee’s assessment, broad restrictions to their online access are neither practical nor desirable,” the committee wrote. “It is therefore necessary to create both an online environment that protects young people and social media consumers who are empowered to protect themselves.”

Media Literacy Education

Many students start using social media when they’re in elementary school, Reich says, before they’re typically presented with school-based education on digital media literacy.

While social media platforms theoretically limit users from creating an account until they’re 13, kids can bypass that by simply lying about their birth year during the sign-up process.

That age-13 threshold isn’t based on developmental research, Reich’s area of expertise; it was set by the lawmakers who created the Children’s Online Privacy Protection Act.

“In fact, one might argue that 13 is probably one of the more vulnerable ages to release all restrictions or oversight,” she says. “As these spaces have unfolded, they’re not like you’re online or offline. It’s just your life. It’s part of the context of childhood and adolescence now.”

Whether it’s called media literacy, digital citizenship or something else, the type of education that helps students safely navigate life online varies from district to district, according to the committee report, and it’s up to state boards of education to ensure that the curriculum is consistent.

“Our report doesn’t say exactly what needs to be in the content, but it’s clear that there needs to be a focus in this area,” Reich says, “and they have to have more of a prevention and capacity-building component rather than just an intervention later.”

Not only that, any policy directives have to come with funding and support, the committee urges. Teachers who deliver digital literacy education also need more training to keep up with the ever-changing technology — like the major developments that emerged while the report was being completed — that is part of their students’ lives.

GPT-4, Google’s Gemini AI, and new apps that make deepfakes easier to create all came out before the committee’s report was released in December 2023.

“In less than one year, technology’s already changed that much in ways that are highly important for kids to understand. So our push was not about, ‘Watch out for social media and mental health,’” Reich says. “It was really about having an educational system that was going to help children understand these online spaces, like how they work. If you understand algorithms, you can understand more about push content or persuasive design or the ‘stickiness’ of social media.”

Digital Design for Kids

When children use social media platforms, there’s a host of things that can impact their experiences, according to the committee report. Algorithms designed to keep users on the app can pack their feeds with sensational content, publicly tally the “likes” and shares of users’ posts, or turn the experience into a game with “badges.” The more time users spend on a platform, the more money a social media company stands to make from advertisements.
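To make that dynamic concrete, here is a minimal, hypothetical sketch of engagement-based feed ranking. The `Post` fields, scoring weights, and sample posts below are illustrative assumptions for this article, not any platform’s actual algorithm; the point is only that a feed sorted by predicted engagement will favor whatever keeps users watching and reacting.

```python
# Hypothetical illustration of engagement-based feed ranking.
# The weights, field names, and sample posts are invented for clarity;
# this is not any real platform's algorithm.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_seconds: float  # model's guess at how long a user will linger
    predicted_like_prob: float      # model's guess at the chance of a "like"
    predicted_share_prob: float     # model's guess at the chance of a share

def engagement_score(post: Post) -> float:
    """Combine predicted signals into one number; higher means more 'sticky'."""
    return (0.5 * post.predicted_watch_seconds
            + 20.0 * post.predicted_like_prob
            + 40.0 * post.predicted_share_prob)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed by predicted engagement instead of by when posts appeared."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("calm how-to video", 30.0, 0.10, 0.01),
        Post("sensational clip", 55.0, 0.30, 0.20),
    ])
    for post in feed:
        print(f"{post.title}: score={engagement_score(post):.1f}")
```

Because the score rewards predicted watch time, likes, and shares rather than recency, sensational content tends to rise to the top, which is the same pull the report ties to advertising revenue.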

This seeming competition for attention can be particularly hard for adolescent users to turn away from.

“Heightened sensitivity to rewards can make the necessary task of disengaging from social media difficult for adolescents, while the desire for independence can make digital spaces especially appealing,” the report committee writes, “allowing teenagers room to make connections and signal their identity without the same parental scrutiny that their in-person interactions might draw.”

The committee’s report outlines how social media companies can adopt “age-appropriate design,” which includes gathering only necessary data from young users. It also means shielding them from “persuasive design” features meant to keep users online longer or entice them to spend money.

While the experience of social media will differ depending on the child — a euphoric teen may engage with their online world differently than a teen with depression, Reich points out — researchers simply don’t have access to platform data that would allow them to dig deeper into how social media impacts young people.

But companies keep a tight rein on their data, making it hard for outsiders to judge whether they’re making a meaningful effort to protect children and adolescents from what the report calls “habit-forming” features on a platform.

“Allowing researchers and civil society watchdogs access to social media data and review of their algorithms would allow for a better understanding of how social media platforms influence young people for better or worse,” according to the report.

The report recommends that the International Organization for Standardization host a working group of experts to standardize how apps are developed based on users’ ages, “with an emphasis on protecting their privacy.” The same group could also find a way for social media companies to safely share data that researchers could use to find more concrete links between social media use and health.

“There’s times where individuals have tried to give their own data to researchers, and companies have sued saying that it’s a violation of the terms of use,” Reich says. “But researchers have to see beyond the curtain if we’re really gonna understand what’s going on. It’s an interesting space in that you have a product [available] to the populace, and especially to minors, that doesn’t have a lot of oversight and monitoring or understanding.”
