This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).
Parents and school districts are increasingly filing lawsuits against social media companies, alleging harm to young people's mental health.
Colorado school districts are the most recent to join a national lawsuit with more than 200 plaintiffs against the parent companies of Facebook, TikTok, Snapchat and YouTube. The suits have been consolidated in the U.S. District Court in Oakland, Calif. The schools claim, in part, that students' time on social media has negatively affected their behavior and severely limited their attention spans in class.
Denver Public Schools board president Xóchitl Gaytán told Fox News Digital that the board voted unanimously to join the lawsuit partly because of how social media was seemingly hurting both students and teachers.
"The research has shown that this constant and frequent use of social media leads to mental health issues within our young students, and those mental health issues can be like severe anxiety, depression, suicide," Gaytán told Fox News Digital. "So there's the mental health harm… And then another aspect that I was looking at is the very constant and almost abusive use of cell phones… There's the use of that in classrooms and how it cuts into classroom instruction time. It's really affecting teachers."
Social media companies, she added, are "making it worse for working families" with their "destructive and manipulative algorithms," though she noted that was her own personal assessment.
"As educators we want our students to have a healthy, productive, and worry-free school environment," Dr. Dave Baugh, Aspen School District superintendent, said in a statement shared with Fox News Digital. "We believe social media has become destructive and is having a dangerous effect on our students. We are seeing increased levels of anxiety, depression and loneliness. It's causing ongoing distress and that's not good for students at home or at school."
"We are seeking monetary relief from each company. We believe these social media companies are willfully creating a public nuisance and operating with negligence," Baugh said of the school district's decision to join the national lawsuit. He added that the district has hired more counselors at each of their schools to deal with the epidemic and that the leadership team is considering a total ban of smartphones for students.
Digital safety expert Titania Jordan said her concerns as a mom drove her headfirst into a career helping keep kids safe in the digital age.
"I'm a mom of a 14-year-old who has made so many mistakes," Jordan told Fox News Digital. "And based on what I've learned in trying to parent in a tech world, in an age of social media and smartphones, I want to help prevent other parents and other children from the pain and the harm that has been caused from within my own family and with hundreds of thousands of families across the nation."
Jordan is the Chief Marketing Officer and Chief Parent Officer of Bark Technologies, a company dedicated to keeping kids safe online and in real life. At Bark, which launched in 2015, Jordan said she and her colleagues help protect around 7 million children across the country by using artificial intelligence to monitor their devices. When the algorithm detects a potential problem, be it predators, strangers, suicidal ideation or bullying, Bark alerts parents and caregivers to the issue and offers recommended next steps.
"It's exponentially harder for parents to keep their kids safe," Jordan said. "Before tech, we had to worry about seat belts, and sunscreen and locking our doors, and ‘stranger danger’ conversations - the basics. All of a sudden, our children are spending upwards of eight hours a day with connected tech that lets them access the entire world, lets the entire world access them. That's terrifying. You know you, as a parent, wouldn't go drop your 6-year-old off at an international airport, or even the local mall by themselves. But we're doing that with smartphones and gaming consoles, and even family iPads that we think are safe, and they're not."
In addition to the unprecedented access children now have to the internet, she noted that the threat is largely invisible to parents.
"Parents are at a major disadvantage right now," she said. Yet, over 1,000 families have now filed lawsuits against social media.
"Parents are waking up," Jordan said. "Cars didn't use to have seatbelts until enough people were harmed and there needed to be regulation. And cigarettes used to not have a warning label. It's no different with social media."
In their motion to dismiss the case, Facebook parent Meta Platforms, Snapchat owner Snap, Google parent Alphabet and TikTok owner ByteDance argued they are shielded from the plaintiffs' claims by Section 230, which says that internet companies generally aren't liable for third-party content on their sites. Jordan blasted Section 230 for providing "immunity" to Big Tech.
"And that's a problem when it comes to children being harmed, children losing their lives, children having eating disorders and being contacted by predators and even kidnapped by strangers," Jordan said. "The more lawsuits that pop up, the more quickly change can happen."
In a CBS News report about the families taking social media companies to court, attorney Matt Bergman is quoted as saying some social media products "are explicitly designed to evade parental authority." Jordan said she "100%" agrees with the statement, sharing a jarring anecdote about how her son was able to opt out of parental controls on TikTok.
"It infuriates me when I see a major platform release parental controls, a family safety center, and then I dig in and see what it actually entails, and it's smoke and mirrors," Jordan said. "And that's a nice way to put it. For example, TikTok has family sharing. And I got really excited when I heard that because my son has TikTok, and I have TikTok, and I was like, ‘Cool, let's set this up. Let's look this up, so I can make sure you can have a safer interaction with this platform that is fun and is engaging, but also has a lot of bad aspects to it, including a powerful algorithm.' So when I went through the process of setting it up, and we looked at the user interface on his end, he was able to unlink it. He was able to tap at the top right, click three dots and say, I would like to disconnect my account from my mother's account.'"
"He's 14," she continued. "He is not an adult yet. I put in these safety measures, I ‘trusted' TikTok for a moment, thinking, ‘Wow this is great.’ Not at all. It told me that within 48 hours, my son's account would be unlinked from mine. It didn't ask for a password, it didn't ask for my permission. It just told me that this was going to happen. That is unacceptable. And so many parents think that these platforms care."
A TikTok press release on World Mental Health Day Tuesday announced that the app was giving users "greater access" to information about mental health.
"Starting from today, people in the United States will begin to have greater access to information about mental health directly from the TikTok app. When people search for terms linked to conditions or illnesses such as depression, anxiety, autism or trauma, among others, they will be directed to information provided by the National Institute of Mental Health and the Cleveland Clinic," according to the release.
Other social media companies targeted by parents and schools pushed back on allegations they are turning a blind eye to online dangers.
"Protecting kids across our platforms has always been core to our work," Google spokesperson José Castañeda said in a statement shared with Fox News Digital. "In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls. The allegations in these complaints are simply not true."
"Snapchat was designed differently from other social media platforms because nothing is more important to us than the well-being of our community," a Snapchat spokesperson said in a statement to Fox Digital. "Our app opens directly to a camera rather than a feed of content that encourages passive scrolling and is primarily used to help real friends communicate. We aren't an app that encourages perfection or popularity, and we vet all content before it can reach a large audience, which helps protect against the promotion and discovery of potentially harmful material. While we will always have more work to do, we feel good about the role Snapchat plays in helping friends feel connected, informed, happy, and prepared as they face the many challenges of adolescence."
A Meta spokeswoman recently told the media that the company wants "to work with schools and academic experts to better understand these issues and how social media can provide teens with support when they need it."
"We want teens to be safe online," Antigone Davis, Vice President, Global Head of Safety for Meta, said in an earlier statement. "We've developed more than 30 tools to support teens and families, including supervision tools that let parents limit the amount of time their teens spend on Instagram, and age verification technology that helps teens have age-appropriate experiences. We automatically set teens' accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks. We don't allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it's reported to us. We'll continue to work closely with experts, policymakers and parents on these important issues."
Fox News Digital has reached out to TikTok and Meta for further comment.
Jordan said she understands these are businesses that need to make a profit, but she suggested a simple solution would be to work with third-party platforms so that parents can have actual insight into what's happening with their kids.
"Be patient and don't give up," Jordan said to parents. "You need to have candid conversations with your children at a much younger age than you might think and at a more frequent rate."
"Please do not give your child unmonitored, unfettered access to the internet," she offered as a final warning. "Use screentime controls and parental controls that come built in to whatever tech your child uses and consider using third party monitoring solutions like Bark that will alert you when your child is in danger so that you can take action before it's too late and tragedy strikes."
"Tech is not the problem," Jordan said. "It's the unregulation of it."