Part I: Tell Your Parents…Always.
Before the (temporary) ban on January 19, the platform was in trouble for another reason: allegedly causing mental health issues in teenagers. And it brought states, attorneys general, and local governments together.
Texas vs. TikTok
Just as we were getting to know what our World Wide Web was capable of, Congress passed the Children’s Online Privacy Protection Act (COPPA) in 1998, then the Children’s Internet Protection Act (CIPA) in 2000. Both laws were put in place to help prevent mistreatment of minors online and give parents more control over what their kids were getting up to. However, as we moved into a digital society and more social media apps became prevalent, those protections got an upgrade in 2023.
Texas became the first state to enact a law preventing harmful content from being shown to children: the Securing Children Online through Parental Empowerment (SCOPE) Act. Effective September 1, 2024, SCOPE required, among other things, that:
- Service providers register the age of anyone creating a digital service account, to prevent age-altering later on.
- Services limit their use of minors’ personal information and not share or disclose it.
- Services develop and carry out plans to keep content deemed harmful away from underage persons.
Which is where TikTok’s issues began. The app curates content based on the videos a user watches. So, let’s say you are a big fan of true crime and videos about missing people. The algorithm takes that information, and now you are watching potential theories on why Maura Murray disappeared in 2004. But as you look up more videos about the case, it is easy to fall into a rabbit hole of conspiracy theories and case details, which could be seen as a pitfall of this kind of content curation.
At least, Texas Attorney General Ken Paxton thought so. In October 2024, he sued the platform for multiple alleged violations of the SCOPE Act, including sharing the names and usernames of private accounts with third parties and, overall, not giving parents enough tools to monitor and limit the content their children watch (more on this later…). The lawsuit asked the court to charge TikTok a civil penalty of $10,000 for each violation.
Some of these motions could be seen as made in good faith, namely calling out the sharing of certain information without consent. Parents are aware of creeps pretending to be teenagers, and of stories about college students staying up all night scrolling through an endless For You page abyss, so it is understandable that they want to check on their kids and on how their apps affect their mental health. But the suit fails to account for parents who will use that control to limit their children’s knowledge of the outside world or, even worse, abuse their children over the media they consume. It also offered little to no protection for the privacy of minors who found communities online when they didn’t have them at home.
Paxton doubled down in January 2025 when he filed a second suit. This time, the heavily redacted filing alleged harmful content was marketed to children as young as thirteen. This suit goes into much more detail on content distribution and even lists examples (we’ll give the PG version, just in case!). Here are a few:
- A man in a fast food corporation hat using colorful language to describe light ice aficionados.
- A woman dancing to a song about…the other name for a rooster.
- Another woman lip-syncing to a song about getting run over by lots of roosters.
- A woman lipsyncing about taking lots of white powdered sugar…roosters…well, you get the idea.
The case file lists more examples, but essentially, the suit alleged TikTok knew about multiple violations of its own policies and did nothing about them. As a result, underage users were exposed to “psychologically damaging” content, and, once again, Paxton demanded the company pay for its actions.
The seeds for litigation were sown.
New York, New Me
In October 2024, New York Attorney General Letitia James launched a suit of her own against the platform for promoting and causing mental health issues like depression in minors. Interestingly enough, James’s filing agrees with Paxton’s that TikTok promotes inappropriate content to underage audiences. However, her suit focuses more on the app’s effects on mental health and the addictive behaviors it allegedly causes.
For example, RETOUCH (formerly Beauty Mode) was a filter that artificially remodeled a user’s face to make them look more attractive. Its success prompted the creation of Bold Glamour, which added more pronounced enhancements. The suit argued these filters caused low self-esteem, disorders, dysmorphia, and a deep resentment of personal flaws.
Looking back on the icks of middle and high school, we remember the days when we looked up celebrities and wished for their hair, lips, hips, etc. Now that such content is accessible 24/7, obsessing over imperfections is, unfortunately, easier. But influencers have also found audiences by sharing their own insecurities and encouraging viewers to embrace being unapologetically unique. Personal care brand Dove even responded with #TurnYourBack, a campaign that rejects the use of filters altogether.
Then there was the promotion of allegedly dangerous challenges. The suit cited a 2023 incident in which a 15-year-old boy from Manhattan was killed attempting to subway surf; his mother later found videos on the same topic on his TikTok For You page. In another example, it pointed to the Kia thefts, where videos showed someone hacking the models’ ignitions and prompted others to attempt the same.
Once again, there are good challenges, many of them raising awareness for medical issues or encouraging positive goals like living healthier and overcoming hardships. However, certain life-altering (or life-ending) challenges need to be addressed, especially when they are attempted in the name of going viral or clout-chasing.
Overall, the suit alleged TikTok purposefully took advantage of minors, whose still-developing brains make them more susceptible to endless scrolling, increased depressive episodes, sleep deprivation, and a loss of interest in beneficial activities like exercising and meeting friends outside of their apps. Like the Texas suit, James’s demanded TikTok pay penalties of $5,000 per violation. In addition, it asked the app to disgorge any income made from ads targeted using minors’ personal information.
I Know What’s Wrong With Me…Maybe?
Finally, as the coronavirus raged and countries locked down, TikTok skyrocketed in popularity, and many trends trended with it, including self-diagnosis. A December 2022 study from the University of Virginia found that searches for symptoms of attention deficit hyperactivity disorder (ADHD), borderline personality disorder (BPD), and obsessive-compulsive disorder (OCD) increased among teens and young adults.
There are multiple pitfalls to self-diagnosis, of course. Although the internet is a vast expanse of knowledge, sources vary, and some carry loads of misinformation and create unnecessary stress. Self-help without a trained professional can also lead to incorrect diagnoses, and overusing medical terms to describe everyday emotional responses (underperformance at school, loss of loved ones, setbacks at a job) can turn a smaller issue into something more confusing and severe. But many professionals have also turned to the app to spread awareness and validate symptoms, which helps communities come together and discuss experiences others can relate to.
Conclusion
Sometimes, we are what we consume. Social media overuse is never good for anyone. And, with better awareness and media literacy aimed at the younger, technologically savvy generation, many are trying to put down their phones for a while. At least until the ban is over.