California Lawmakers: Social Media Corporations On The Hook for Children’s Mental Health by Sophia Archos

A recent bill passed in California could hold social media corporations (namely, TikTok, Instagram, Facebook, Snapchat, YouTube, and the like) accountable for the mental and physical health of the children who are active on their platforms. The bill, the California Age-Appropriate Design Code Act (the “Act”), requires that apps be child-friendly by default. Currently, by contrast, social media users must manually opt out of potentially harmful settings, such as precise geolocation tracking, the collection of personal information that can be sold to third parties, and the ability of unconnected adults to privately message adolescents through their accounts. This means that social media platforms must now consider the mental and physical health of adolescent users, young female users being a primary target audience, when designing their apps. Under the Act, app designers are responsible for:

  • Configuring all default privacy settings on the app to the highest level of privacy;
  • Disclosing privacy policies, terms of service, and community standards in simple terms that adolescent users can comprehend;
  • Completing Data Protection Impact Assessments for products or features likely to be used by adolescent users;
  • Submitting assessments for approval before the products or features are launched on the app; and
  • Abstaining from using personal information of children for purposes other than those expressly permitted by the user.

The costs of non-compliance are severe. Violators of the Act could:

  • Face injunctions on their products;
  • Be fined up to $2,500 per affected child for each violation; and/or
  • Be fined up to $7,500 per child if the violation was intentional.

If signed into law by Governor Newsom, the bill would go into effect in July 2024. Although it is unclear whether the Governor will sign or veto the bill, it is clear that California legislators are taking seriously the health and safety risks that social media poses to children. For one thing, eight of the ten most frequented social media sites in the world are headquartered in the state, which suggests that California has not only the opportunity but the obligation to regulate corporations with a profound global impact. For another, social media connectivity is inevitable in the digital age, where iPads are integrated into elementary school curriculums and educational apps are used to practice math, reading, foreign languages, and the like. It is therefore time for California to set the standard for other states to pass regulations that protect social media users, especially children, who cannot be expected to understand the risks or repercussions of giving away private information by “accepting the terms and conditions.”

So far, California is doing just that. Earlier this year, for instance, the California State Assembly proposed the Social Media Platform Duty to Children Act, which would have held social media companies liable for harm to children who become addicted to their apps and, in effect, would have allowed parents to sue for up to $25,000 per violation. While the bill did not pass in the Senate, the speed at which these measures are being proposed is undoubtedly encouraged by the growing number of scientific studies and academic publications documenting the damaging effects of social media on the developing minds of adolescents.

Most of these studies focus on Gen Z and the prevalence among that demographic of mood disorders such as anxiety and depression, as well as self-harm. For example, one study found that from 2010 to 2014, the rate of hospital admissions for self-harm doubled among girls ages 10 to 14; the same study showed that the rate did not increase among boys, young men, or women in their early twenties. Other studies have demonstrated that this is not an isolated or “American” issue, despite what some critics say. In 2017, for instance, British researchers asked 1,500 young people (aged 14 to 24) to rate social media platforms according to how they affected 14 “well-being measures,” including anxiety, loneliness, body image, FOMO, bullying, and sleep. The results ranked Instagram as the most harmful, followed by Snapchat, Facebook, and then Twitter; YouTube was viewed as the most positive. A separate experiment reinforced these results: researchers instructed young women to use Instagram, use Facebook, or play a matching game for several minutes, and found that those who used Instagram showed decreased body satisfaction to a greater degree than those who used Facebook.

Ultimately, the prevailing evidence that social media negatively affects mental health, and often physical health as well, supports the view that legislation like the Act is just as much healthcare legislation as it is consumer protection. Moreover, such legislation extends at the state level what the Children’s Online Privacy Protection Act (“COPPA”) has done at the federal level: protect the privacy and personal information of children under the age of 13 on online platforms. There is also a growing overlap between social media data collection and violations of the Health Insurance Portability and Accountability Act (“HIPAA”). Recently, Meta (Facebook’s parent company) allegedly violated HIPAA by enabling a tracking tool that sent sensitive health information, including patient health conditions, doctor appointments, and medication allergies, from U.S. hospitals to Facebook. As a result, two class action lawsuits were filed in California earlier this year.

Based on the foregoing, California is on the front lines of a data privacy war against major social media corporations. Many of the corporations that oppose the bill argue that complying with differing state laws regulating their apps would be difficult. Rather than let social media corporations maintain a status quo that has wreaked havoc on the well-being of young people worldwide, however, I propose other solutions:

  • Each state could pass its own legislation on adolescent social media usage; nationwide social media corporations would then have to comply with the strictest state’s law, which would function as a de facto statutory floor;
  • Plaintiffs could bring claims under the California Age-Appropriate Design Code Act, or other similar acts, and litigate them up to the Supreme Court, which could potentially establish a nationwide precedent on the issue; and/or
  • More intensive acts than COPPA could be proposed in Congress to regulate standards for, e.g., condensed and simplified terms of service, geolocation tracking of children, and the remedies made available to adolescent users and their parents, such as invalidating consumer contracts of adhesion that contain binding arbitration clauses.

What is clear is that the health needs of children using social media are not currently being met, and it is time for our legislative representatives to take action.
