Efforts to Gate What and When Youth Can Access Social Media Face Court Rulings
Teen use of social media is prevalent, and some teens say their use is nearly constant, amid growing concerns about its lasting influence. Efforts at the federal and state levels to gate what young people can access online have had mixed results, and even successful measures have been tied up in court.
The 2023 Oregon Legislature considered three measures, all of which failed to pass. Washington lawmakers advanced a bill to place existing web-based media literacy resources in schools.
There is little debate that being routinely on social media can contribute to mental health stress. A study by the Centers for Disease Control and Prevention found 42 percent of high school students surveyed said they experienced persistent feelings of sadness or hopelessness. According to the study, 22 percent of high school students seriously contemplated suicide, with one in four young women going as far as to formulate a plan on how they would carry it out.
“This is a reality that we don’t have to accept,” says Senator Chris Murphy, D-Connecticut, one of four bipartisan sponsors of the Protecting Kids on Social Media Act. “The alarm bells about social media’s devastating impact on kids have been sounding for a long time, and yet time and time again, these companies have proven they care more about profit than preventing the well-documented harm they cause.”
The U.S. Surgeon General has weighed in with a report calling for minimum age requirements and limited access to social media “for all children.” Data indicates 90 percent of teens between 13 and 17 have used social media. More than half say they access social media daily.
There is consensus on the problem, but no consensus on how to combat it.
Questions About Best Approach
However, even advocates of doing something about teens and social media aren’t sold on some of the legislative proposals, including ones that require parental consent before young people can access social media.
Advocacy groups including Common Sense Media, Fairplay for Kids and the Center for Digital Democracy say legislative proposals may be well-intentioned but are unrealistic and even potentially dangerous.
“By requiring parental consent before a teen can use a social media platform, vulnerable minors, including LGBTQ+ kids and kids who live in unsupportive households, may be cut off from access to needed resources and community,” the groups said in a joint statement.
Some groups such as Common Sense Media question whether proposals violate the privacy rights of young people. “This is a life or death issue for families and we have to be very careful how to protect kids online,” says James P. Steyer, CEO of Common Sense Media. He prefers putting the burden on social media companies to ensure a safe space on the internet for young people rather than making government regulation the middleman between parents and children.
California’s Online Privacy Bill
State lawmaking has progressed further, only for courts to block implementation. The California Assembly approved a bipartisan children’s online privacy bill aimed at regulating access to social media and online video games. Hailed as a potential national model, the bill requires social media and video game developers to design their platforms with children’s “well-being” in mind and bars eight common data-collection practices.
However, companies including Meta and TikTok convinced a federal judge the measure may violate the First Amendment. Bills in Arkansas and Texas that require parental control also have been held up in court.
The legal battle over the California statute may be the bellwether for how to structure protective measures. Backers of the California bill insist it doesn’t restrict access to content but instead requires social media platforms to apply the highest privacy settings for young children. California Governor Gavin Newsom signed the bill, hailing it as “aggressive action to protect the health and well-being of our kids.”
NetChoice, a technology trade group, challenged the California law, arguing that it would require owners of online platforms “to sanitize the internet on behalf of young people” and thus may violate First Amendment freedom of speech.
Oregon lawmakers looked at bills to impose age verification for access to pornographic sites (SB 257), require online platforms to identify, evaluate and mitigate risks to children from product features (SB 196), and require the Oregon Health Authority to study the effects of social media and cellphone use on young people (HB 3071).
Utah passed a measure amending the state’s existing Social Media Regulation Act to prohibit a social media company from using a design or feature that causes a minor to become addicted to its platform. Effective next year, Utah also requires social media companies to verify the age of anyone opening or maintaining an account and to obtain parental consent for users under age 18. Parents also would have the right to access their children’s accounts, and minors would be blocked from accessing their accounts from 10:30 pm to 6:30 am.
The U.S. Supreme Court has agreed this term to review two social media cases involving the First Amendment, though neither case directly touches on social media regulation for youth. It will take time before lower court decisions on various state laws make their way to the high court.
Education Week Perspective
Education Week reported earlier this year that nine states considered some type of legislation relating to social media use by minors. The report said state and federal legislation under consideration centered on three objectives:
“Compel social media companies to verify users’ ages; bar social media companies from using algorithms to recommend content to young users; and restrict minors from using social media either through age requirements, parental permission requirements or curfews and time limits.”
A new trend involves school districts suing social media companies over harm online platforms pose to young people’s mental well-being.
Meta, TikTok and Snap assured Education Week in statements they take youth safety and well-being seriously and are developing tools to promote safe and healthy use of their online products by young users. Those protections include more parental control options, screen time management tools and age-verification features.
Meta is still smarting from a 2021 whistleblower claim that the company suppressed research indicating Facebook and Instagram harmed young people’s mental health and failed to take remedial action.