Package of Bipartisan Measures May Receive Senate Floor Votes This Year
Washington Senator Maria Cantwell, who chairs a key Senate committee, sees a path to enact a package of online child safety bills this year and a federal data privacy measure next year. The federal data privacy measure may link to congressional efforts to regulate artificial intelligence.
Washington Congresswoman Cathy McMorris Rodgers, who now chairs the House Energy and Commerce Committee, supported a bipartisan federal data privacy bill that failed to pass last year. She said, “Comprehensive privacy legislation would establish the foundation necessary to give Americans control over how their information is collected, used and shared, whether for AI or something else.”
Senate Majority Leader Chuck Schumer endorsed a vote this year on online child safety measures, but cautioned that the congressional calendar is crowded with votes on another stopgap funding measure to avoid a government shutdown, 12 appropriations bills and a White House supplemental funding request for Israel, Ukraine, Taiwan and southern border security.
Renewed Momentum for Child Safety
Online child safety legislation picked up momentum after Arturo Bejar, a former Facebook engineer turned whistleblower, testified in Congress that Meta ignored warnings about dangerous content accessible by young people on Facebook and Instagram.
Bejar said he witnessed his 14-year-old daughter and her friends on Instagram “experience unwanted sexual advances, misogyny and harassment.” He alleged Meta dismantled tools on its platforms that helped teens deal with harmful content. Meta says it has introduced 30 tools “to support teens and their families in having safe, positive experiences online.”
Bills Headed to Senate Floor
The Senate Judiciary Committee and the Senate Commerce Committee, which Cantwell chairs, have already approved a bill to eliminate online platforms’ liability protections for content relating to child sexual exploitation and child pornography.
Cantwell’s committee approved a bill, with almost half the Senate as cosponsors, requiring online platforms and social media apps to exercise a “duty of care” by taking steps to mitigate harm for children below the age of 13 using their platforms.
Her committee also approved a bill with bipartisan cosponsors to prohibit online platforms from disseminating children’s personal information without obtaining verifiable parental consent, effectively ending targeted ads aimed at kids and teens 17 and younger.
Senate Judiciary also has approved a bill to allow child pornography victims to sue an internet company that distributes the material and to establish a board within the Federal Trade Commission to resolve complaints when an internet platform fails to take it down.
The committee also has moved two Republican-sponsored measures to require social media and online platforms to report incidents of sexual exploitation of children and to eliminate Section 230 liability protections for online platforms regarding third-party content showing child sexual abuse.
President Biden has signed into law a measure to curb online cyber and sexual threats to children by requiring the Justice Department to develop a strategy to counter child sexual exploitation that includes creation of task forces, technology, training and public awareness campaigns. The measure updates legislation passed in 2008 but ineffectively implemented. The Justice Department has released an updated strategy and appointed a permanent coordinator to oversee prevention of child sexual exploitation and trafficking.
Checkered History of Child Privacy
Congress has held numerous hearings and considered several bills over the past five years, but never passed any of them. Rodgers said the House online child safety bill she worked on in the previous Congress was sidelined because then-House Speaker Nancy Pelosi said its protections were weaker than state legislation approved in her home state of California.
In the absence of federal legislation, states have attempted to fill the void, though some efforts faced criticism from child safety advocates as “unrealistic” and others were challenged in court as unconstitutional by online platform companies.
Groups such as Common Sense Media question whether some measures violate the privacy rights of young people. “This is a life or death issue for families and we have to be very careful how to protect kids online,” says James P. Steyer, CEO of Common Sense Media. He prefers putting the burden on social media companies to ensure a safe space on the internet for young people rather than making government regulation the middleman between parents and children.
The California Assembly approved a bipartisan children’s online privacy bill aimed at regulating access to social media and online video games. Hailed as a potential national model, the bill requires social media and video game developers to design their platforms with children’s “well-being” in mind and bars eight common data-collection practices. However, companies including Meta and TikTok convinced a federal judge that the measure may violate the First Amendment.
Bills in Arkansas and Texas that require parental control of child social media use also have been held up in court.
Oregon lawmakers considered but didn’t pass bills to impose age verification to access pornographic sites (SB 257); require online platforms to identify, evaluate and mitigate risks to children from product features (SB 196); and require the Oregon Health Authority to study the effects of social media and cellphone use on young people (HB 3071).
The U.S. Surgeon General has weighed in with a report calling for minimum age requirements and limited access to social media “for all children,” citing data showing 90 percent of teens between 13 and 17 have used social media and more than half say they access it daily.