Growing concerns about social media use allegedly causing mental health problems in young people have spurred a bipartisan push in Congress for the Kids Online Safety Act (KOSA), an ill-considered piece of internet censorship legislation.
First introduced in the US Senate in 2022, KOSA (S. 1409) has gone through multiple revisions and gained the support of more than sixty senators despite criticism from digital rights advocates, child safety organizations, and civil rights groups. The most recent version of the bill would impose a “duty of care” on platforms, requiring them to mitigate possible harms to minors, such as cyberbullying, eating disorders, substance abuse, and sexual exploitation. It mandates that underage social media users be permitted to opt out of algorithmic recommendations and turn off potentially addictive features of platforms, while also providing parents with tools to protect their children. The latest version of KOSA assigns responsibility for enforcing the bill’s “duty of care” provision to the Federal Trade Commission.
A companion House bill introduced in May, H.R. 7891, parallels the Senate bill in most respects but would impose the strictest “duty of care” obligations only on “high impact” social media, messaging, and video game platforms with more than $2.5 billion in annual revenue or more than 150 million monthly users.
In a February 15, 2024, article for the Electronic Frontier Foundation (EFF), Jason Kelley, Aaron Mackey, and Joe Mullin argued that updates to S. 1409 were not enough to fix its core First Amendment problems. The authors claimed the bill would endanger LGBTQ youth, young people seeking mental health information, and many other at-risk communities. When KOSA was first introduced, it was opposed by advocacy groups such as GLAAD and the Human Rights Campaign for similar reasons. EFF contends that “KOSA remains a dangerous bill that would allow the government to decide what types of information can be shared and read online by everyone.”
Kelley, Mackey, and Mullin noted that because there is no case law defining “reasonable care,” KOSA would put platforms in a compromising position for hosting otherwise legal content on their websites, such as information about support groups for vulnerable and marginalized youth and suicide prevention resources. Moreover, they argued that KOSA mandates that platforms “restrict access to content based on age,” forcing them to adopt some sort of age verification system.
In her analysis of H.R. 7891 for EFF, Molly Buckley noted that, though the House bill would limit liability for the least popular platforms, it “still incentivizes large and mid-size platforms . . . to implement age verification systems that will threaten the right to anonymity and create serious privacy and security risks for all users.”
Corporate outlets, such as the Washington Post and New York Times, have covered successive iterations of KOSA but have not thoroughly examined the implications of provisions such as its “duty of care,” which EFF has called a “duty of censorship.” Independent, technology-oriented news sites, such as Techdirt, have investigated those implications in greater detail and centered young people’s voices in their coverage.
Jason Kelley, Aaron Mackey, and Joe Mullin, “Don’t Fall for the Latest Changes to the Dangerous Kids Online Safety Act,” Deeplinks Blog (Electronic Frontier Foundation), February 15, 2024.
Molly Buckley, “The U.S. House Version of KOSA: Still a Censorship Bill,” Deeplinks Blog (Electronic Frontier Foundation), May 3, 2024.
Student Researcher: Vincenzo Champion (City College of San Francisco)
Faculty Evaluator: Jennifer Levinson (City College of San Francisco)
More/Source: https://www.projectcensored.org/kids-online-safety-act-first-amendment/