The updated Kids Online Safety Act aims to address unintended consequences
A bipartisan pair of senators on Tuesday reintroduced the Kids Online Safety Act with updates aimed at addressing concerns that the bill could inadvertently cause more harm to the young internet users it seeks to protect. However, some of the advocates who raised those concerns say the changes are still insufficient.
The bill aims to make the internet a safer place for children by making social media companies responsible for preventing and mitigating harm that could be caused by their services. The new version of the bill defines a list of harms that platforms must take reasonable steps to mitigate, including preventing the distribution of posts that promote suicide, eating disorders, substance abuse and more. These companies would have to undergo annual independent audits of their risks to minors and require them to enable the strictest child privacy settings by default.
Congress and President Joe Biden have made it clear that protecting children online is a key priority, and KOSA has become one of the leading bills on the issue. KOSA has attracted more than 25 co-sponsors, and the earlier version of the bill was passed unanimously by the Senate Commerce Committee last year. The new version of the bill has been endorsed by groups including Common Sense Media, the American Psychological Association, the American Academy of Pediatrics and the Eating Disorders Coalition.
At a virtual news conference Tuesday, Sen. Richard Blumenthal, D-Conn., who introduced the bill along with Sen. Marsha Blackburn, R-Tenn., said Senate Majority Leader Chuck Schumer, D-N.Y., is “one hundred percent behind this bill and efforts to protect children online.”
Blumenthal, while acknowledging that it is ultimately up to Senate leadership to set the timing, said, “I hope and expect that we will vote in this session.”
A Schumer spokesman did not immediately respond to a request for comment.
Late last year, dozens of civil society groups warned Congress against passing the bill, arguing it could further endanger young internet users in various ways. For example, the groups feared the bill would increase pressure on online platforms to “over-moderate, including from state attorneys general seeking to make political points about what kind of information is appropriate for young people.”
Blumenthal and Blackburn made several changes to the text in response to criticism from outside groups. They narrowed the legislation to limit the duty of care imposed on social media platforms to a specific set of potential mental health harms grounded in evidence-based medical information.
They also added safeguards for support services such as the National Suicide Hotline, substance abuse groups and LGBTQ+ youth centers to ensure they are not unintentionally hampered by the bill’s requirements. Blumenthal’s office said it did not believe the duty of care would apply to those kinds of groups, but decided to clarify the point anyway.
But the changes have not been enough to mollify some civil society and industry groups.
Evan Greer, director of digital rights nonprofit organization Fight for the Future, said Blumenthal’s office never met with the group or shared the updated text before it was rolled out, despite multiple requests. Greer acknowledged that the co-sponsors’ offices had met with other groups, but said in an emailed statement that “they appear to have deliberately excluded groups that have specific expertise in terms of content moderation, algorithmic recommendations, etc.”
“I have read through it and can state unequivocally that the changes made do NOT address the concerns we raised in our letter,” Greer wrote. “The bill still contains a duty of care that covers content recommendation, and still allows attorneys general to effectively dictate what content platforms can recommend to minors,” she added.
“The ACLU remains staunchly opposed to KOSA because, ironically, it would expose the very children it is trying to protect to increased harm and surveillance,” Cody Venzke, the ACLU’s senior policy counsel, said in a statement. The group joined the letter that warned against its passage last year.
“KOSA’s core approach still threatens the privacy, security and free expression of minors and adults alike by pressuring platforms of all stripes to monitor their users and censor their content under the guise of a ‘duty of care,'” Venzke added. “To accomplish that, the bill would legitimize platforms’ already pervasive data collection to identify which users are minors, when it should be seeking to curb those data abuses. Moreover, parental guidance is vital in minors’ online lives, but KOSA would mandate surveillance tools without regard to a minor’s home situation or safety. KOSA would be a step backwards in making the internet a safer place for children and minors.”
At the press conference, Blumenthal responded to a question about Fight for the Future’s criticism by saying the duty of care had been “very deliberately narrowed” to target specific harms.
“I think we’ve been very direct and effective in meeting that kind of criticism,” he said. “Of course our door remains open. We’re ready to listen and talk about other suggestions. And we’ve spoken to many of the groups that have been very critical, and some have in fact dropped their opposition, as I think you’ll hear in response to today’s session. So I think our bill is clarified and improved in a way that meets some of the criticism. We’re not going to solve all the world’s problems with a single bill. But we’re making a measurable, very significant start.”
The bill also drew criticism from several groups that receive funding from the tech industry.
NetChoice, which sued California over its Age-Appropriate Design Code Act and counts Google, Meta and TikTok among its members, said in a press release that despite lawmakers’ attempts to address concerns, “unfortunately, how this bill would work in practice still requires an age verification mechanism and data collection on Americans of all ages.”
“Finding out how young people should use technology is a difficult question and has always been best answered by parents,” said Carl Szabo, NetChoice’s vice president and general counsel, in a statement. “KOSA is instead setting up an oversight board of DC insiders to replace the parents and decide what’s best for kids,” Szabo added.
“KOSA 2.0 raises more questions than it answers,” Ari Cohn, free speech counsel at TechFreedom, a think tank funded by Google, said in a statement. “What constitutes reason to know that a user is under the age of 17 is completely unclear and undefined by the bill. In the face of that uncertainty, platforms will clearly have to age-verify all users to avoid liability, or worse, avoid gaining any knowledge whatsoever and leave minors without any protections at all.”
“Protecting young people online is a widely shared goal. But it would run counter to the goals of bills like this to impose compliance obligations that undermine teens’ privacy and safety,” said Matt Schruers, president of the Computer & Communications Industry Association, whose members include Amazon, Google, Meta and Twitter. “Governments should avoid compliance requirements that would compel digital services to collect more personal information about their users, such as geolocation data and government-issued identification, particularly when responsible companies are taking steps to collect and retain less data about customers.”