
New bill would require companies like Facebook, Google to add features that protect children

Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., hold a news conference at the Capitol.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Two senators introduced a new bill Wednesday that would give online platforms a duty to act in kids’ best interests and prevent or mitigate the risk of certain harms including suicide, eating disorders and substance abuse.

The Kids Online Safety Act was introduced by Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., respectively the chair and ranking member of the Senate Commerce subcommittee on consumer protection. It would have a significant effect on the design of platforms made by companies like Facebook parent Meta, Snap, Google and TikTok.

It comes after the subcommittee received thousands of pages of documents from former Facebook employee Frances Haugen, who also testified before the panel. The documents revealed in part that the company had researched its platforms’ impact on children and found negative effects on the mental health of some teen girls. Lawmakers who later confronted executives from Facebook, including Instagram chief Adam Mosseri, were outraged that the company hadn’t done more to alter its services in response to the research findings.

The Kids Online Safety Act would raise the standards for online platforms that are “reasonably likely to be used” by kids aged 16 or younger to protect those young users.

It requires those companies to implement safeguards that minors or their parents can easily access to “control their experience and personal data.”

That includes settings that limit the ability of others to find minors online, restrict the amount of data that can be collected on them, let them opt out of algorithmic recommendation systems that use their data, and limit their time spent on a platform.

The bill also requires platforms to make the strongest version of these safeguards the default setting on their services. It prohibits services from encouraging minors to turn off those controls.

Covered platforms would need to release annual public reports based on an independent, third-party audit of the risks of harm to minors on their services. They would also need to provide access to data for researchers vetted by the National Telecommunications and Information Administration (NTIA) to conduct public interest research on the harms to minors online.

The bill also directs government agencies to figure out the best ways to protect minors on these services. For example, it directs the Federal Trade Commission to create guidelines for covered platforms on how to conduct market- and product-focused research on minors. It also requires the NTIA to study how platforms can most feasibly and accurately verify ages of their users.

The bill would create a new council of parents, experts, tech representatives, enforcers and youth voices, convened by the Secretary of Commerce to give advice on how to implement the law. It would be enforced by the FTC and state attorneys general.
