Protecting Children Under the Kids Online Safety Act (KOSA)
By Emily Muniz
The primary characteristic that differentiates today's children from previous generations is the prevalence of screens. The widespread adoption of smartphones and comparable technology, particularly since the launch of the first iPhone in 2007, has digitized nearly every aspect of our lives. Numerous jobs have transitioned to hybrid operations, fully online educational programs have multiplied, and mobile phones have become essential to the average individual. While this shift has significantly altered the lives of adults, it has also triggered profound changes in the way children engage with the world around them.
Children have become increasingly reliant on, and possibly addicted to, the world of digital media. Research from the Pew Research Center indicates that “80% of parents say their child age 5 to 11 [...] uses or interacts with a tablet computer,” while parents of children below this age bracket estimate that approximately half interact with a computer or smartphone.1 The data reveals that technology has effectively become standard across all age groups. Although many educational platforms have emerged online to help children develop critical comprehension and learning abilities at an early age, this growing dependence has simultaneously exposed children to the dangers of the internet. These dangers include sexually explicit content, hate speech, misinformation, predators, scams, and the addictive design elements of platforms that impair children's capacity to regulate their screen time. Exposure to social media at a young age is known to adversely affect mental health, leading to issues such as the early onset of eating disorders through social comparison, as well as anxiety, depression, and, in severe instances, suicide, particularly due to unmoderated cyberbullying.
The TAKE IT DOWN Act, recently enacted as S. 146 on May 19, in conjunction with the PROTECT Act (S. 151), seeks to strengthen penalties for individuals who engage in child-harming behavior, such as creating and distributing CSAM, while also holding platforms accountable for failing to regulate and remove CSAM. KOSA would build on these efforts: if implemented, it would strengthen child protections by requiring platforms to modify their systems to limit minors' exposure to harmful content. Senator Blumenthal, one of the bill's sponsors, contends that it would improve online protections for children under 17 by...
Requiring “social media platforms to provide minors with options to protect their information, disable addictive product features, and opt out of personalized algorithmic recommendations.”2
Giving “parents new controls to help their children and spot harmful behaviors and provides educators with a dedicated channel to report harmful behavior.”3
Creating “a duty for online platforms to prevent and mitigate specific dangers to minors, including promotion of suicide, eating disorders, substance abuse, sexual exploitation, and advertisements for certain illegal products (e.g., tobacco and alcohol).”4
Ensuring “that parents and policymakers know whether online platforms are taking meaningful steps to address risks to kids by requiring independent audits and research into how these platforms impact the well-being of kids and teens.”5
Furthermore, because the bill holds Big Tech companies liable for harmful algorithms that compromise the safety and mental health of kids, KOSA has enjoyed significant bipartisan support. KOSA was introduced by Republican Senator Blackburn and Democratic Senator Blumenthal in response to the significant lack of accountability demonstrated by social media companies and online platforms in protecting children. Since its introduction, Blackburn and Blumenthal have rallied “62 members of the U.S. Senate” behind the bill and secured endorsements from “over 240 organizations.”6 In the words of Blumenthal, Big Tech is “putting profits over safety” and must be held accountable.7
Effect of KOSA on “Covered Platforms”
KOSA imposes significant compliance obligations that will transform risk management practices for covered platforms. The bill defines a covered platform as “an online platform, online video game, messaging application, or video streaming service that connects to the internet and is utilized, or is likely to be utilized, by a minor.”8 These platforms must implement appropriate measures, features, or software to eliminate design patterns that foster “addiction-like behaviors,” such as infinite scrolling; remove content that constitutes bullying or promotes physical violence; and mitigate “deceptive marketing practices.”9 This requires that platforms not only perform risk assessments but also modify their systems to be more conducive to parental oversight, allowing parents to manage the algorithms their children are exposed to.
Enforcement Risks for Noncompliant Platforms
Should KOSA pass, noncompliant platforms could face major consequences, as the law would be enforced by the Federal Trade Commission and the attorneys general of every state. Although KOSA does not currently specify precise fine amounts, the FTC's enforcement discretion means penalties will depend on the degree and type of violation. By maintaining up-to-date risk assessments, protecting the private information of young users, and applying design changes that ensure a safe online experience for children, companies can readily comply with KOSA's requirements. Public companies should also expect investor scrutiny in line with their disclosure obligations under Regulation S-K.
What’s to Come
Although the measure awaits further action in the House of Representatives, its strong bipartisan support and broad scope suggest that corporate players should prepare for a new regulatory era. If passed, KOSA would require platforms to ensure legal compliance, redesign their products, disclose risks, and address youth safety, ushering in a new chapter in tech corporate responsibility.