Children are a vital part of the digital economy, but keeping them safe online is no easy task. Parents cannot always be there to monitor their children’s activity online, and curtailing the risks on devices is as daunting as keeping kids safe in the physical world. A child may click on a gaming app that contains a suspicious link, or download malware from an app, without any understanding of the harm the content may cause — including the ability for strangers to have two-way communication with a minor.
Child safety groups are dedicating considerable resources to raising awareness of the challenges parents face in the digital environment, and major tech platforms have created key controls that parents can implement to better protect their kids. However, even with these online safety tools and educational efforts, as the National Center on Sexual Exploitation’s (NCOSE) “Dirty Dozen List” notes, it is still far too easy for children to be exposed to unexpected sexually explicit content and predatory grooming.
As investigative reports and whistleblower accounts have recently documented, not all tech platforms are acting in their young users’ best interest, especially those fueled by consumer data through targeted algorithms. For example, prostitution, sex trafficking, and child sexual abuse material are easily found on Reddit, and illegal drug sales on TikTok have been well documented — content that is readily accessible to children. Instagram recently announced plans to delay the development of a version of the app designed for kids, relenting in the face of pushback from Congress, civil society, and child safety groups about the potential harm to younger viewers. The announcement preceded a Senate hearing on children’s online safety, at which Senator Marsha Blackburn suggested that Apple had stood up to Facebook to protect users from harmful content on the social media platform. “Facebook knew about content devoted to coercing women into domestic servitude, yet they chose to do nothing to stop it, until Apple threatened to pull Facebook from the App Store,” Blackburn told Facebook’s witness.
Apple’s Facebook intervention was just one instance in which the iPhone maker’s exacting policies prevented destructive content from appearing on its App Store. Apple released a report earlier this year detailing many more incidents of combating fraud and harmful content through the app review process. In 2020 alone, Apple prevented nearly 1 million problematic software applications from reaching users and stopped more than $1.5 billion in potentially fraudulent transactions. The extensive app review process, which includes human review, makes Apple’s App Store a safer place for all users, but there are additional safeguards for children that Apple could implement, working with child safety advocates, to further improve child safety. One very positive example of Apple working with online child safety groups to curb the spread of child sexual abuse material was its recent announcement of features to “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)” — a proactive approach to protecting children. These new Apple features, widely applauded by child safety advocates, will undoubtedly help prevent online abuses and enable parents to play a more informed role in how their children navigate technology.
It is therefore surprising to see Congress propose legislation that would allow for the circumvention of these types of child protection mechanisms built by industry. The antitrust bills passed out of the House Judiciary Committee in June would require mobile devices to allow any software or “app” to be installed without any child safety defaults, putting children’s safety at risk.
Last week, a bipartisan group of senators announced they would be introducing the Senate version of the American Innovation and Choice Online Act as a companion to the House’s self-preferencing bill. These bills introduce a host of security problems that are unfortunately being presented to lawmakers as competition solutions. Their various shortcomings and contradictions include interoperability requirements that would not work without federal privacy legislation and, even more concerning, a sideloading mandate that would expose children to unvetted software and the attacks associated with it.
Much of the world is moving into a digital economy in which mobile technologies are at the center of people’s connections. This includes work, education, and commerce for adults — but what about the protections we need for kids? Mobile devices have become the reference points we use to manage our time and schedules, communicate with other people, stay connected with friends and family, and purchase products and services. With this level of usage has come a need to develop ways to limit which apps younger users have access to and to ensure that their safety and privacy are protected.
It is easy to see how mobile devices can take up time, which is why we need to give parents easy-to-use tools to help manage these devices and ensure kids have time for learning, sleep, and other family responsibilities. Forcing companies to open their app stores to any software developer, without a security or child safety standard, means dissolving the highly curated process designed by both technical engineers and privacy advocates to build these tools. Parents should be able to set boundaries for younger children’s digital interactions and to know that the apps their children use are age-appropriate.
Demanding an “open market” for apps through legislation means users may lose the security process that is in place now. Consumers will not understand that these app security and child safety measures have been degraded by a government mandate allowing any software on any device without any gating mechanisms. This idea is modeled after the European Commission’s Digital Markets Act, which seeks to use regulatory power to develop a small-business market for mobile apps. The irony is that it was the development of the app store concept — not government regulation — that created a mobile marketplace where consumers can easily browse apps from many developers. Small developers make up over 90 percent of Apple’s App Store participants.
Now Congress is considering a similar proposal that would require unvetted apps to be sold right next to apps that have been evaluated for audience appropriateness and have passed security standards. The Open App Markets Act’s goal may be to help small businesses, but the result would undermine the process that creates a safe buying environment for consumers. We now see lawmakers backing legislation that would give away security standards in the name of “competition,” granting easier access to criminals and malware that could not make it through the gating process designed to uphold official app stores’ security and privacy measures.
Proposals on both sides of the Atlantic point toward a world where checks and balances are stripped away and replaced with easy access for anyone. This would be a world in which platforms could neither set content-specific rules nor filter out the multifaceted cybersecurity threats posed by malware. It would also make it easier for kids to access adult content.
We are at a pivotal moment for the regulation of digital platforms. Lawmakers have an opportunity to invest in security at every level of our digital existence. One key part of protecting younger users is building out the toolbox for children’s online privacy and protection. To succeed in this effort, we need a balanced approach that does not regulate away the tools currently being used to keep young people safe online. Instead, lawmakers should engage in an in-depth regulatory dialogue with key industry leaders to develop better online protection practices for everyone.
Rick Lane is a tech policy expert and the founder and CEO of Iggy Ventures. Iggy advises and invests in tech startups that can have a positive social impact.
Shane Tews is a nonresident senior fellow at the American Enterprise Institute (AEI), where she works on international communications, technology and cybersecurity issues, including privacy, internet governance, data protection, 5G networks, the Internet of Things, machine learning, and artificial intelligence.