Social media regulation has been a main driver of legislative action in statehouses across the country for the last two years. Bipartisan bills have gained traction in Nevada, Indiana, Wyoming, South Carolina, New York, Texas, and Virginia to mandate age verification, regulate app-store downloads, or ban algorithmic feeds. Virginia lawmakers floated restricting the use of algorithms to curate social media feeds for users under 18, but a compromise was reached in Richmond to instead enact time limits on social media. The bill, passed unanimously by the VA House and Senate, sets a one-hour daily cap for users under 15 years of age, with a parental override feature so parents can grant more time as they see fit.
This is a compromise for almost every party involved in the debate over social media in society, but it is worthwhile.
As a matter of principle, the most important part of this debate is that Virginia honors parents' decision-making authority under their own roof when it comes to the well-being of their kids. Any effort to place politicians in the position of "knowing what's best" is a problem, and unlikely to win support from both Republicans and Democrats.
Consumers want to be empowered in their use of tech and social media services, and the latest iteration of the Consumer Data Protection Act raises a question for analysts and choice advocates like myself: who is the consumer?
By and large, the consumer is the parent, the person paying for family phone plans and individual devices used in their home. Pew Research did the last comprehensive study on who foots the phone bill in 2010, at the dawn of the smartphone era, and it showed that only 10 percent of teens held their own cell phone contracts, while 70 percent were on family plans paid by a parent. It is unlikely that this dynamic has changed much since.
Teen mental health has undeniably been in freefall for much of the last decade, leaving parents, advocates, and lawmakers searching for direct causation and a solution. Cutting down on teens' online time has been a popular rallying cry, fueled by the work of Jonathan Haidt in his book, The Anxious Generation. But efforts to kick teens off social media, enact age verification, and impose more content moderation have run headlong into the wall of free speech law and technical limits, resulting in losses in court for the most strident legislation.
This was both predictable and the right outcome.
Virginia Sen. Schuyler VanValkenburg (D) first introduced the time-limits bill as a ban on algorithmic feeds, but its shortcoming was that algorithms are both the problem and the solution for negative content online. The same technology that allows social media companies to serve up a steady stream of extreme political content to users, based on what they typically engage with, also allows them to curate mostly kid-friendly content. No algorithm means a chronological feed filled with whatever has been posted most recently, appropriate or not.
VA lawmakers want to help parents take the reins on how their kids use devices. The latest research shows that children who spend more than three hours a day on social media experience significantly higher rates of mental health struggles, and a 2023 Gallup poll found that teens spend almost five hours a day on these apps.
It has to be said that Apple and Google, which lead the smartphone market, both offer very effective family moderation tools that allow parents to set limits on time, content, and contacts on iPhone and Android devices. Amazon's tablet devices have the same. In my home, we graduated our 14-year-old from the Bark Phone to a standard iPhone, which grants us a similar degree of control over the child's device.
We approve all app downloads through a Request Download feature, set limits on screen time, and shut off apps we see as unproductive. Google's Family Link does much of the same. Instagram teen accounts already have parental permission features, and TikTok offers parental controls through "Family Pairing," but that requires parents to download the app themselves, which many will not do for reasonable data-security reasons.
Elected officials are trying to meet the demands of voters who aren't happy with how social media has impacted their kids' lives, or who are worried about larger trends they see on the news. But the fact remains that the tools parents need already exist.
It’s possible that a default one-hour time limit on apps will nudge parents toward learning about the control features so they can haggle about screen time with their children and decide for themselves. And that’s a net positive.
Social media regulation isn't something that should never happen, but it should be done in ways that both preserve the experience of these apps and minimize the data collection needed to perform age verification. Virginia might be headed in the right direction.
Stephen Kent is the Media Director for the Consumer Choice Center and a resident of Manassas, Virginia. You can follow him on X @stephenkentx