
A bill making its way through the Maryland General Assembly could have major implications for how social media companies handle children’s consumer privacy nationwide, and a Montgomery County delegate is behind it.

“This bill is really creating a framework that requires social media companies to be able to design their products while thinking through the lens of harm that could potentially be done to children,” said bill sponsor Del. Jared Solomon (D), who represents District 18, which includes parts of Silver Spring, Bethesda and Wheaton.

The legislation mirrors a child internet safety bill passed and signed into law in California last year, which was modeled after a landmark U.K. law known as the Age Appropriate Design Code.

The primary focus of the bill, which is co-sponsored by Del. C.T. Wilson (D-Charles County), is to protect children’s privacy and prevent children from being inundated with harmful or inappropriate content they weren’t looking for. This could mean tech platforms limiting autoplay videos for children, as YouTube has done in the U.K., or requiring new social media accounts to be set to private by default.

It would also implement strategies to prevent anonymous adults from contacting children online.

If passed, the Maryland legislation would:

  • Require a business that offers an online product likely to be accessed by children to complete a data protection impact assessment
  • Prohibit a business from offering a certain online product before completing a data protection impact assessment
  • Require businesses to document certain risks associated with certain online products
  • Require certain privacy protections for certain online products
  • Prohibit certain data collection and sharing practices

Essentially, it’s a two-pronged approach – the bill would require tech businesses to assess the risks and gaps in their product, and then implement solutions, Solomon said.

Solomon said the aim is not to ban social media or even prevent children from using it. The idea is to make it safer, while acknowledging kids engage with it.

“The internet can be an incredible place, and a global place for good. For kids who may feel like they don’t have a safe space in their community or in their immediate world, the internet can be a haven,” Solomon said. “Social media can do a lot of great things. We just want to do it in a way that looks out for young people.”


The bill is cross-filed with a complementary Senate bill, SB0844, co-sponsored by Sen. Benjamin Kramer (D-Montgomery) and Sen. Chris West (D-Baltimore and Carroll).

The current legislation provides a 90-day period for compliance after the effective date. Solomon said this is because the goal is not to penalize tech industry groups, but to work with them.

“We’re giving these companies the space to innovate. … This isn’t about fines and damages. At the end of the day, it’s about harnessing the positive power of the internet and making sure that’s what kids are able to access. We’re working with companies to get us to that place,” Solomon said.


While digital gaming platform Roblox put full-throated support behind the similar California law, not all tech industry groups are on board with this kind of legislation. NetChoice, a trade group that counts Google, Meta and Amazon as members, has sued to block California’s version of the age-appropriate design code law.

NetChoice argues that the law violates companies’ constitutional right to make “editorial decisions” about how they moderate content.

Solomon said there’s a misconception from opponents of the bill and similar legislation that this will block access to content, which he says is not the case. The idea is that certain content would not be promoted to kids through social media algorithms. However, it doesn’t prevent kids from looking for the content.


For example, Solomon said he’s spoken with teens who had pro-eating disorder content pop up on social media sites without searching for it, simply because they may have liked a post related to a healthy recipe or a workout routine.

There have also been cases of children accidentally clicking on posts containing white supremacist content without understanding the context, after which their social media feeds become filled with white supremacist posts, Solomon said. These are the kinds of incidents the bill is trying to prevent, he said.

During a press briefing about the bill last month, Lola Nordlinger, a 19-year-old Bethesda-Chevy Chase High School graduate, said it’s common for kids and teens to be inundated with undesirable content.


“So much blame is placed on the users. To me, this is such an unfair assessment. Social media algorithms are used to prey on users’ insecurities in order to keep them addicted. The algorithm is way smarter than we estimate. It knows how to keep us engaged,” Nordlinger said.

Solomon said this isn’t an age verification bill, and there’s no perfect way to ensure that a child isn’t lying about their age or that an adult isn’t posing as a child. However, he said there are analytic approaches he hopes tech groups will use to increase security. For example, there is technology that can monitor a user’s activity to see whether the content they access reflects their stated age, and platforms can require users to verify their birthdate multiple times, making it harder to lie about their age.

Solomon said his hope is that this legislation will prompt social media platforms and tech groups to change how their products operate nationwide. Policymakers in New Mexico, Oregon, New York and New Jersey have introduced similar legislation.


“Anything that is working to keep us safer and protect our privacy is good news for all of us, including adults,” Solomon said.