By The Texas Tribune via The Associated Press
Parents of Texas children under 18 can now monitor and restrict their child’s activity on digital platforms including Facebook and Instagram — but only if they know their child uses the service.
Meta, the parent company of Facebook and Instagram, rolled out parental control features in Texas last week to comply with House Bill 18, the Securing Children Online Through Parental Empowerment Act, which went into effect Sept. 1. The Legislature passed it last year to restrict kids from seeing harmful material on the internet, such as content promoting self-harm or substance abuse, while also giving parents more power to regulate what their child does online.
Meta’s tools allow parents who can prove their identity with a valid ID to see and update their teen’s account settings, set time limits on the child’s usage and even delete the child’s Instagram or Facebook account altogether.
Parents’ rights advocates say the new tools are helpful but don’t go far enough to protect young people online.
“It will be hard to intervene unless you know your kid is using the product,” said Zach Whiting, a policy director and senior fellow for The Texas Public Policy Foundation who testified in favor of the law. He said a stronger policy would restrict teens under 18 from creating a social media account to begin with unless they first obtained parental consent. Most social media companies already restrict children under 13 from creating an account.
“If we treat social media like any other harmful product, there are age verification requirements for those, like smoking and drinking,” Whiting said. “I think it’s an appropriate extension to do that for social media.”
Texas is among a growing number of states that have passed laws limiting tech companies’ interactions with children, citing research that found a link between social media use and negative psychological well-being among youth. Texas lawmakers also raised concerns about the vast amounts of data tech companies could be collecting from minors.
But, like those other states, Texas faced legal challenges and pushback from the tech industry, which was able to limit the scope of the legislation.
An earlier version of HB 18 would have barred minors from creating social media accounts unless their parents consented. That version did not pass the state Senate.
Rep. Shelby Slawson, R-Stephenville, who introduced the bill, told colleagues on the House floor last May that she had hoped to spend more time working with the Senate to tweak the bill but that there wasn’t enough time. Still, she said, “this bill is a monumental step in the right direction.”
Days before the law was set to go into effect, a federal district judge temporarily blocked a major piece that would have required digital service providers to filter harmful content from minors’ feeds, such as material featuring self-harm, substance abuse, eating disorders or child pornography. The judge called those restrictions “unconstitutionally vague” and wrote that they could even block kids from seeing useful information.
“In its attempt to block children from accessing harmful content, Texas also prohibits minors from participating in the democratic exchange of views online,” Judge Robert Pitman wrote in his opinion. “A state cannot pick and choose which categories of protected speech it wishes to block teenagers from discussing online.”
Attorney General Ken Paxton has filed a notice to appeal Pitman’s ruling, which stems from a case filed by tech industry groups. A free speech advocacy group has also filed a lawsuit to block the new law.
“Nobody with a working knowledge of the First Amendment would say ‘oh, this is a bill designed to pass constitutional scrutiny,’ ” said Ari Cohn, a Chicago-based attorney who specializes in the First Amendment. “It’s obviously so over-broad and infringing on First Amendment rights.”
While those lawsuits play out, portions of the law are enforceable, including the requirement that companies create tools for parents to monitor their child’s accounts. The law also prohibits digital service providers from disclosing minors’ data or personal identifying information, or displaying targeted advertisements to them.
Meta does not share or sell personal data, a spokesperson said, adding that the only information used to show teens ads is their age and location, which helps the company show teens relevant ads for products and services available where they live. To comply with the new law, the company will no longer store the precise geolocation data associated with teen accounts in Texas, the spokesperson said.
Other companies, including Snap and TikTok, did not respond to The Tribune’s inquiries, so it is not clear whether or how they are complying with the new data and advertising requirements.
Snap offers tools for parents to restrict their teen’s account, but the teen would have to opt into the supervision. Since 2020, TikTok has also offered a family pairing setting, which allows a parent or guardian to link their account to a teen’s, manage privacy settings and set screen time limits. This feature also requires the child to consent to the pairing.
It is also not clear how Paxton’s office intends to enforce the law. The consumer protection division of his office has sole authority to enforce the law. Violators could face civil penalties of up to $10,000 per violation and attorney’s fees. His office did not respond to The Tribune’s request for comment.