
Parents on Instagram, Facebook: Meta wants to talk to you about your child

Amid the second phase of a sweeping child safety trial in New Mexico, Meta is announcing new measures designed to ensure that young people on its platforms receive age-appropriate protections.

Meta announced in a blog post on Tuesday that US parents on its two largest social media platforms, Facebook and Instagram, will receive detailed notices on how to check and verify their teens’ ages on the company’s apps.

All users Meta has identified as parents, not just adults supervising a teen’s account, will receive the notification. The notice will include a link to a blog post Meta published last year about how to talk to young people about the importance of being honest about their age.

Meta hopes to increase parental awareness of age verification on Instagram and Facebook.
Credit: Meta

Meta also announced that age detection technology will be rolled out to 27 countries in the European Union and Brazil. Additionally, the technology will work for US Facebook users for the first time.

In April 2025, Meta began using AI to identify younger users who had listed an older age on their accounts. The technology reassigns those users to Meta’s Teen Account product, which the company says has strong safety protections.

In the fall, independent experts who tested Teen Accounts published a report finding that the product does not work as advertised. Among other findings, the researchers documented instances in which its guardrails failed to prevent inappropriate contact with strangers.

Related:

EU says Meta did not do enough to stop children under 13 from using Instagram and Facebook

On Tuesday, Meta said its AI technology will begin analyzing user profiles for “content indicators” of their age, simplify the process of reporting suspected underage accounts, and strengthen its ability to stop young users from opening new accounts.

Meta noted in its blog post that it believes lawmakers should require app stores to verify a user’s age and provide that information to apps and developers.

Meta is back on trial

Meta lost the first phase of a New Mexico lawsuit in March, when a jury found the company liable for misleading consumers about the safety of its platforms and endangering children. The suit was brought by the state’s attorney general.

Meta was ordered to pay maximum fines for each violation of New Mexico’s consumer protection laws, totaling $375 million. The company said it plans to appeal the decision.

In the current bench trial, the New Mexico Department of Justice is seeking injunctive relief requiring Meta to pay $3.75 billion in additional damages and make certain changes to protect children.

The proposed measures include effective age verification, a ban on children under 13, restrictions on end-to-end encryption of children’s messages, and permanent bans for adult users who engage in or promote child exploitation.

Last week, Meta threatened to shut down its platforms in New Mexico in response to the state’s demands.

“Many of the requests are technologically infeasible or impractical and would effectively force Meta to develop separate applications for use only in New Mexico,” the company said in a court filing, according to The Guardian. “Granting such sweeping relief would therefore force Meta to withdraw Facebook, Instagram and WhatsApp from the state entirely as the only means of compliance.”

In court on Monday, Meta attorney Alex Parkinson reiterated that point, saying that granting the state the full relief it seeks would “make it difficult to continue to supply Meta products” in New Mexico.

State Attorney General Raul Torrez said Meta is putting advertising dollars and profits ahead of children’s safety.

“We know that Meta has the power to make these changes,” said Torrez in a statement. “This is not about technical ability.”
