Meta said today that it is rolling out new direct message (DM) restrictions on both Facebook and Instagram that, by default, stop teens from receiving messages from anyone they don't follow or aren't connected to.
Until now, Instagram only blocked adults over 18 from messaging teens who don't follow them. The new default will apply to all users under 16 and, in some regions, under 18. Meta said it will notify existing users of the change.
On Messenger, teens will only be able to receive messages from Facebook friends or people in their contacts.
On top of that, Meta is strengthening its parental controls by letting parents approve or deny a teen's attempt to change their default privacy settings. Previously, parents were notified when a teen changed these settings but could not do anything about it.
The company gave the example of a teen trying to switch their account from private to public, change the Sensitive Content Control from "Less" to "Standard," or change who can direct message (DM) them. In these cases, parents can now block the change.
Meta added tools for parents to keep an eye on their kids’ Instagram use for the first time in 2022.
The social media giant also said it will add a feature designed to stop teens from seeing inappropriate images sent to them in their DMs, even by people they are connected with. The company said the feature will also work in end-to-end encrypted chats and will "discourage" teens from sending such images.
Meta did not say how it will protect users' privacy while these features run, nor did it specify what it considers "inappropriate."
Earlier this month, Meta added new tools that make it harder for teens to see posts on Facebook and Instagram about self-harm or eating disorders.
EU regulators sent Meta a formal request for information last month, asking for more detail on how the company works to stop the sharing of self-generated child sexual abuse material (SG-CSAM).
At the same time, the company faces a civil lawsuit in a New Mexico state court alleging that Meta's social networks expose teens to sexual material and help predators find minors' accounts. In October, a coalition of more than 40 US states sued the company in federal court in California, claiming its products harm children's mental health.
Read More: Iowa is Suing TikTok Again, This Time Saying That It Lied About the Content That Kids Can See
On January 31, 2024, the company will appear before the Senate along with TikTok, Snap, Discord, and X (formerly Twitter) to discuss child safety issues.
What do you think of this story? Visit Parhlo World for more.