Introduction:
Meta Platforms, the parent company of Instagram and Facebook, is rolling out additional protections to shield teenage users from unwanted direct messages. The change follows Meta's recent commitment to tighten content visibility controls for teens, made under regulatory pressure to safeguard young users from potentially harmful content.
Ensuring Teen Safety:
Responding to regulatory concerns about user well-being, Meta Platforms is strengthening its safety features. On Instagram, teen accounts will now block direct messages by default from anyone the teen does not follow or is not connected to. In addition, changes to certain app settings will require parental approval, adding an extra layer of protection.
Messenger Safety Measures:
On Facebook's Messenger platform, users under 16 (and under 18 in select countries) will see heightened safety measures: they will receive messages only from Facebook friends or people in their phone contacts, keeping their communication environment tightly controlled.
Context:
This initiative follows increased regulatory scrutiny, fueled in part by a former Meta employee's testimony before the U.S. Senate. The employee alleged that the company was aware of harassment and potential harm to teenagers on its platforms. Meta's latest measures signal an effort to address these concerns and to implement protections for teen users proactively.
Conclusion:
Meta Platforms' latest efforts underline its commitment to creating a secure online space for teenage users. By refining messaging controls and requiring parental approval for certain settings, Meta aims to prioritize the well-being of young individuals on its platforms.