Instagram has announced a change for new users under the age of 16: their accounts will now be set by default to receive the least sensitive type of content Instagram can deliver. Existing account holders under 16 will also receive prompts encouraging them to review their content settings in the same way.
The latest change is an update to the existing Sensitive Content Control feature, launched last year to give users content options across three degrees of sensitivity: 'less,' 'standard,' or 'more.' The content Instagram classes as 'sensitive,' or potentially harmful and inappropriate, includes depictions of violence, sexually explicit or suggestive posts, and content promoting regulated products and substances.
It's all in an effort for teens to basically have a safer search experience, to not see so much sensitive content and to automatically see less than any adult would on the platform. We're nudging teens to choose 'Less,' but if they feel like they can handle the 'Standard' then they can do that.
Jeanne Moran, Policy Communications Manager, Meta
Now, under-16s joining Instagram will have their accounts set to 'less' by default. Regular push notifications will encourage existing younger account holders to opt in to heavier filtering of what Instagram's algorithm shows them across Search, Explore, Hashtag Pages, Reels, Feed Recommendations, and Suggested Accounts.
Cybersmile welcomes any change that can help young people manage their social media experiences in a healthy way. It is crucial that social media platforms continue to progress in this direction and provide younger users with the awareness, knowledge, and tools needed to maintain their digital wellbeing.
Dan Raisbeck, Co-founder, The Cybersmile Foundation
The recent changes are the latest in a long line of new features, policy amendments, and community tools that Instagram has introduced to provide more safety options for younger users. These include prevention-focused measures such as parental controls, which help protect children from interacting with unknown adult users or being exposed to adult content.