Recently, the editor of Downcodes learned that Meta is using artificial intelligence to identify the true age of Instagram users, particularly those who claim to be teenagers. The move aims to strengthen the platform's protections for teenage users and respond to public concern about social media's impact on teenagers' mental health. In September, Meta launched new accounts designed specifically for teenagers aged 13 to 17, which carry a number of restrictions to protect the safety and privacy of teenage users.
These teen accounts have many built-in restrictions, such as limits on who can contact them and what content they can see; potentially offensive comments and message requests are also hidden automatically. The problem is that these safeguarded accounts only work if teenagers adopt them voluntarily and report their age honestly. Meta therefore needed a way to ensure accounts are assigned correctly. It has developed a proprietary software tool called the Adult Classifier, planned for launch next year, which is designed to sort users into two groups: those 18 and over and those under 18.
According to Alison Hartnett, Meta's director of product management, the tool will scan a user's profile, the content they interact with, and their follower list to determine their true age. Even a seemingly innocuous "Happy Birthday" message can be a clue to a user's age.
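To make the idea concrete, here is a deliberately simplified sketch of how signals like these could be combined into an over/under-18 decision. This is not Meta's actual system; the signal names (age mentions in birthday comments, median follower age, self-reported age) and the majority-vote rule are assumptions invented purely for illustration.

```python
def classify_age_group(birthday_age_mention, median_follower_age, self_reported_age):
    """Return '18+' or 'under 18' by majority vote over the available signals.

    Each argument is an estimated age in years, or None when that signal is
    missing. All feature names are hypothetical; a real classifier would be
    a trained model over many more signals, not a hand-written rule.
    """
    votes = []
    for age in (birthday_age_mention, median_follower_age, self_reported_age):
        if age is not None:
            votes.append(age >= 18)
    if not votes:
        # With no evidence at all, default to the more protective group.
        return "under 18"
    # Classify as adult only if a strict majority of signals say adult.
    return "18+" if sum(votes) > len(votes) / 2 else "under 18"

# Example: birthday comments say "Happy 15th!", followers skew young,
# but the profile claims 21 -- the majority of signals says under 18.
print(classify_age_group(15, 16, 21))  # prints "under 18"
```

The point of the majority vote is that no single signal, such as a self-reported age, can override what the rest of the account's activity suggests, which matches the article's premise that users may not fill in their age honestly.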
Meta's own 2019 research found that social media apps like Instagram can have a negative impact on teenagers' mental health. In response, Meta plans to begin migrating teens who self-report their age to the new teen accounts as soon as possible and to roll out the adult classifier early next year.
The measure is Meta's latest attempt to address public concern about its platform's impact on youth mental health. Back in 2021, a Wall Street Journal report revealed internal research showing that Meta understood Instagram was harmful to the mental health of teenagers, especially girls. "We are making body image issues more painful for one in three teenage girls," reads a slide from Meta's 2019 internal report.
Meta's move reflects its growing attention to the safety and well-being of teenage users, but the accuracy of its adult classifier and the privacy safeguards around it still warrant scrutiny. Whether Meta can effectively balance user privacy and platform safety remains to be seen.