In today's digital age, children's online safety is an increasingly pressing concern. Recently, Roblox, Discord, OpenAI, and Google jointly launched a non-profit organization called ROOST (Robust Open Online Safety Tools), which aims to build scalable, interconnected safety infrastructure for the AI era. The organization will provide open-source safety tools to public and private institutions to help them strengthen protections on their own platforms, with a particular focus on children's safety.
Image note: AI-generated image, provided under license by Midjourney.
ROOST was founded in response to the rapid development of generative AI. As the online environment changes, children face growing risks, making the need for "reliable and accessible safety infrastructure" more urgent. By offering ready-made, free solutions, ROOST hopes to spare smaller companies and organizations from having to build these tools from scratch.
Under ROOST's program, the organization will focus on providing tools for detecting, reviewing, and reporting child sexual abuse material (CSAM). These tools will help platforms identify and handle inappropriate content and protect children online. To that end, the participating companies will contribute not only funding but also their technical expertise.
Children's online safety has drawn sustained attention, especially while the Children and Teens' Online Privacy Protection Act and the Kids Online Safety Act were under consideration in Congress. Although the bills failed to advance in the House, participating tech companies, including Google and OpenAI, have committed to preventing their AI technology from being used to generate child sexual abuse material.
For Roblox, child safety is especially important. According to 2020 data, two-thirds of American children aged nine to twelve use Roblox, and the platform has faced many challenges on this front. In 2024, Bloomberg Businessweek reported that the company was grappling with a "pedophile problem," prompting Roblox to tighten restrictions on children's private messages and implement new policies.
Although the launch of ROOST will not solve every problem, it offers Roblox and similar platforms a more straightforward path toward keeping children safe in the AI era.