Deviizer

10 posts

@deviizer

Joined August 2020
218 Following · 144 Followers
Deviizer retweeted
Zenuux @Zenuuux ·
While the rollout of AI-based age estimation is presented as a "step toward safety", it risks serious unintended consequences for Roblox's social experiences. Many games exist primarily as social hangouts rather than gameplay-focused experiences, and restricting chat by age could fundamentally alter these spaces. Preventing users from befriending people outside their age group would make sense, but entirely locking chat for unverified users is a far more drastic step.

This approach will discourage users from participating at all, especially children, who are unlikely to want to scan their faces on the platform. Limiting chat in this way risks driving users off Roblox to third-party apps or social media, where safety and moderation are far less robust, and it could undermine daily engagement to the point where the 150 million users Roblox cites may no longer log in every day.

At the same time, restricting in-experience chat reduces the visibility developers and moderation staff have into interactions within their own games, limiting their ability to detect and respond to harmful behavior. Additionally, if minors cannot see messages from older users, even in public channels, it fragments the experience and isolates younger players without addressing predatory intent, simply filtering by age appearance rather than behavior.

Overall, while the initiative may reduce some risks, it prioritizes limiting interactions over effectively tackling harmful behavior, and it could significantly harm engagement, developer oversight, and Roblox's stock.
David Baszucki @DavidBaszucki

I wanted to share a few thoughts as we start to roll out age estimation. Over 150 million users come to Roblox every day. They meet with friends, play, create, and often get motivated to study STEM or art or to start a business. This represents a massive responsibility that we think about every day.

Ever since we started, we have asked employees to think about the individual user and the role Roblox plays in their lives: the creativity they can unlock, the support they find, or the joy a game can bring them. Our platform's scale has grown over two decades. From the very beginning, we knew safety needed to be foundational. Erik built the first moderation system on Roblox less than a month after we launched, and the four of us took turns acting as daily moderators. From the start, we built Roblox with users of all ages in mind, and as our company has grown, we have continued to innovate, using the latest technology to enhance safety.

Today, we are starting to roll out proactive age checks for anyone using our platform's chat features. We see this as part of our ongoing commitment to help define the future of safety and civility on the internet. Parents are becoming more aware that social media and messaging apps are widely used by children, yet are often designed to be appropriate only for users over 13 and make it easy to bypass age gates. We don't allow image sharing, we aggressively filter chat and monitor for critical harms, and we maintain restrictive content standards based on user age. We are also the largest online platform that acknowledges the presence of users under the age of 13. Over the decades, we have innovated both our technology and our policies to address this challenge directly.

For example, we recently open-sourced our model for detecting personally identifiable information (PII), which complements our other open-source models for early detection of grooming, voice moderation, and AI prompt safety. Today's rollout is one more step in this continuing innovation. Our age-check system uses AI and a device's camera to estimate a user's age and, based on this information, we help limit minors to communicating only with users in their peer group. This is part of our commitment to defining the "Gold Standard" for safety and civility on the internet.

We all share the same goal: a safe internet for kids. And yet today there isn't a shared definition of safety. We believe the entire industry, including social media, user-generated content platforms, messaging, and gaming, must take a unified approach to this challenge in collaboration with policymakers. There is ample opportunity to work together. We were the first platform to join the Attorney General Alliance Partnership for Youth Online Safety, and we are collaborating with the International Age Rating Coalition to bring modern rating protocols to user-generated content platforms. These steps are significant, but we recognize they merely lay the groundwork for what's required next.

We are fundamentally optimistic about the future, and on safety I am particularly energized by what's possible. With a blend of innovation and collaboration, including smart legislation, we can continuously make progress. To create a safe internet for kids, we need to protect and enhance the significant benefits the internet offers young people: connecting, creating, and learning. Achieving this means implementing robust, industry-wide solutions that preserve these opportunities while effectively minimizing risks.

Link to Newsroom post: corp.roblox.com/newsroom/2025/…

28 replies · 57 reposts · 342 likes · 27.3K views
Deviizer retweeted
Owen @LegendOJ1_RBLX ·
This is a dangerous move that will more than likely kill social roleplay games on the platform. Creating a space for people of all ages to have fun roleplaying as a sushi chef, barista, or car washer is something many devs have done successfully for years. I've built a community of over 2 million people who have fun and socialize in my games, and we have over 100 moderators who keep our game safe and enjoyable for people of all ages. Alienating an entire community has not gone well for other devs in the past, and it is not what I intend to do to mine.

It is wrong for Roblox to class themselves as a social platform and then backtrack on that when they realize they don't have the resources to handle illegal activities on the platform. Roblox's handling of accusations, and its overall transparency about such problems, is very poor. Roblox has fucked roleplay devs over and over with "safety" changes over the past few months. Now the devs who create positive spaces that rely on communication will more than likely lose that source of income. This seems like the nail in the coffin for social roleplay experiences.
Roblox @Roblox

Today, we’re starting to roll out age checks to unlock chat, to help keep Roblox fun and safe for everyone. To learn more, go to corp.roblox.com/newsroom/2025/…

37 replies · 253 reposts · 2K likes · 71.9K views
Deviizer @deviizer ·
What I've been building recently
Deviizer tweet media
3 replies · 0 reposts · 43 likes · 2.2K views
Deviizer @deviizer ·
Interior from a game I've been working on
Deviizer tweet media
4 replies · 1 repost · 59 likes · 2K views
Deviizer @deviizer ·
Thumbnail personalization is crazy
Deviizer tweet media
3 replies · 0 reposts · 8 likes · 655 views
Deviizer @deviizer ·
Setting records
Deviizer tweet media
14 replies · 1 repost · 53 likes · 7.1K views