๐”ค๐”ฏ๐”ฒ๐”ช๐”ญ๐”ข๐”ฉ๐”ฉ๐”ž

108.9K posts

๐”ค๐”ฏ๐”ฒ๐”ช๐”ญ๐”ข๐”ฉ๐”ฉ๐”ž banner
๐”ค๐”ฏ๐”ฒ๐”ช๐”ญ๐”ข๐”ฉ๐”ฉ๐”ž

๐”ค๐”ฏ๐”ฒ๐”ช๐”ญ๐”ข๐”ฉ๐”ฉ๐”ž

@pinchemom

my truths one tweet at a time // ๐š๐š›๐š˜๐š  ๐šŠ๐š ๐šŠ๐šข if you donโ€™t like my mind // ๐” ๐”ฅ๐”ž๐”ฌ๐”ฑ๐”ฆ๐”  ๐”ซ๐”ข๐”ฒ๐”ฑ๐”ฏ๐”ž๐”ฉ โ™‘๏ธŽโ™ก ๐ŸŽŸ๐ŸŒฑ๐Ÿงš๐Ÿผโ€โ™‚๏ธ๐Ÿชด๐Ÿทโœจ ๐‘€๐“‡๐“ˆ๐Ÿ’4/20 + ๐‘€๐‘œ๐“‚๐Ÿ‘ฆ๐Ÿผ5/10

Hellafornia · Joined April 2017
285 Following · 4.3K Followers
๐”ค๐”ฏ๐”ฒ๐”ช๐”ญ๐”ข๐”ฉ๐”ฉ๐”ž retweetledi
Aakash Gupta
Aakash Gupta@aakashguptaยท
Everyoneโ€™s missing the real story here. Metaโ€™s Ray-Ban glasses need human data annotators to train the AI. When you say โ€œHey Metaโ€ and ask the glasses to analyze something, that video gets sent to Metaโ€™s servers, then routed to Sama, a subcontractor in Nairobi, Kenya. Workers there manually label objects in your footage. They see everything you recorded, intentionally or not. 7 million pairs sold in 2025 alone. Every single pair generates training data that flows through human eyes in Kenya. Workers told Swedish journalists they see people undressing, using bathrooms, having sex, and accidentally filming bank card details. One worker said โ€œwe see everything, from living rooms to naked bodies.โ€ Metaโ€™s automatic face anonymization is supposed to protect people in the footage. Workers say it fails in certain lighting. Faces that should be blurred are sometimes fully visible. The person you recorded without knowing? A stranger in Nairobi can identify them. Buried in Metaโ€™s terms of service is one sentence doing enormous legal work: the company reserves the right to conduct โ€œmanual (human) reviewโ€ of your AI interactions. Thatโ€™s the legal cover for routing intimate footage from Western homes to a $2/hour labor force operating under NDAs, office surveillance cameras, and a strict no-questions policy. Workers say if you raise concerns about what youโ€™re seeing, youโ€™re fired. This is the same company, Sama, that TIME exposed in 2023 for paying Kenyan workers $2/hour to label graphic content for OpenAI while being billed at $12.50/hour per worker. Workers described the experience as torture. Sama ended that contract, then pivoted to labeling Metaโ€™s glasses footage. Same workforce. Same rates. Meta markets these glasses as โ€œdesigned with your privacy in mind.โ€ The privacy design is a tiny LED light on the frame that most people donโ€™t notice. 
The data pipeline behind it routes your bedroom footage to a contractor with a documented history of worker exploitation, failed anonymization, and union-busting lawsuits. And the next generation of these glasses? Meta is planning to add facial recognition. The same system that canโ€™t reliably blur faces in training data wants to start identifying them on purpose. The LED light on the frame is doing about as much for your privacy as the terms of service nobody reads.
Quoting Shibetoshi Nakamoto @BillyM2k:

why the fuck meta employees watching videos their users are taking

444 replies · 15K reposts · 48.1K likes · 4.8M views
๐”ค๐”ฏ๐”ฒ๐”ช๐”ญ๐”ข๐”ฉ๐”ฉ๐”ž retweetledi
๐”ค๐”ฏ๐”ฒ๐”ช๐”ญ๐”ข๐”ฉ๐”ฉ๐”ž retweetledi
Mountain Cabins
Mountain Cabins@cabinsmountainยท
This is all I want. The idea of the little internal courtyard — loved it everywhere I’ve seen it.
[image attached]
218 replies · 4.5K reposts · 39.2K likes · 3.5M views