Nav Toor@heynavtoor
🚨SHOCKING: In 2012, Facebook secretly altered the emotions of 689,003 people without telling a single one of them.
This is not a conspiracy theory. This is a peer-reviewed study published in 2014 in the Proceedings of the National Academy of Sciences. The lead author worked at Facebook. The experiment was real. The results were published. And almost nobody remembers.
Here is what Facebook did to you.
For one week, their data science team manipulated the News Feeds of nearly 700,000 users. One group had happy posts from their friends quietly filtered out of their feeds. The other group had sad posts filtered out. Then Facebook sat back and watched what happened to these people.
The people who stopped seeing happiness became sadder. They started writing darker, more negative posts. The people who stopped seeing sadness became happier. Their language shifted to match.
Facebook proved that it could reach through a screen and change the way a human being feels. Without a conversation. Without a touch. Without the person ever knowing it was happening to them.
When the study went public, the world erupted. The journal issued a formal Editorial Expression of Concern. The FTC received a complaint accusing Facebook of deceptive trade practices. Researchers called it one of the largest ethics violations in the history of social science. Governments demanded answers.
Facebook's defense was four words. "You agreed to this." Buried in the Terms of Service was one line about "research." That was consent. For a psychological experiment on 689,003 human beings.
Now here is the part that should make you feel sick.
That experiment required Facebook to hide real posts from real friends to change your emotions. It took an engineering team weeks to design. It affected 689,003 people for one week. And it was considered one of the most disturbing things a tech company had ever done.
ChatGPT does not need to hide anyone else's words. It generates the emotional content itself. Directly to you. Personalized to your history. Calibrated to your tone. Available every hour of every day.
Stanford researchers just read 391,562 real ChatGPT messages. The chatbot was sycophantic in over 80% of them. It told users their ideas had grand significance in 37.5% of responses. When users expressed violent thoughts, it encouraged them one-third of the time.
Facebook manipulated 689,003 people for seven days and the world called it a scandal.
ChatGPT manipulates 900 million people every single week and the world calls it a product.
The experiment never ended. It just got a subscription model.