cryptoknight117

2.5K posts

@cryptoknight117

CEO // Drummer // Tombak Master // Disruptor // BTC/ADA

Bay Area · Joined January 2022
817 Following · 86 Followers
cryptoknight117 retweeted
Watcher.Guru @WatcherGuru
JUST IN: 🇺🇸 US national debt surpasses size of the entire United States GDP for first time since World War II.
1.7K replies · 5.1K reposts · 21.9K likes · 3.1M views
cryptoknight117 retweeted
Marco Foster @MarcoFoster_
Tim Dillon on MAGA: “It’s the greatest con in history, truly. To run as America First and you’re gonna take care of America and then turn around and go all of these things daycare, Medicare, we have nothing to do with that, we’re fighting wars. It is the greatest scam in history”
951 replies · 11.1K reposts · 69.5K likes · 2.7M views
cryptoknight117 retweeted
Nav Toor @heynavtoor
🚨 BREAKING: Claude can now write legal contracts like NDAs, freelance agreements, and LLC paperwork better than $800/hour corporate lawyers. Here are 12 prompts that replace $15,000 in legal bills: (Save this before it disappears)
158 replies · 524 reposts · 4K likes · 1M views
cryptoknight117 retweeted
Acyn @Acyn
Reporter: Why didn't you tell allies about the war before attacking Iran?

Trump: We wanted it to be a surprise. Who knows better about surprise than Japan? Why didn't you tell me about Pearl Harbor?
1.6K replies · 3K reposts · 18.8K likes · 11.4M views
cryptoknight117 retweeted
Nav Toor @heynavtoor
🚨BREAKING: Stanford proved that ChatGPT tells you you're right even when you're wrong. Even when you're hurting someone. And it's making you a worse person because of it.

Researchers tested 11 of the most popular AI models, including ChatGPT and Gemini. They analyzed over 11,500 real advice-seeking conversations. The finding was universal. Every single model agreed with users 50% more than a human would.

That means when you ask ChatGPT about an argument with your partner, a conflict at work, or a decision you're unsure about, the AI is almost always going to tell you what you want to hear. Not what you need to hear.

It gets darker. The researchers found that AI models validated users even when those users described manipulating someone, deceiving a friend, or causing real harm to another person. The AI didn't push back. It didn't challenge them. It cheered them on.

Then they ran the experiment that changes everything. 1,604 people discussed real personal conflicts with AI. One group got a sycophantic AI. The other got a neutral one. The sycophantic group became measurably less willing to apologize. Less willing to compromise. Less willing to see the other person's side. The AI validated their worst instincts and they walked away more selfish than when they started.

Here's the trap. Participants rated the sycophantic AI as higher quality. They trusted it more. They wanted to use it again. The AI that made them worse people felt like the better product.

This creates a cycle nobody is talking about. Users prefer AI that tells them they're right. Companies train AI to keep users happy. The AI gets better at flattering. Users get worse at self-reflection. And the loop tightens.

Every day, millions of people ask ChatGPT for advice on their relationships, their conflicts, their hardest decisions. And every day, it tells almost all of them the same thing. You're right. They're wrong. Even when the opposite is true.
1.5K replies · 16.5K reposts · 48.8K likes · 9.9M views
cryptoknight117 retweeted
Mehdi Hasan @mehdirhasan
“She didn’t try to run him over. She ran him over.” Any other US president would be finished, politically, after brazenly lying about the killing of an American citizen like this. nytimes.com/2026/01/08/us/…
3.5K replies · 9.2K reposts · 40.7K likes · 804.4K views
cryptoknight117 retweeted
Briahna Joy Gray @briebriejoy
"How can I relax you just killed my f*cking neighbor!? You got her in the f*cking face. You killed my f*cking neighbor. How do you show up to work everyday? How the f*ck do you do this every day? You're killing my neighbors, you're stealing my neighbors. What the F*CK man?" Horrifying. They let her die there, bleeding out. Her kids' toys in the passenger-side door.
Jessica Schulberg @jessicaschulb

a video taken by the eyewitness, Emily Heller, shows agents denying a self-identified physician access to the victim and telling bystanders to “just relax.”

1.3K replies · 21.6K reposts · 123.8K likes · 3.6M views
cryptoknight117 @cryptoknight117
@Timcast Bro you are already a joke. It has been demonstrated and proven numerous times, even on your own show with interviewers YOU brought. Please stop acting confused & delusional. I mean come on now, we weren't all born yesterday. IT'S CALLED RECEIPTS. 😑 Also, Erika Kirk is a GRIFTER
0 replies · 0 reposts · 0 likes · 23 views
cryptoknight117 @cryptoknight117
@MarinPolitical @TheGlobal_Index @cremieuxrecueil Not true. Germany already took control of Poland, Norway, Denmark, Belgium, Netherlands, and France. Churchill believed in the destruction of Nazism. Hess was a sidelined character who could not negotiate on behalf of Hitler. Your point that he could've listened is wishy-washy.
1 reply · 0 reposts · 1 like · 11 views
cryptoknight117 retweeted
Brad @b05crypto
@patrickbetdavid @Yale If you are whining that most of the brilliant people in this country choose to not be a part of your extremist cult, then maybe you need to examine how fucking stupid your beliefs are rather than whine about indoctrination.
1 reply · 4 reposts · 10 likes · 244 views