Omer Yarkowich ⵙ Positive Constraint
2.5K posts

Omer Yarkowich ⵙ Positive Constraint
@omeryarkowich
For the sake of heaven, and of strategy. Still a believer in the people of Israel. Husband to a little one. Father of a blond boy and a stubborn half-Yemenite girl. Heals with jachnun.



Game of Thrones first aired 15 years ago today, with the episode ‘Winter Is Coming’

@eranshir I suggest you listen to the HaMarkerim podcast with Itzik Ben Israel, one of the greatest security experts we have had. He says very similar things. He is certainly not someone the system rejected.


Hormuz is a weapon that can only be fired once.

No one should expect a quick resolution to the current crisis, but over the next decade, even the next 3-5 years, the Hormuz choke point will be massively substituted for. The Gulf Arab states are all very rich, with high per capita GDP (the best single measure of relative state capacity), easy access to global markets (especially financial), and the favorable backing of the US.

Everyone has known about the Hormuz vulnerability for decades. The Iranians have continually hinted at closing it, but never did. Now they have, but Hormuz is a gun that cannot be reloaded. Deterrents work only up to the point of use. Once used, they have failed. The purpose of a deterrent is to *not* be used.

Many analysts have made this basic mistake. They think that Iran is now in a position of strength, having exercised its Hormuz option. But the opposite is true. A state is weakest after it has used its deterrent. The cost of that deterrence is now priced in. The worst having been done, the targets of the deterrent are now free to make other arrangements. Before, they were reluctant to do so because of the switching costs. Now they have no choice; they will not allow themselves to be controlled in this way again.

Hormuz may never reopen. But the importance of this is a depreciating asset.

The CEO of one of America's most powerful defense AI companies just said something no one in Silicon Valley wants to hear. He said if you don't see what's coming, you're blind.

His name is Alex Karp and he runs Palantir. His company builds AI for the Pentagon, the CIA, and every branch of the US military. He's describing what's already happening behind closed doors.

Here's what he said: if Silicon Valley takes away every white-collar job in America (the lawyers, the analysts, the consultants, the coders) and at the same time arms the military with that same AI, there is only one outcome: nationalization. The government takes your technology.

Karp says a "horseshoe effect" is forming: the far left and the far right don't agree on anything, except one thing. Tech is not paying the bills, and the industry should be seized.

Look at what just happened. The Pentagon told Anthropic: let us use your AI for anything we want, no restrictions. Anthropic said no. The government designated them a national security threat. Trump ordered every federal agency to stop using their AI. The Defense Production Act, a Korean War-era law, was on the table. Hours later, OpenAI signed a deal with the Pentagon. The message to every AI company in America: fall in line, or get replaced.

Now combine that with this: Anthropic's own CEO says AI could wipe out 50% of entry-level white-collar jobs. Unemployment could hit 20%. Not factory workers or truck drivers, but the people with degrees. 71% of Americans already say they're worried AI will permanently take their jobs. Steve Bannon says AI job loss will be THE issue of the 2028 election.

And Karp, the man who builds this technology for the military, is telling his own industry: if you don't make AI work for regular people, the state will take everything you've built.

This political bomb is already ticking.

8 ICE agents can't manage to arrest a "suspect," so they shoot him in the head. The United States of America.
