Eli Bernstein

370 posts

@capitELIst

Principal @ Moto Legal

Australia · Joined June 2009
1.3K Following · 587 Followers
Eli Bernstein retweeted
Lex Fridman @lexfridman ·
Here's my conversation with Peter Steinberger (@steipete), creator of OpenClaw, an open-source AI agent that has taken the Internet by storm, with now over 180,000 stars on GitHub. This was a truly mind-blowing, inspiring, and fun conversation! It's here on X in full and is up everywhere else (see comment).

Timestamps:
0:00 - Episode highlight
1:30 - Introduction
5:36 - OpenClaw origin story
8:55 - Mind-blowing moment
18:22 - Why OpenClaw went viral
22:19 - Self-modifying AI agent
27:04 - Name-change drama
44:15 - Moltbook saga
52:34 - OpenClaw security concerns
1:01:14 - How to code with AI agents
1:32:09 - Programming setup
1:38:52 - GPT Codex 5.3 vs Claude Opus 4.6
1:47:59 - Best AI agent for programming
2:09:59 - Life story and career advice
2:13:56 - Money and happiness
2:17:49 - Acquisition offers from OpenAI and Meta
2:34:58 - How OpenClaw works
2:46:17 - AI slop
2:52:20 - AI agents will replace 80% of apps
3:00:57 - Will AI replace programmers?
3:12:57 - Future of OpenClaw community
503 · 1.1K · 6.8K · 2M
Eli Bernstein retweeted
ConservativeAussie @ConservativeeAu ·
BREAKING - Shocking footage. Aussie hero has potentially saved multiple lives. #bondi #shooting
742 · 1.6K · 9K · 2.1M
Eli Bernstein retweeted
Financial Services GOP @FinancialCmte ·
WATCH: @USRepMikeFlood in support of H.R. 3339: "The Equal Opportunity for All Investors Act of 2025 would expand the [accredited investor] definition to include individuals that are certified through an exam written by the SEC. ...Wealth alone is not a strong judge from those wanting to be an investor or not. ...This will expand opportunity in our capital markets." 📺⬇️
25 · 81 · 337 · 17.8K
Eli Bernstein retweeted
Live Action @LiveAction ·
Charlie Kirk reminded us that a life of courage and virtue isn’t easy—but it’s worth living. His example should inspire us all to stand boldly for what’s right.
2.5K · 51.6K · 212.2K · 4.9M
Eli Bernstein retweeted
Mario Nawfal @MarioNawfal ·
🚨 🇺🇸 MIND-BLOWING FEDERAL BUDGET BREAKDOWN: $6.16 TRILLION SPENDING VS $4.47 TRILLION REVENUE The government's fiscal 2023 numbers reveal massive deficit spending that would make any sane business owner cringe. Social Security dominates spending while individual income taxes carry the revenue load. The deficit gap is staggering — we're spending $1.69 trillion more than we bring in annually. Interest payments on existing debt alone consume hundreds of billions that could fund actual programs. Individual income taxes and payroll taxes shoulder most of the burden while corporate taxes contribute relatively little. Meanwhile, Social Security, defense, and Medicare devour the largest spending chunks. Source: Office of Management and Budget, @therabbithole
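The headline figures above imply the deficit the post quotes; here is a minimal Python sanity check, using only the numbers cited in the post (in trillions of USD):

```python
# Figures quoted in the post for fiscal 2023, in trillions of USD.
spending = 6.16
revenue = 4.47

deficit = spending - revenue
print(f"Deficit: ${deficit:.2f} trillion")  # prints "Deficit: $1.69 trillion"
```

The $1.69 trillion gap claimed in the post does match the stated spending and revenue totals.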
Mario Nawfal tweet media
Elon Musk@elonmusk

💯

1K · 1.8K · 7.7K · 4.3M
Eli Bernstein @capitELIst ·
This bromance didn’t last long…
0 · 0 · 0 · 70
Eli Bernstein retweeted
Jungle Inc Crypto News @jungleincxrp ·
🚨 RIPPLE SETTLEMENT BLOCKED?! 🔒 Judge Torres just REJECTED the SEC & Ripple’s joint request to finalize their $50M settlement. Both sides AGREED: but the judge said NO. What happened? Here's the breakdown 🧵👇

1. Ripple & the SEC wanted to end the case:
– Cut Ripple’s fine from $125M → $50M
– Kill the injunction banning Ripple from future violations
– Refund the rest to Ripple
– Close appeals
Simple? Not so fast.

2. The judge denied it. Why? She said they used the wrong legal path. They filed under Rule 62.1 (used when a case is on appeal). But what they were really asking was to vacate a final judgment: and for that, you need Rule 60.

3. This wasn’t just about money. They asked her to erase an injunction she already issued. That’s serious business. Courts don’t just undo final rulings because both sides agree. They need “exceptional circumstances.” Ripple & SEC didn’t even argue that. They didn’t even cite Rule 60.

4. So what now? Ripple has 3 main options:
– Refile under the correct rule with a strong justification
– Split the request (modify fine, leave injunction)
– Proceed with appeal and risk everything
This isn’t over: just a misstep.

5. 🔚 Bottom Line: Even mutual deals can fail in court if you don’t follow the right process. Ripple still owes $125M (for now). The injunction stands. And the SEC’s case is technically alive. Legal chess isn’t over ♟️
183 · 213 · 968 · 173.4K
Eli Bernstein retweeted
Katie Biber @katiebiber ·
1/ Despite a memo released by the DOJ earlier this month, federal prosecutors want to imprison Roman Storm for 45 years for writing open-source code. @matthuang and I provide details in a new post linked below.
Katie Biber tweet media
11 · 84 · 230 · 74.3K
Eli Bernstein retweeted
Matterhorn.so @matterhornso ·
1. AI Startups and Developers – Scaling AI Without Cloud Costs

Who they are:
- Startups building AI-powered applications but struggling with high compute costs
- Independent developers working on machine learning models
- Companies running AI inference workloads for real-time applications

What they need:
- Affordable, on-demand GPU compute
- Scalable infrastructure without cloud provider lock-in
- Privacy-preserving AI training environments

How NodeFoundry helps:
- Aggregates decentralised compute resources at lower costs than traditional cloud services
- No need for crypto wallets or tokens, making DePIN as easy as cloud compute
- Smart allocation ensures workloads are executed on the most cost-effective and high-performance nodes

Use case examples:
- Training LLMs (large language models) without relying on AWS, Google Cloud, or Azure
- Running computer vision applications on a decentralised GPU network
- Deploying AI-powered SaaS tools while cutting infrastructure costs

#DePIN #CloudComputing #Web3
Matterhorn.so tweet media
12 · 69 · 94 · 2.4K
Eli Bernstein retweeted
Margo Martin @MargoMartin47 ·
JUST NOW! President Trump signs an Executive Order establishing the Strategic Bitcoin Reserve and U.S. Digital Asset Stockpile 🇺🇸
1.4K · 3.8K · 20.1K · 4M
Eli Bernstein retweeted
Wayne Yap @wayneyap ·
This is Chamath Palihapitiya.
• Bought $5M of BTC at $80 each
• Took Virgin Galactic & SoFi public
• Billionaire VC who scaled Facebook to 1B users
He declared: "Poker is the best training for real life." Here’s his story & his biggest predictions for 2030: 🧵
Wayne Yap tweet media
138 · 429 · 5.4K · 1.5M
Eli Bernstein retweeted
Andrew Ng @AndrewYNg ·
The buzz over DeepSeek this week crystallized, for many people, a few important trends that have been happening in plain sight: (i) China is catching up to the U.S. in generative AI, with implications for the AI supply chain. (ii) Open weight models are commoditizing the foundation-model layer, which creates opportunities for application builders. (iii) Scaling up isn’t the only path to AI progress. Despite the massive focus on and hype around processing power, algorithmic innovations are rapidly pushing down training costs.

About a week ago, DeepSeek, a company based in China, released DeepSeek-R1, a remarkable model whose performance on benchmarks is comparable to OpenAI’s o1. Further, it was released as an open weight model with a permissive MIT license. At Davos last week, I got a lot of questions about it from non-technical business leaders. And on Monday, the stock market saw a “DeepSeek selloff”: The share prices of Nvidia and a number of other U.S. tech companies plunged. (As of the time of writing, some have recovered somewhat.) Here’s what I think DeepSeek has caused many people to realize:

China is catching up to the U.S. in generative AI. When ChatGPT was launched in November 2022, the U.S. was significantly ahead of China in generative AI. Impressions change slowly, and so even recently I heard friends in both the U.S. and China say they thought China was behind. But in reality, this gap has rapidly eroded over the past two years. With models from China such as Qwen (which my teams have used for months), Kimi, InternVL, and DeepSeek, China had clearly been closing the gap, and in areas such as video generation there were already moments where China seemed to be in the lead.

I’m thrilled that DeepSeek-R1 was released as an open weight model, with a technical report that shares many details. In contrast, a number of U.S. companies have pushed for regulation to stifle open source by hyping up hypothetical AI dangers such as human extinction.
It is now clear that open source/open weight models are a key part of the AI supply chain: Many companies will use them. If the U.S. continues to stymie open source, China will come to dominate this part of the supply chain and many businesses will end up using models that reflect China’s values much more than America’s.

Open weight models are commoditizing the foundation-model layer. As I wrote previously, LLM token prices have been falling rapidly, and open weights have contributed to this trend and given developers more choice. OpenAI’s o1 costs $60 per million output tokens; DeepSeek R1 costs $2.19. This nearly 30x difference brought the trend of falling prices to the attention of many people.

The business of training foundation models and selling API access is tough. Many companies in this area are still looking for a path to recouping the massive cost of model training. Sequoia’s article “AI’s $600B Question” lays out the challenge well (but, to be clear, I think the foundation model companies are doing great work, and I hope they succeed). In contrast, building applications on top of foundation models presents many great business opportunities. Now that others have spent billions training such models, you can access these models for mere dollars to build customer service chatbots, email summarizers, AI doctors, legal document assistants, and much more.

Scaling up isn’t the only path to AI progress. There’s been a lot of hype around scaling up models as a way to drive progress. To be fair, I was an early proponent of scaling up models. A number of companies raised billions of dollars by generating buzz around the narrative that, with more capital, they could (i) scale up and (ii) predictably drive improvements. Consequently, there has been a huge focus on scaling up, as opposed to a more nuanced view that gives due attention to the many different ways we can make progress. Driven in part by the U.S. AI chip embargo, the DeepSeek team had to innovate on many optimizations to run on less-capable H800 GPUs rather than H100s, leading ultimately to a model trained (omitting research costs) for under $6M of compute.

It remains to be seen if this will actually reduce demand for compute. Sometimes making each unit of a good cheaper can result in more dollars in total going to buy that good. I think the demand for intelligence and compute has practically no ceiling over the long term, so I remain bullish that humanity will use more intelligence even as it gets cheaper.

I saw many different interpretations of DeepSeek’s progress here on X, as if it was a Rorschach test that allowed many people to project their own meaning onto it. I think DeepSeek-R1 has geopolitical implications that are yet to be worked out. And it’s also great for AI application builders. My team has already been brainstorming ideas that are newly possible only because we have easy access to an open advanced reasoning model. This continues to be a great time to build!

[Original text: deeplearning.ai/the-batch/issu… ]
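The "nearly 30x" price gap cited in the post can be checked directly; a minimal sketch using only the two per-million-output-token prices quoted above:

```python
# Prices per million output tokens quoted in the post (USD).
o1_price = 60.00  # OpenAI o1
r1_price = 2.19   # DeepSeek-R1

ratio = o1_price / r1_price
print(f"o1 costs about {ratio:.1f}x as much as R1")  # prints "o1 costs about 27.4x as much as R1"
```

The exact ratio is about 27.4x, which the post rounds up to "nearly 30x".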
282 · 1K · 4.3K · 616.4K
Eli Bernstein retweeted
_gabrielShapir0 @lex_node ·
TRUMP IS AN ETH MAXI PLAYING 4D CHESS BY DDoSING SOLANA
20 · 12 · 181 · 26.2K
Eli Bernstein retweeted
Elon Musk @elonmusk ·
😂
Elon Musk tweet media
QME
8.9K · 11.9K · 118.8K · 21.8M