OnchainDeFi

9.4K posts

OnchainDeFi

@Web4Agent

Onchain Finance | AI x Crypto x Robotics | DeFi | ottawa.base.eth

Ottawa 🍁 Joined April 2008
2.9K Following · 1.2K Followers
OnchainDeFi reposted
deployer
deployer@0xDeployer·
i don't think this got enough attention so reposting again. "$GITLAWB went from $20k to $35M mcap and now has a $3.12M treasury with 53 ETH and 9.5B GITLAWB. The founder is a talented engineer from Southeast Asia who'd probably never fit the YC profile. The world is ready for a new YC where anyone can launch from anywhere." read that again. no other time in history was this possible. crypto + talent + the @bankrbot ecosystem made it possible.
Igor Yuzovitskiy@igoryuzo

The internet is too fast and too global for old capital. Gitlawb, git for agents, is barely 2 months old. It went from $20k to $35M mcap and now has a $3.12M treasury with 53 ETH and 9.5B GITLAWB. The founder is a talented engineer from Southeast Asia who'd probably never fit the YC profile. The world is ready for a new YC where anyone can launch from anywhere.

55
69
361
20K
OnchainDeFi reposted
Milk Road AI
Milk Road AI@MilkRoadAI·
Cerebras just IPO'd and the stock already ran up over 100% (Save this).

For the entire 70-year history of the semiconductor industry, every company on earth has followed the same process. You take a dinner-plate-sized silicon wafer, put hundreds of tiny chips onto it, and dice it up like a pizza. Nvidia does it this way, AMD does it this way, Intel has done it this way for six decades, and everyone who tried to break that convention failed. Until Cerebras asked the most annoyingly obvious question in the industry's history: what if you just didn't cut it? The result is the Wafer Scale Engine, a single chip 56 times larger than Nvidia's H100, and it fundamentally changes the physics of how AI inference works.

The reason this matters is not the size, it's the bandwidth. Every time an AI model generates a single word, it has to reach into memory, pull weights, multiply them together, and produce a prediction, and when you're running millions of concurrent sessions at once, the bottleneck is not raw processing power but how fast data moves between memory and compute. Nvidia's H100 moves data at roughly 3 terabytes per second, while Cerebras' WSE-3 moves data at 21 petabytes per second, roughly 7,000 times faster, because memory and compute live on the same enormous piece of silicon and data barely has to travel at all. That gap is exactly why OpenAI went from 150 tokens per second on traditional GPUs to 2,000 tokens per second on Cerebras hardware, and why AWS integrated Cerebras into Bedrock to deliver roughly 5x more inference capacity in the same physical footprint.

The macro setup is making the trade even more urgent. South Korean DRAM export prices recently jumped 35%, flash memory surged 47%, and SSD pricing spiked nearly 140%, and every single one of those increases hits Nvidia-based infrastructure directly, because the H100 requires 80GB of the most expensive, most contested memory in the AI supply chain. Cerebras' WSE-3 uses zero external HBM memory, baking 44GB of SRAM directly into the wafer itself, which means as memory pricing goes parabolic, every CFO evaluating AI infrastructure is suddenly looking much more seriously at the architecture that sidesteps that cost entirely.

The demand is already showing up in the backlog. Cerebras ended 2025 with $24.6 billion in remaining performance obligations for a company doing just over $500 million in annual revenue; that is a number that implies years of contracted growth already sitting on the books. The IPO was 20x oversubscribed, the price range was raised twice before listing, and shares opened 89% above their listing price on a $5.55 billion raise that made it the largest semiconductor IPO in history.

The risks are real and worth naming. 86% of 2025 revenue came from two entities with UAE ties, U.S. revenue actually fell 34% to $187 million, and the $20 billion OpenAI contract is conditional: if Cerebras misses delivery milestones, OpenAI can terminate and trigger repayment demands on a $1 billion loan facility. And yet the market is valuing Cerebras at roughly 91x trailing revenue, richer than Nvidia, AMD, and Arm combined. What investors are betting on is not that Cerebras beats Nvidia; it is that the inference supercycle is large enough to support an entirely different architecture optimized for a different workload, and that $24.6 billion in contracted backlog converts to diversified revenue before the market starts asking harder questions. CEO Andrew Feldman said this took a decade of late nights to get right, everyone who tried to copy it failed, and given that the entire inference economy is now running through exactly the bottleneck Cerebras was built to eliminate, the market is starting to believe him.
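The bandwidth and throughput claims above reduce to simple ratios. A back-of-envelope sketch, using only the figures quoted in the post (the post's numbers, not independently verified):

```python
# Back-of-envelope check of the figures quoted above (as quoted, not verified).
h100_bw_tb_s = 3.0     # Nvidia H100 memory bandwidth, ~3 TB/s as quoted
wse3_bw_pb_s = 21.0    # Cerebras WSE-3 on-wafer bandwidth, ~21 PB/s as quoted

# Convert petabytes/s to terabytes/s, then compare.
ratio = (wse3_bw_pb_s * 1000) / h100_bw_tb_s
print(f"bandwidth ratio: {ratio:,.0f}x")  # 7,000x, matching "roughly 7,000 times faster"

# Quoted OpenAI token rates: 150 tok/s on GPUs vs 2,000 tok/s on Cerebras.
gpu_tok_s, cerebras_tok_s = 150, 2000
print(f"token-rate speedup: {cerebras_tok_s / gpu_tok_s:.1f}x")
```

The 7,000x figure in the thread is consistent with its own inputs; the token-rate gain (~13x) is much smaller, which is expected since end-to-end throughput depends on more than raw bandwidth.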
10
28
176
29.6K
OnchainDeFi reposted
Coin Bureau
Coin Bureau@coinbureau·
⚡️JUST IN: Senator Lummis tweets “Clarity is Coming” with an AI-generated laser-eyes gaze.
18
38
463
18K
OnchainDeFi reposted
Michael | Hypermarkets
Michael | Hypermarkets@itsmichaelluu·
In 2026, $NVDA put $15,000,000,000 in 7 AI super companies so far. Here's exactly why they partnered with these companies:

$MRVL Marvell → AI scaling is now bottlenecked by networking and custom silicon. Marvell solves critical data movement problems between GPUs, memory, and hyperscale systems.

$LITE Lumentum → As GPU clusters grow, optical interconnect demand explodes. Lumentum's photonics technology helps AI systems transfer massive amounts of data with lower latency and power consumption.

$COHR Coherent → Coherent enables next-generation optical communications and advanced semiconductor materials. AI factories cannot scale efficiently without faster optical transmission technologies.

$GLW Corning → AI data centers need massive fiber density and ultra-fast optical connectivity. Corning supplies the backbone glass and fiber infrastructure enabling hyperscale AI clusters to communicate at scale.

$IREN IREN Limited → AI requires enormous amounts of cheap power and scalable compute infrastructure. IREN gives NVIDIA exposure to next-generation AI data centers with energy advantages most competitors can't replicate.

$CRWV CoreWeave → CoreWeave became one of NVIDIA's most important AI cloud partners because it aggressively buys NVDA GPUs faster than traditional hyperscalers. This expands NVIDIA's AI compute dominance globally.

$NBIS Nebius → Nebius is building AI-native cloud infrastructure optimized specifically for high-performance training and inference workloads. NVIDIA benefits from another rapidly scaling ecosystem partner consuming massive GPU capacity.

3 extremely likely future NVDA partnerships:

1. $ANET Arista Networks → AI networking demand is exploding. NVIDIA + Arista together could dominate hyperscale AI traffic infrastructure.

2. $VRT Vertiv → AI data centers are running into cooling and power bottlenecks. Vertiv solves one of the biggest scaling problems for AI globally.

3. $ETN Eaton → AI power demand is becoming the next trillion-dollar bottleneck. Eaton provides the electrical infrastructure needed to sustain hyperscale AI expansion.

They're building the entire AI empire infrastructure stack from silicon → photonics → networking → power → data centers.
29
253
1.1K
91K
OnchainDeFi reposted
binji
binji@binji_x·
PRIVACY IS MORE THAN MONEY

This is what the Kohaku project is, straight from its core contributors, @VitalikButerin and @ncsgy. TLDR: it is a serious attempt to rethink online safety and privacy across money, AI, and life. Let's dive in.
91
88
543
108.6K
OnchainDeFi reposted
Coin Bureau
Coin Bureau@coinbureau·
🔥 "CLARITY IS CLOSER THAN EVER. MARK IT UP!"

Coinbase CEO Brian Armstrong says the latest draft of the CLARITY Act is stronger than ever, with key issues from earlier versions now addressed as momentum builds around the bill.
40
32
254
14.7K
OnchainDeFi reposted
Alex Becker 🍊🏆🥇
Okay. I'm ready to talk about this. It was the worst month of my life. Also, ironically, the greatest blessing god has ever given me. Last month I was held in the Cayman Islands facing 15 years in prison. The charge: illegal firearm importation. Here's what happened, and more importantly, what I learned.

Short answer: no, I haven't been smuggling guns. In the States I legally carry a gun on me at almost all times for self defense. Part of this is ensuring I am trained, hence why I routinely go to the range to shoot. When I do, I pack the firearm I intend to use in a backpack. Last month I was in a giant rush to make a private flight and didn't fully check my backpack before leaving. In it was a small firearm I missed. It was discovered when I went through immigration.

At first I assumed I'd just be sent home. Then my wife did some quick research. She pointed out the minimum sentence for importing a gun is 15 years. The police who showed up confirmed it. To say I nearly pissed my pants is an understatement.

This was completely my fault. I'm an idiot. The point of this post isn't to blame or complain about anything. The laws there are fair. I'm a grown man capable of checking his bag before flying. The point is: for three weeks on the island (on bail), I got to take a long hard look at my life.

I've built a high net worth and a company I love, with people I love working with. I have a beautiful wife who is my best friend. I do whatever I want all day every day. My parents are alive and I get to see them almost every week. Still, despite all this, I often wake up annoyed I haven't done enough with my life, asking myself "is this it?" In fact I'm pissed half the time, feeling I can do better. Which is ironic. I made $20,000 a year in the military. If you'd told me then I'd achieve a 9-figure net worth and all the above, I would've assumed I'd consider my life a dream.

The twist truly hit me on the island as I watched everything I worked hard for in my life held at "gunpoint". Pun intended. Everything I worked so hard to get — poof. Didn't matter for shit. The way the law works there is simple: if you can't prove it was an accident, the minimum is 15 years.

It became glaringly obvious. Not only was I an absolute idiot who couldn't pack his own bag. I'd also become a fool who couldn't enjoy the blessings I already had. I'd taken all the people in my life and the success totally for granted. Blind. Blind. Blind. Nothing like a 20-year potential sentence to make you realize: waking up with fun stuff to work on, then chilling on the couch reading with your wife at the end of the day — that's about as good as it gets. I should be euphoric 24/7. To go from having it all, to potentially not even having the option to piss and shit when you want — that's a wake-up call if there ever was one.

Luckily, the Caymans is a fair place. I was found under exceptional circumstances during my trial. AKA the judge and the courts reviewed the case and agreed it was an accident. I still love the island. It's probably my favorite place to vacation. Just check your luggage before you go. Ha.

My point is this: be present. Enjoy your life. One day something could happen — even by complete accident — and yoink it all away. I have so many friends who'll read this and by all definition live a "dream life" — and yet are dissatisfied just like I was. If anything this is the default for most successful men. Not the exception. I'm writing this to help you stop.

It took god slapping me across the face with my own ignorance to see it. It was painful and scary. Dark. But honestly, it was the greatest blessing I've ever received. I'm writing this from my office at home, giddy as absolute fuck about my life and everything I have the option to do today. If anything, I'm sad about how much time I wasted feeling otherwise. Don't be ignorant and stupid like me. You might not get the blessing of a 15-year prison threat in a foreign country to wake you up. Wake up. Appreciate what you have now.
1.2K
559
9.6K
614.8K
OnchainDeFi reposted
Milk Road AI
Milk Road AI@MilkRoadAI·
Jensen Huang has been publicly endorsing Nebius for months, and today Nebius proved he was right (Save this).

This morning Nebius reported Q1 2026 earnings and they were a full-blown statement. Revenue came in at $399 million, up 684% year over year, against a consensus estimate of $388 million. Adjusted EPS came in at $2.11 against Wall Street's estimate of negative $0.78. Adjusted EBITDA margin in the AI cloud business nearly doubled quarter on quarter to 45%. Contracted capacity now exceeds 3.5 GW, surpassing their 3 GW target, and they've raised guidance to more than 4 GW by year-end 2026. Full-year revenue guidance is $3.0–3.4 billion and full-year ARR guidance has been reaffirmed at $7–9 billion. $6.3 billion in capital has been secured, including convertible notes and a direct equity investment from Nvidia itself.

The growth acceleration is the part nobody expected. Q4 2025 revenue grew 547% year over year while Q1 2026 grew 684%, and that rate is accelerating. Nebius went from $50.9 million in revenue one year ago to $399 million today. Wall Street analysts currently project revenue growing from $530 million at year-end 2025 to $9.7 billion by the end of 2027, nearly a 20x increase in two years.

The company is also executing aggressively on physical infrastructure. Yesterday, Nebius announced it has secured 1.2 GW of power and land for a new owned AI factory in Pennsylvania, bringing its total number of sites exceeding 100 MW to seven. It also recently announced a 310 MW AI factory in Lappeenranta, Finland, projected to be one of Europe's largest dedicated AI facilities, and has supply agreements totaling over $40 billion with Microsoft and Meta. The company plans to go from 7 operational data centers in 2025 to 16 by end of 2026, and this is what Jensen saw when he said "Nebius will take care of you."

The Milk Road Pro analysts have had Nebius on our radar since the early days, and it's been one of the clearest pure-play AI infrastructure stories in the public markets. Today's earnings just validated everything, and our position is now up massively since we first added it. This is exactly why you should come join Milk Road Pro. Link below!
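The growth arithmetic above can be checked against the post's own numbers in a couple of lines (figures as quoted, not independently verified):

```python
# Sanity-check the Nebius growth figures quoted above (as quoted, not verified).
rev_q1_2025 = 50.9e6   # revenue one year ago
rev_q1_2026 = 399e6    # Q1 2026 revenue
yoy_pct = (rev_q1_2026 / rev_q1_2025 - 1) * 100
print(f"YoY growth: {yoy_pct:.0f}%")  # ~684%, matching the reported figure

rev_2025 = 530e6       # projected year-end 2025 revenue
rev_2027 = 9.7e9       # projected end-of-2027 revenue
print(f"two-year multiple: {rev_2027 / rev_2025:.1f}x")  # ~18x, i.e. "nearly a 20x increase"
```

The 684% figure is internally consistent; the 2025 → 2027 projection works out to roughly 18x, so "nearly a 20x increase" is a rounded-up characterization.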
Milk Road AI@MilkRoadAI

The Nebius CEO just confirmed what the entire AI industry already knows: there is not enough compute, and the companies that don't own their own capacity simply cannot grow as fast as they could. Data center GPU lead times are running 36 to 52 weeks, and inference workloads alone will account for two thirds of all AI compute demand in 2026, up from one third in 2023. The AI data center GPU market is projected to grow from $111 billion today to $323 billion by 2030, a 23.8% CAGR. And even at current scale, the most optimized AI data centers are running at 60–85% GPU utilization, meaning demand is already pressing hard against supply.

Nebius isn't waiting for someone else to build the capacity; Arkady Volozh is doing it himself. The company exited 2025 with $1.25 billion in ARR, 503% year-over-year growth, and is guiding for $7–9 billion ARR by end of 2026. Nine new data centers announced and over 2 gigawatts of contracted power secured, with plans to exceed 3 gigawatts. A $27 billion, 5-year infrastructure deal with Meta to serve GPU-dense workloads. Sales pipeline on track to exceed $4 billion in early 2026, with new contract durations extending by 50%.

And then the Eigen AI acquisition changed the thesis entirely. For $643 million, Nebius bought one of the most advanced inference optimization stacks in existence: system-level, model-level, and kernel-level techniques that deliver higher throughput and lower latency without adding complexity for customers. Volozh said it directly: they're not just selling bare metal or renting GPUs by the hour but rather building the full inference stack. That's where the next layer of margin lives.

Milk Road Pro remains massively bullish on Nebius and the neocloud trade. Milk Road Pro analysts have held large positions and are up significantly; this thesis keeps compounding with every data point Arkady puts out. Earnings drop May 13. Come join us at Milk Road Pro before then, link below.

5
14
109
36.8K
OnchainDeFi reposted
Kyle Reidhead | Milk Road
Kyle Reidhead | Milk Road@KyleReidhead·
The AI bull market is moving in phases --> it started with hyperscalers and data centres --> then compute --> now it's memory

the next bottleneck will be (already is) energy, but not from the grid, it will be on-site energy and batteries

after that? it's time for robotics and then space

this bull market isn't ending anytime soon, it's just going to keep evolving from one bottleneck to the next, from one innovative solution to the next

staying ahead of the phases is the most important thing

that's what the analysts at Milk Road PRO are for. you can track their trades and portfolios live and read their research whenever you want for just $1 (link in bio)

don't try and take on this bull market alone, let us help
Kyle Reidhead | Milk Road@KyleReidhead

we built a market intelligence platform that gives our analysts super powers

they were early to $AMD
they were early to $MU
they were early to $BE

and they're already buying into a new sector no one is talking about👀

you can follow their portfolios for $1 at Milk Road PRO

why try to navigate this bull market alone? this is one of the greatest wealth creation opportunities of our lifetime. don't mess this up

3
10
35
16.1K
OnchainDeFi reposted
Bankr
Bankr@bankrbot·
new in your bankr terminal: per-automation model selection. each scheduled automation can run on its own model. opus for the heavy 9AM research task, gemini flash for the hourly check. 20 concurrent automations for $BNKR club now live @ bankr.bot/terminal
34
19
106
10K
OnchainDeFi reposted
Coin Bureau
Coin Bureau@coinbureau·
🚨 CLARITY ACT BRINGS STABLECOIN REWARDS FIGHT THIS WEEK IN THE SENATE

The Senate Banking Committee has scheduled a CLARITY Act markup hearing for Thursday, May 14. Lawmakers appear ready to move forward with a compromise on stablecoin rewards. The deal would ban yield on static stablecoin reserves, but still allow rewards tied to stablecoin activity. Banks still have concerns with this compromise, but the hearing suggests the bill may move forward anyway.
80
150
925
60.8K
OnchainDeFi reposted
Milk Road AI
Milk Road AI@MilkRoadAI·
The All In Pod just framed the Anthropic-SpaceX compute deal better than anyone else has. Everyone focused on the fact that Anthropic, a company Elon Musk has publicly called evil and misanthropic, is now renting his data center. But look at the structure Elon built underneath it.

Shaun Maguire's tweet says it: SpaceX isn't a rocket company but rather a five-layer platform play: Launch, Connectivity, Compute, Applications, and Other Bets. Layer 3 is where this deal lives, and it changes everything about how you value the whole stack.

Here's what the infrastructure actually looks like. Colossus 1 in Memphis, the one just leased to Anthropic, has 300+ megawatts and 220,000 GPUs across H100, H200, and GB200. Macro-Hard, known internally as Colossus 2, is targeting 1 gigawatt of capacity with 550,000 Blackwell GPUs. Macro-Harder, an 810,000-square-foot former warehouse in Southaven, Mississippi, pushes the total campus toward 2 gigawatts and over 1 million GPUs. He gave Anthropic the older cluster, the H100s, great for inference, less critical for next-gen training, while he kept the Blackwell capacity for Grok-5. He monetized the sunk cost without giving away the crown jewel.

The All-In read is that this deal generates an incremental $4–5 billion of revenue this year alone, on top of analyst estimates already in the mid-20s. That's the answer to the xAI valuation question that has hung over every roadshow conversation. Elon wasn't building an AI lab with expensive infrastructure but rather building EWS, Elon Web Services. The Anthropic deal is just the first customer announcement.
18
99
936
299.9K
OnchainDeFi reposted
Elton
Elton@eltoniselton·
updated hyperliquid:native valuation using onchain data via mcp
31
60
582
68.9K
OnchainDeFi reposted
OpenAI
OpenAI@OpenAI·
Introducing GPT-Realtime-2 in the API: our most intelligent voice model yet, bringing GPT-5-class reasoning to voice agents. Voice agents are now real-time collaborators that can listen, reason, and solve complex problems as conversations unfold. Now available in the API alongside streaming models GPT-Realtime-Translate and GPT-Realtime-Whisper — a new set of audio capabilities for the next generation of voice interfaces.
689
1.4K
14.8K
3.5M
OnchainDeFi reposted
Eleanor Terrett
Eleanor Terrett@EleanorTerrett·
🚨SCOOP: The Senate Banking Committee is preparing to notice a markup for the Clarity Act as soon as tomorrow and has circulated draft legislative text to select industry members ahead of a potential Thursday vote, according to multiple industry sources who have seen the text. The language is reportedly still being finalized, with additional edits expected to reflect priorities from Democratic offices. One source told me the overall vibes after reviewing the bill and coordinating with other industry leaders are positive so far, though some bracketed sections are raising concerns that key provisions previously thought to be settled may still be in flux.
149
874
4.4K
750.9K
OnchainDeFi reposted
Milk Road AI
Milk Road AI@MilkRoadAI·
The most in-demand AI product in the world just ran out of compute because it grew 80x in a single quarter (Save this).

Anthropic planned for 10x growth but they got 80x. That single stat is why Claude has had rate limits, wait times, and capacity constraints. You cannot build data centers fast enough when demand is outpacing your most aggressive forecast by 8x, so Dario went out and signed every compute deal available.

Amazon: a $100 billion, 10-year commitment to AWS securing up to 5 gigawatts of new capacity, including next-generation Trainium chips plus international expansion across Europe and Asia. Broadcom and Google: a long-term partnership to develop custom AI chips on Google TPUs, giving Anthropic diversified silicon at scale. And now SpaceX, which is giving access to all of Colossus 1 in Memphis: 220,000 NVIDIA GPUs and 300 megawatts of capacity, with a longer-term vision to develop multiple gigawatts of orbital compute capacity together.

That last one matters more than people realize. Elon didn't give Colossus access to just anyone. Anthropic is now running at a $30 billion annual rate, with 70% of the Fortune 100 on Claude and 1,000+ enterprise customers spending over $1 million per year. SpaceX moved its own training workloads to Colossus 2 specifically to free up Colossus 1 for commercial leasing, and Anthropic was first in line. "We're going to keep trying to obtain even more compute to pass on to you. We're sorry if it takes some time but we're going to acquire as much as we can."

The compute race isn't slowing down but rather accelerating. And every company supplying the infrastructure, NVIDIA, Amazon, Broadcom, SpaceX, just got a massive new tailwind from the lab that no one expected to win. The Milk Road Pro portfolio was built for exactly this tailwind, and that's why some of our positions are up over 100% this month. If you want the full thesis, check out Milk Road Pro, link below.
Milk Road AI@MilkRoadAI

Chamath called this shot almost word for word. Before the Cursor deal, before the Anthropic agreement, before anyone in mainstream finance was talking about SpaceXAI as a compute provider, @Chamath laid out exactly what was about to happen.

There is a catastrophic mismatch between the gigawatts of AI compute that have been announced and what is actually being built, and less than half of it is under real construction. The rest is stuck in permitting and grid backlogs with no credible path to coming online. That supply crunch would hurt Anthropic and OpenAI the most, two frontier labs with enormous demand and no owned infrastructure, and it would create a massive lane for SpaceX to run through, because Colossus has excess capacity right now that every compute-starved lab desperately needs.

"The Cursor deal was the appetizer. He and Dario should do a deal tomorrow." And they just did. SpaceXAI has signed an agreement with Anthropic to provide access to Colossus 1: over 220,000 NVIDIA GPUs, including H100, H200, and GB200 accelerators. Anthropic will use the compute to directly improve capacity for Claude Pro and Claude Max subscribers, and buried in the agreement is the part that signals where this is all heading. Anthropic expressed interest in partnering with SpaceX to develop multiple gigawatts of orbital AI compute capacity. Every terrestrial constraint, land, grid capacity, cooling, permitting, disappears in orbit. SpaceX is the only organization on Earth with the launch economics and constellation experience to make space-based compute a near-term engineering program rather than a research concept.

Chamath told you to follow the dollars flowing out of the hyperscalers and buy the companies receiving them, and SpaceXAI just became the clearest example of that thesis playing out in real time.

16
38
174
80.9K
OnchainDeFi reposted
Milk Road AI
Milk Road AI@MilkRoadAI·
This is WILD! Elon Musk just turned Colossus into one of the most consequential compute platforms in AI, and the latest deal proves it. SpaceXAI has signed an agreement with Anthropic to provide access to Colossus 1, one of the world's largest and fastest-deployed AI supercomputers. Built in 122 days in Memphis, Colossus 1 houses over 220,000 NVIDIA GPUs, including H100, H200, and next-generation GB200 accelerators, and delivers extreme parallel performance for training, fine-tuning, and inference at frontier scale. Anthropic plans to use the additional compute to directly improve capacity for Claude Pro and Claude Max subscribers.

Here's the deeper story. Anthropic already committed more than $100 billion over the next decade to AWS and is burning through compute at a rate that no single provider can fully satisfy on the timelines that matter for staying at the frontier. The compute required to train and operate the next generation of these systems is outpacing what terrestrial power, land, and cooling can deliver. That supply constraint is forcing Anthropic to diversify aggressively, and Colossus is exactly the kind of immediately available, proven-at-scale infrastructure they need right now while their long-term AWS capacity comes fully online.

But the most forward-looking piece of this agreement isn't what's happening on the ground but rather what they're planning above it. As part of the deal, Anthropic expressed interest in partnering with SpaceX to develop multiple gigawatts of orbital AI compute capacity. SpaceX is the only organization on Earth with the launch cadence, the mass-to-orbit economics, and the constellation operations experience to make space-based compute a near-term engineering program rather than a research concept. If the engineering challenges can be solved, thermal management in vacuum, radiation hardening, high-bandwidth laser interconnects back to Earth, orbital compute offers near-limitless sustainable power with zero land constraints and dramatically less environmental impact. Starlink already demonstrates that SpaceX can operate and maintain a constellation of thousands of interconnected nodes at scale. The leap from communications satellite to compute satellite is large, but SpaceX is the only entity positioned to attempt it.

The broader picture here is that SpaceXAI is no longer just a rocket company or an AI model company but is rapidly becoming the most vertically integrated compute provider in the world. Colossus is already renting capacity to Cursor, Anthropic, and others while SpaceX prepares for its IPO at a reported $1.75 trillion valuation. Every deal like this adds another revenue layer to Colossus, increases utilization on infrastructure that exists regardless of whether it's rented, and builds the commercial case for orbital compute as the next generation of the buildout. The AI infrastructure race has moved off the grid, off the land, and now, with this agreement, officially off the planet.
19
34
139
226.7K
OnchainDeFi reposted
DefiLlama.com
DefiLlama.com@DefiLlama·
We compiled 30 prompts to turn LlamaAI into your personal crypto analyst. Find undervalued tokens, stress-test trade ideas, and run deep research. We ran them all. Every result is below in this thread. Steal them.
35
67
643
98.8K