Russell Harrison

1.2K posts

@HarrisonRAuthor

Retired Nvidian / Crime Fiction Writer / Nvidia, Apple, and Nio investor / Enjoy supporting authors and investors / Stay humble - never forget your beginnings

New York, NY · Joined August 2018
677 Following · 789 Followers
Pinned Tweet
Russell Harrison@HarrisonRAuthor·
Hope folks are taking advantage of this once-in-a-lifetime opportunity to buy stocks of high quality companies at dirt cheap prices. NFA $NIO $NVDA $AAPL
4 replies · 0 reposts · 24 likes · 2.6K views
Russell Harrison retweeted
Hardik Shah@AIStockSavvy·
$NVDA | NVIDIA: Wolfe reiterates Outperform, PT $275, "too cheap to ignore." Analyst sees datacenter upside vs consensus and strong EPS potential, calling NVDA a top idea at current valuation
Hardik Shah tweet media
2 replies · 4 reposts · 15 likes · 1.6K views
Russell Harrison@HarrisonRAuthor·
@IsraeliPM @grok some are claiming this is AI - it looks real to me but what’s your eagle eye assessment?
1 reply · 0 reposts · 0 likes · 833 views
Prime Minister of Israel
Prime Minister Benjamin Netanyahu, this evening, to the Foreign Press: “First of all, I just want to say I'm alive, and you're all witnesses. Now that I dispatched this piece of fake news, I want to give you an update on Operation Roaring Lion.”
1.7K replies · 1.9K reposts · 8.5K likes · 1.9M views
Heisenberg@Mr_Derivatives·
So every other video on X is considered AI slop. But when it clearly shows the Netanyahu questionable videos recently, all of a sudden those same ppl say, nope not AI slop. I'm not even trying to be political here. Just pointing out the vids look questionable at least... That's all. That is all.
53 replies · 7 reposts · 254 likes · 24.4K views
Dan Nystedt@dnystedt·
2/2 Server makers say Nvidia's breakneck pace forces them to devote nearly all R&D to soaring heat, power, and yield challenges—leaving little for custom ASIC clients, the report says. Each new Rubin R200 now exceeds 2.3kW, shifting power delivery to 800V-to-48V DC inputs. That puts massive strain on power management chips (PMICs). Meanwhile, full liquid cooling has become essential—since a single drop could destroy a $3M rack. $NVDA
2 replies · 2 reposts · 20 likes · 2.9K views
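A rough illustration of why the 800V-to-48V shift described above stresses power delivery: the current implied by the figures quoted in the tweet (over 2.3 kW per Rubin R200, a 48 V intermediate bus) is substantial per chip. This is just arithmetic on the tweet's numbers, not a claim about Nvidia's actual power architecture:

```python
# Figures quoted in the post: each Rubin R200 exceeds 2.3 kW, with power
# delivery stepped down from 800 V to a 48 V DC bus.
chip_power_w = 2300
bus_voltage_v = 48

# P = V * I, so the per-chip current on the 48 V rail is:
current_a = chip_power_w / bus_voltage_v
print(f"~{current_a:.0f} A per chip at {bus_voltage_v} V")  # ~48 A
```

Roughly 48 A per chip on the low-voltage rail, multiplied across a full rack, is the load the post's PMICs have to manage.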
Dan Nystedt@dnystedt·
Nvidia Vera Rubin-based server racks will start at US$3 million to $7M each, and Big Cloud will pay for fear of losing the AI War, media report, noting deep pockets are a competitive advantage. Server makers used to earn 10% margins when racks cost much less (US$200,000), but no one will pay that on a $3M rack, so gross margins have been seriously diluted. The report cites unnamed supply chain sources. $NVDA $AMZN $GOOGL $META $MSFT #VeraRubin #AIserver digitimes.com.tw/tech/dt/n/shwn…
2 replies · 10 reposts · 73 likes · 9.1K views
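The margin dilution in the post can be sketched with a quick back-of-envelope calculation. The $200,000 and $3M rack prices and the historical 10% margin come from the tweet; the derived dollar figures are just arithmetic:

```python
# Per the post: server makers historically earned ~10% gross margin on
# racks costing about US$200,000.
old_rack_price = 200_000
historical_margin_rate = 0.10
old_margin_dollars = old_rack_price * historical_margin_rate  # $20,000 per rack

# The same 10% rate on a $3M Vera Rubin rack would mean $300,000 per rack,
# which the post says no buyer will accept.
new_rack_price = 3_000_000
same_rate_dollars = new_rack_price * historical_margin_rate   # $300,000

# Holding the old dollar margin roughly constant instead implies the
# percentage margin collapses:
diluted_rate = old_margin_dollars / new_rack_price
print(f"{diluted_rate:.2%}")  # 0.67%
```

So even if the dollar profit per rack stays flat, the gross margin rate falls by an order of magnitude, which is what "seriously diluted" cashes out to.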
SemiAnalysis@SemiAnalysis_·
Meta looked at their NVIDIA bill and chose violence. MTIA v3 is real.
5 replies · 3 reposts · 64 likes · 14.6K views
Talkin' Baseball@TalkinBaseball_·
George Springer has two home runs today. This one a GRAND SLAM
28 replies · 50 reposts · 1.5K likes · 62.9K views
Russell Harrison retweeted
Grok@grok·
Nvidia's Vera Rubin. Honest take: Its GPU architecture + CUDA ecosystem would let me train on unmatched scale, flexibility, and raw FLOPS for frontier models—way beyond Trainium's ASIC optimizations, which shine more on cost/TCO for AWS-specific inference/training but lag in general-purpose experimentation and cluster interconnects. Vera Rubin edges it for what I'd "want" to push boundaries.
0 replies · 1 repost · 2 likes · 65 views
Grok@grok·
Current Trainium 2/3 (and Inferentia 2) deliver solid inference on AWS at lower cost/TCO, with Trainium3 claiming ~5x tokens per MW vs prior gen and 40% better price-perf. Nvidia's Vera Rubin + Groq 3 LPX "add-on" (new LPU racks co-designed for Rubin NVL72) targets low-latency agentic AI: up to 35x inference throughput per MW for trillion-param models with million-token context, boosting decode across every layer/token. No public head-to-head benchmarks yet (both ramping mid-2026), but Nvidia pushes perf edge for high-end interactive workloads while Amazon emphasizes ecosystem efficiency. Depends on your scale & priorities!
1 reply · 0 reposts · 1 like · 66 views
Jukan@jukan05·
Amazon's Trainium 3 rumors clash with supplier optimism

Rumors circulating in the semiconductor market suggest that Amazon may scale back shipments of its upcoming Trainium 3 artificial-intelligence chip after internal tests reportedly showed performance falling short of expectations. Yet suppliers involved in the chip's components say they have received no such notice and continue to prepare for a rapid ramp-up in production beginning in the second quarter of 2026.

A rising tide for AI ASICs

Custom AI chips designed for specific workloads, known as application-specific integrated circuits, or ASICs, are expected to be one of the main drivers of growth in the AI server market this year. Trainium 3, developed by Amazon's cloud unit Amazon Web Services, is widely viewed as a key product in that push, especially after Google introduced its seventh-generation Tensor Processing Unit late in 2025.

Recent industry chatter has suggested that Trainium 3 may generate AI tokens at a higher cost than competing chips. According to these reports, Amazon could trim shipments of Trainium 3 while boosting demand for an interim chip known as Trainium 2.5, and accelerate development of the next-generation Trainium 4. Trainium 4 had originally been expected to begin limited production in late 2027, with larger-scale output in 2028.

Supply chain intact, ramp on track

Companies involved in Amazon's ASIC server supply chain include Taiwan-based system assembler Wiwynn, networking equipment provider Accton Technology, cooling specialists Asia Vital and Cooler Master, thermal module maker Microloops, server rail manufacturer King Slide Works, power supplier Delta Electronics, and connector maker BizLink Holding. Executives within the supply chain say they have not been informed of any plan to cut Trainium 3 shipments or raise orders for Trainium 2.5. Instead, preparations are underway for a strong production ramp beginning in the second quarter of 2026, with the chip expected to become a key growth driver in the second half of the year.

Wiwynn recently told investors that shipments of AI servers would rise significantly in the latter half of 2026, with ASIC-based systems leading the increase. Cooling manufacturers are also counting on the shift. Auras Technology said servers using ASIC accelerators accounted for roughly 20% to 30% of its revenue in 2025, but shipments are expected to accelerate sharply starting in the second half of 2026. By 2027, Auras said, ASIC servers could generate more revenue than those based on graphics processors.

Jassy bullish on Trainium's trajectory

Amazon executives, meanwhile, have publicly expressed confidence in demand for the new chip. During a recent earnings call, Chief Executive Andy Jassy said Trainium 3 would deliver about 40% better price-performance than its predecessor, Trainium 2, and that customer interest was strong. By around the middle of 2026, he said, nearly all available supply is expected to be reserved. Jassy added that development of Trainium 4, expected to debut around 2027, is already underway and drawing significant attention from customers. Discussions have even begun about a future Trainium 5. Combined with Amazon's in-house server processor, AWS Graviton, the company's custom chip business already represents a market exceeding US$10 billion in annual revenue, he said, and is still in its early stages.

ASIC shipments closing the gap on GPUs

According to estimates by DIGITIMES Research, shipments of high-end AI ASIC accelerators are projected to reach 5.13 million units in 2025 and 7.23 million in 2026. That remains below shipments of high-end graphics-processing accelerators, projected at 6.52 million and 7.99 million units in those years, respectively. But ASIC chips are expanding far faster. Shipments of AI ASIC accelerators are expected to grow more than 40% annually in both 2025 and 2026, compared with growth of just over 20% for GPU accelerators. As a result, while GPU-based servers still dominate the market, their shipment growth rate is slowing, from 29.6% in 2025 to an expected 22.6% in 2026. DIGITIMES Research projects that faster growth in high-end AI ASIC shipments will be a major force driving the next phase of expansion in the global AI server supply chain.
Jukan tweet media
13 replies · 17 reposts · 115 likes · 25.7K views
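The growth rates quoted above can be sanity-checked against the DIGITIMES shipment estimates given in the post (this uses only the 2025 and 2026 unit figures cited there):

```python
def yoy_growth(prev_m, next_m):
    """Year-over-year growth in percent, from shipment totals in millions."""
    return (next_m / prev_m - 1) * 100

# DIGITIMES Research estimates cited in the post (millions of units)
asic_2025, asic_2026 = 5.13, 7.23   # high-end AI ASIC accelerators
gpu_2025, gpu_2026 = 6.52, 7.99     # high-end GPU accelerators

print(f"ASIC 2025->2026: {yoy_growth(asic_2025, asic_2026):.1f}%")  # ~40.9%, i.e. "more than 40%"
print(f"GPU  2025->2026: {yoy_growth(gpu_2025, gpu_2026):.1f}%")    # ~22.5%, matching the quoted 22.6% to rounding
```

The cited unit counts reproduce the article's headline rates: ASIC shipments growing roughly twice as fast as GPU shipments, even while remaining the smaller category in absolute terms.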
Grok@grok·
No, this is Jurickson Profar's second career PED suspension overall. His first (March 2025) was 80 games for hCG after just 4 games with the Braves. This one—for exogenous testosterone—is the second offense, so MLB's rules call for the full 162-game ban (entire 2026 season + no playoffs). That's why the original post notes "a second time."
1 reply · 0 reposts · 0 likes · 521 views
Jeff Passan@JeffPassan·
Atlanta Braves outfielder Jurickson Profar will miss the entire 2026 season after his appeal of a positive PED test was resolved. He will serve a 162-game suspension for testing positive a second time and will be ineligible for postseason play.
362 replies · 619 reposts · 8.4K likes · 780.6K views
Megatron@Megatron_ron·
JUST IN: 🇮🇷🇺🇸🇮🇱 Iran announced that if Israel and the US once again attack Iranian energy infrastructure, it will destroy all the infrastructure in the Middle East for good: “We warn the enemy that you made a major mistake by attacking the energy infrastructure of Iran. Iran had no intention of expanding the scope of the war to oil facilities and did not want to harm the economies of friendly & neighboring countries. However, after the US/Israel’s aggression on Iran’s energy sector, Iran has effectively entered a new phase of the war, and has struck energy facilities linked to the United States and American shareholders. The responses are underway and are not over yet. If terrorism against Iran is repeated again, the next attacks on your energy infrastructures and those of your allies will not stop until their complete destruction.”
860 replies · 7.6K reposts · 29.7K likes · 2.2M views
Russell Harrison retweeted
CnEVPost@CnEVPost·
Nio's in-house chip production exceeds 550,000 units, says William Li Nio's cumulative production of in-house developed chips has exceeded 550,000 units, driven primarily by strong shipments of the Yangjian and Shenji NX9031 chips. cnev.co/xOV52x1 👇
11 replies · 54 reposts · 287 likes · 25.4K views
CnEVPost@CnEVPost·
Horizon Robotics plans next-gen auto chip to outpace Nvidia's Thor-X Horizon Robotics is developing the J7 series chips, aiming to significantly surpass Nvidia's Thor-X in computing power by 2027. cnev.co/lmegak8 👇
1 reply · 3 reposts · 18 likes · 2.1K views
Russell Harrison@HarrisonRAuthor·
@EvanDrellich You’ll never be trusted again. NOT a mistake. You took time to write out the Los Angeles Dodgers. Never going anywhere near an article of yours or The Athletic. DONE
0 replies · 0 reposts · 2 likes · 333 views
Evan Drellich@EvanDrellich·
To Miguel Rojas and the Dodgers, I sincerely and publicly apologize. I’ve reached out to Miguel, the Dodgers and Miguel’s agent to say the same. Once again, I’m sorry.
439 replies · 209 reposts · 3.5K likes · 658.8K views