Hustler D

1.5K posts

Hustler D
@_hustlerdoc

MD + investor | retweets better than my own tweets

Europe · Joined July 2023
117 Following · 356 Followers
Pinned Tweet
Hustler D @_hustlerdoc
Front-load your investment, then, if successful, take profit equal to your entry and let the house money ride for life and compound! This massively de-risks you, frees up capital, and keeps the game going. LFG 🫡
2 replies · 0 retweets · 4 likes · 2.2K views
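The "house money" mechanic in the pinned tweet reduces to simple arithmetic: once the position exceeds the original stake, sell enough to recover that stake and let the remainder ride. A minimal sketch (all dollar figures are hypothetical, not from the tweet):

```python
def house_money_split(entry_cost, current_value):
    """After a winning trade, recover the front-loaded entry cost;
    whatever remains rides on as 'house money'."""
    if current_value <= entry_cost:
        return 0.0, current_value  # not yet profitable: take nothing off
    take_profit = entry_cost            # sell enough to recoup the stake
    house_money = current_value - take_profit
    return take_profit, house_money

# Hypothetical: a $10,000 entry that doubles to $20,000
recovered, riding = house_money_split(10_000, 20_000)
print(recovered, riding)  # 10000 10000 -- stake back out, profit rides
```

The de-risking claim follows directly: after the split, the worst case on the riding portion is losing money that was never part of the original capital.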
Hustler D @_hustlerdoc
Some chart musings: $AMPX, perfect bounce off the 21 EMA on the weekly, no concerns, great quarterly report. Holding, and will add again on any technical weakness.
Hustler D tweet media
0 replies · 0 retweets · 1 like · 23 views
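The 21 EMA referenced here (and in later posts) is the standard exponential moving average. A minimal sketch of how it is computed, in pure Python with no charting library:

```python
def ema(prices, period=21):
    """Exponential moving average: smoothing factor
    alpha = 2 / (period + 1), seeded with the first price,
    then each new value blends price with the previous EMA."""
    alpha = 2 / (period + 1)
    out = [float(prices[0])]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

# In an uptrend the EMA lags below price, which is why a "bounce
# off the 21 EMA" reads as support. Toy series, period=3:
print(ema([100, 102, 104, 106, 108], period=3)[-1])  # 106.125
```

Charting packages compute the same recursion; the `period=21` default mirrors the 21 EMA the posts refer to.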
ma26nu @man26uel
Silex Microsystems IPOed yesterday $SILEX. Beyond the stock’s rocket performance yesterday, this IPO uncovers far more: it’s a story about re-shoring of critical MEMS foundry capacity used in various AI and quantum apps. Whoever is interested in Silex: I used the word Silex 16 times in my article “The Glass Substrate Reality Check: Organic’s Reign, TSMC’s Bottleneck and the True AI Niche”, linked below, and outlined the Mission Apollo supply chain. There are very strong connections for Silex with Optical Circuit Switching (OCS), Lumentum, PlanOptik and many more. It’s not just another fab; it’s the biggest pure-play MEMS foundry worldwide and is sitting basically on a goldmine right now.
ma26nu tweet media
ma26nu @man26uel

The Glass Substrate Reality Check: Organic’s Reign, TSMC’s Bottleneck and the True AI Niche ⬇️ bit.ly/4b1m6Mc

2 replies · 8 retweets · 79 likes · 61.1K views
Hustler D @_hustlerdoc
Homework
dylan ツ @demian_ai

The geometry of thought.

Every LLM on earth can speak fluent English. None of them think in English.

I have been trying to find a way to explain this to a non-technical friend for about a year, and I have mostly failed, because the standard explanation requires the listener to picture an abstract space they have never seen. The breakthrough I finally landed on came from an old map.

In 1569, a Flemish cartographer named Gerardus Mercator published the projection of the world that bears his name. The Mercator projection takes the surface of a sphere and prints it on a flat rectangle, in a way that preserves angles but distorts areas. Greenland looks the size of Africa even though Africa is fourteen times larger. Antarctica becomes an enormous strip along the bottom of the map. The proportions of the world, in the Mercator projection, are confidently and consistently wrong.

We kept using it anyway, for four hundred years, because it has one priceless property. If you draw a straight line on a Mercator map, that line is a constant compass bearing. A captain in 1600 could plot a route from Lisbon to Recife with a ruler and a protractor and arrive somewhere close to where he intended. The Mercator projection is wrong about what the world looks like. It is right about how to navigate the world. We agreed, collectively, to lie about the shape of the planet in exchange for being able to find our way around it.

This is what LLMs do with thought. Inside any modern frontier model, concepts do not live as words. They live as positions in a very high-dimensional space, with a particular geometric structure. Goodfire's recent work, which is the clearest public demonstration of this, shows the shape directly. Colors form a different shape, more like a sphere. Spatial concepts curl into manifolds that match physical space. The concept of a car is a complicated multidimensional surface that connects, in geometrically meaningful ways, to the concepts of motion, of metal, of road, of journey. The model does not store these concepts as text. It stores them as geometry.

When you type a question to it, the model maps your words onto positions in this internal space. It then performs operations on the geometry, which produce new positions. Then, only at the very end, it translates those new positions back into English on the way out to your screen. The English is the Mercator projection. The geometry is the globe.

This sounds abstract until you realize what it implies for almost every interaction you have ever had with a model. Why does GPT sometimes give a brilliant answer in one phrasing and a mediocre one in another, even though both phrasings mean the same thing to a human reader? Because the two phrasings land on slightly different positions in the internal geometry, and the geometry near one position is richer than the geometry near the other. Why does a model sometimes confabulate confidently? Because the position it lands on has the geometric texture of an answer even though the answer it generates has no factual grounding. The shape of an answer and the truth of an answer are different things, and the model is trained on the shape.

Three implications follow from this, and they reach much further than most of the discourse about AI suggests.

1. For product builders. If you have ever wondered why the same model produces wildly different outputs on prompts that seem semantically identical, the answer is geometric. The most reliable way to improve model output is not to tinker with the words. It is to find the regions of geometric space where the model behaves well, and engineer your prompts to land you there. The best prompt engineers, without knowing it, are reverse-engineering the topology of the model's internal world. This is also why fine-tuning works better than prompting for many use cases. Fine-tuning literally reshapes the geometry. Prompting only steers within it.

2. For the safety and interpretability community, which has spent two years looking for circuits and individual neurons that correspond to specific concepts. That work has been valuable, but it was looking at the shadow on the wall. The actual structure is at the manifold level, not the neuron level. The next leap in interpretability is going to come from learning to edit the geometry directly, not by adjusting individual weights. We are about to move from steering the words to steering the shapes that produce the words. This will make some kinds of safety work much easier and other kinds much harder.

3. For everyone else, and it is the strangest one. The early evidence from neuroscience suggests that human thought may have the same kind of geometric structure. The hippocampus appears to encode spatial relationships on manifolds that look uncannily similar to what we see inside language models. Concept representation in the human cortex appears to be geometric in roughly the same sense. If this holds up, and the evidence is still preliminary, then the conventional framing of the difference between artificial and biological minds is wrong in an interesting way. It is not silicon versus carbon. It is two different physical substrates that have independently discovered the same mathematical language for representing the world.

We built something that thinks the way we think. We just never noticed, because we were too busy listening to it talk.

The Mercator projection is wrong about what the world looks like. It is right about how to move through it. The model is wrong about what thought looks like, in some technical sense. It is right about how to do thought, which is the only thing that has ever mattered.

0 replies · 0 retweets · 0 likes · 12 views
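The "positions in a high-dimensional space" idea from the quoted post can be made concrete with cosine similarity, the standard way to compare embedding vectors. A toy illustration: the four-dimensional vectors below are invented for the example (real models use embeddings with thousands of dimensions), but the pattern they show is the real one, with paraphrases landing near each other and unrelated text landing far away:

```python
import math

def cosine(a, b):
    """Cosine similarity: the angle between two embedding vectors,
    1.0 for identical directions, near 0 for unrelated ones."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical 4-d 'embeddings' for three inputs:
phrase_a  = [0.9, 0.1, 0.3, 0.0]   # "What causes rain?"
phrase_b  = [0.8, 0.2, 0.3, 0.1]   # "Why does it rain?" -- near, not identical
unrelated = [0.0, 0.9, 0.0, 0.8]   # "Best pizza toppings"

# Paraphrases sit close in the space; unrelated text sits far away.
print(cosine(phrase_a, phrase_b) > cosine(phrase_a, unrelated))  # True
```

This is also why two paraphrases can yield different answers: close is not identical, and the model's behavior depends on the exact position each phrasing lands on.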
Hustler D @_hustlerdoc
@Sol_Sponge The SMART Modular memory business is proprietary: they manufacture DDR5 RDIMMs and MRDIMMs. That is IP-owning manufacturing. The AMD CPU server business is a deep strategic partnership with a decade of history and direct pull-through from AMD's own sales motion.
1 reply · 0 retweets · 0 likes · 18 views
Hustler D @_hustlerdoc
@Sol_Sponge Bro, there is so much more here. This makes it look like $PENG has thin integrator margins and no proprietary IP, and that one should follow ALAB instead, whilst it actually has a second revenue engine that doesn't depend on ALAB at all.
1 reply · 0 retweets · 0 likes · 81 views
Sponge @Sol_Sponge
AI has a memory problem. $PENG has the solution. And it’s only a matter of time before the price reflects that. Every long AI conversation builds a “KV cache” inside the GPU. Long docs, agent tasks, big context windows — the cache overflows fast. When it overflows, GPUs recompute, stall, or refuse the request. That’s why inference is so expensive.🧵
Sponge tweet media
3 replies · 2 retweets · 15 likes · 2.2K views
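A back-of-envelope sketch of why the KV cache in the thread above overflows fast: keys and values are stored for every layer, every KV head, and every token in the context, so memory grows linearly with context length. The model dimensions below are hypothetical (roughly a Llama-class configuration, not taken from the thread):

```python
def kv_cache_bytes(seq_len, n_layers=32, n_kv_heads=8,
                   head_dim=128, bytes_per_elem=2):
    """Per-sequence KV-cache footprint: a factor of 2 for keys plus
    values, times layers, KV heads, head dimension, tokens, and
    bytes per element (2 for fp16/bf16)."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

gb = kv_cache_bytes(128_000) / 1e9
print(f"{gb:.1f} GB for a 128K-token context")  # 16.8 GB
```

At these assumed dimensions a single 128K-token conversation consumes roughly 16.8 GB, which is most of one 24 GB GPU before weights are even loaded; that is the overflow-then-recompute pressure the thread describes.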
Hustler D retweeted
🐧 @pennycheck
$AMPX Small miss on EPS, not concerning; the revenue beat was the key for me. Will have thoughts after tomorrow AM's earnings call. What I personally will be looking for on the call: most important for me, commentary on demand and mix of business from US drone primes. Color on the timeline of moving the NDAA-compliant components into production at scale. Also curious if there's any update on the Amazon accelerator program, and if they're ramping production with a certain unnamed leading eVTOL company in California.
🐧 tweet media
8 replies · 2 retweets · 106 likes · 18.5K views
Hustler D retweeted
루팡 @DrNHJ
Glass substrate stocks hit limit-up across the board on news that Intel, considered a leader in glass substrate technology standardization, is holding cooperation discussions with Apple. One of the reasons Apple is said to have chosen Intel as a partner is Intel’s next-generation glass substrate packaging technology, “EMIB.” hankyung.com/article/202605…
7 replies · 5 retweets · 66 likes · 16.9K views
Hustler D retweeted
Nicolas @Superioresearch
Trade idea: 2-3% of port on 2x levered $SMCI $SMCX headed into earnings later today
1 reply · 1 retweet · 19 likes · 4.4K views
peony @peonyKingOF
I'll buy $INTT in the 16s all day
GIF
3 replies · 0 retweets · 13 likes · 1.6K views
Hustler D @_hustlerdoc
@ParadisLabs Nice to know not everyone who has edge/alpha is in New York, the States, or Korea. I’m based in London too!
2 replies · 0 retweets · 1 like · 317 views
Paradis Labs @ParadisLabs
Episil-Precision keeps going up 10% every day lol. +112% since my original thesis (the only Episil-Precision investment thesis on X, btw)
Paradis Labs tweet media
Paradis Labs @ParadisLabs

Episil-Precision. A Taiwanese pure-play supplier of epitaxial processes inside the global power bottleneck. At less than $500M market cap?!

$TSM relies on them for high yields... Even NASA engineers rely on them for GaN devices... Episil-Precision is recognized as the only player in Taiwan able to mass-produce both SiC & GaN epitaxial wafers simultaneously, acting as a highly convenient one-stop shop for foundries.

The AI power constraint
-----
Data centers need gigawatt-scale loads. But modern compute clusters throw up power swings that destabilize traditional grids. EVs demand semis that handle high voltages and temps without performance loss. Traditional silicon has hit its physical threshold. The solution lies in wide-bandgap (WBG) materials:
> Silicon Carbide (SiC)
> Gallium Nitride (GaN)
These operate at 600V-1,200V+ with minimal energy loss. However, their adoption is restricted by manufacturing complexities in the initial growth phase, known as epitaxy. This is where Episil-Precision operates as a mission-critical asset.

Episil's position in the supply chain
-----
Episil-Precision is a pure-play supplier of epitaxial processes. It acts as the bridge between raw-material substrate vendors and device fabricators. Instead of logic design or volume foundries, Episil handles the front-end material engineering that transforms raw wafers into active electronic platforms. By depositing a crystal layer on a substrate, Episil defines a chip's thickness, resistivity, and dopant profiles. Companies like $NVDA and even NASA engineers rely on Episil to de-risk their operations, since this layer determines the final yield of subsequent fab steps.

Filling the epitaxial bottleneck
-----
The transition to WBG materials has created a huge supply-chain bottleneck. Growing GaN or SiC on supporting substrates produces intense mechanical stress due to lattice mismatches. At temperatures >1,000°C, this stress causes the wafer to warp, creating cracks or current collapse. Episil-Precision fills this bottleneck via patented nitride structures that grow alternating buffer layers to absorb mechanical stress. Engineers track structural stress second-by-second in real time. This ensures that after cooling, the wafer returns to a flat state free from bowing. Without these material capabilities, downstream foundries like $TSM would face sh!t yields, stalling global electrification.

Their technological moat
-----
Episil's competitive moat is built on 25+ years of institutional process knowledge. Epitaxy is not a commodity; recipes for gas flows and temperature gradients require years of trial and error to perfect. The relationship with clients features high switching costs. In sectors like automotive, qualification cycles take over a year. Once Episil achieves IATF 16949 certification, clients are locked into the supply chain to avoid liability risks. Episil is also the only player in Taiwan capable of mass-producing SiC and GaN epitaxial wafers simultaneously.

Competitive dynamics
-----
The market features a tier of conglomerates focused on high-volume 300mm bulk silicon for logic (e.g. SUMCO, Shin-Etsu Handotai, GlobalWafers). However, power electronics require a high-mix, lower-volume batch profile on 150mm/200mm lines. The massive capital structures of mega-suppliers are not geared toward mastering the localized stress bottlenecks of WBG processing. Also, vertically integrated leaders like $WOLF and STMicroelectronics are overwhelmed by AI demand. To meet backlogs without waiting for new fab construction, these IDMs are outsourcing to independent foundries like Episil. There are some Chinese competitors, but they have consistent defects. Not ideal when some end-clients are EVs.

The Vanguard alliance is a scaling catalyst
-----
Historically, Episil's growth was capped by capital constraints. This changed when Vanguard International Semiconductor acquired a 13% stake in Episil for approximately $77.1M. Vanguard provides the manufacturing scale for power-management chips. Episil provides the WBG expertise. This alliance ultimately grants Episil the gravity to secure volume contracts from Japanese giants like Renesas Electronics and Denso.

-----
Episil-Precision is a critical solution provider inside the global power bottleneck. Power systems simply cannot rely on legacy silicon platforms. The transition to WBG materials is non-negotiable. Because the growth of GaN and SiC on supporting substrates produces severe mechanical stress that ruins processing yields, third-party foundries and IDMs are forced to rely heavily on advanced epitaxy specialists. Episil-Precision directly fills this bottleneck by applying its process knowledge, designs, and real-time curvature tracking systems to produce ultra-high-yield starting materials.

13 replies · 7 retweets · 169 likes · 34.2K views
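The "real-time curvature tracking" described in the quoted thesis is conventionally grounded in Stoney's equation, the textbook thin-film relation between measured wafer curvature and film stress (a standard formula in epitaxy engineering; its use here is an assumption, not sourced from Episil's own disclosures):

```latex
\sigma_f = \frac{E_s\, h_s^{2}}{6\,(1-\nu_s)\, h_f} \cdot \frac{1}{R}
```

where $\sigma_f$ is the stress in the grown film, $E_s$ and $\nu_s$ are the substrate's Young's modulus and Poisson ratio, $h_s$ and $h_f$ are the substrate and film thicknesses, and $R$ is the measured radius of curvature. Monitoring $R$ during growth and cooldown is what lets engineers infer stress second-by-second and confirm the wafer returns to a flat state ($R \to \infty$, i.e. zero bow) after cooling.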
Hustler D @_hustlerdoc
Must read, we all go through it (albeit at different stages of capital value) 😅
Bracco ⚡️ @Braczyy

Read the Market Wizards chapter on Kristjan Kullamägi this weekend. The one section that really stood out was when he discussed his drawdown off of his 2021 peak.

"I started 2020 with $3.5 million and ended the year at $36 million. It was a thousand percent year. Then I ran that $36 million to a high of $105 million, and the last portion of that move from $65 to $105 million occurred in just a month and a half. For a brief period, just a few days, I was over $100 million. You have to understand what that did to my psyche. It made me feel completely detached from reality. I thought, “I’m going to get to $200 million in six months.” I was completely sure of that. I started seeing trading as a video game, which I kept winning. Measured from my $105 million peak in November 2021 to my mid-2022 low, I lost approximately $60 million. About half of that loss represented the late 2021 retracement of the large open profits at the November peak to the stops on those positions. The initial retracement loss was so large because I was leveraged long at my peak. My long exposure was $150 million—a number I recall because I remember bragging about it to a friend"

These boom-and-bust tales are as old as time. Look at Jesse Livermore as the classic example. Net worth of $0 in 1906 to a peak of $1.6 billion (inflation-adjusted to 2021 dollars) in 1929. Just 5 years later he blew up and owed $104 million to his brokers...

Or look at Paul Tudor Jones. He hit one of the most legendary trades in history, making roughly $200 million during the 1987 crash. It cemented him as a legend. His mental coach Tony Robbins said that Jones consistently lost money for the next 4 years after that peak.

Dan Zanger parlayed $10,000 into $42 million during the late '90s. Then in late 2000 he took a 70% drawdown when he was 200% long 3-4 fiber-optic stocks as the dot-com bubble was popping. Charles Harris reached 8-figure status after he ran up his account over 4,000% from 2020-21, then experienced an 80% drawdown, mostly due to his big TSLA bet in 2021-2022.

I have seen a few people speculating on Kristjan's story from the outside, saying "I would have stopped trading at $100 million" or "I would have just taken that money and started investing." To those people I ask: have you ever experienced a real euphoric run in your trading account, let alone turned $5K into $100M?

Extreme winning streaks like the ones above breed overwhelming euphoria and overconfidence. The mind shifts its focus from process to outcomes, with ego-driven decisions overriding risk parameters and rules. From my experience I have found it near impossible to be aware of this at the peak of the run. It is almost like you are blacked out and the greed/ego completely takes over your trading.

Then the drawdown begins. The emotions shift from euphoria and greed to revenge, fear, and doubt. This is where things can really start to spiral out of control. It is only after the drawdown has run its course that you finally come back to your senses and your emotions drift back towards baseline levels. Then all you're left with is regret...

Few people ever talk about what a big winning streak can do to you. It can literally change the way you think and operate. Often the ability to achieve super returns is also its biggest drawback, a true double-edged sword. To be able to conquer both sides is the holy grail...

From The Hour Between Dog and Wolf by John Coates:

"When traders enjoy an extended winning streak they experience a high that is powerfully narcotic. This feeling, as overwhelming as passionate desire or wall-banging anger, is very difficult to control. Any trader knows the feeling, and we all fear its consequences. Under its influence we tend to feel invincible, and put on such stupid trades, in such large size, that we end up losing more money on them than we made on the winning streak in the first place. It has to be understood that traders on a roll are traders under the influence of a drug that has the power to transform them into different people."

0 replies · 0 retweets · 0 likes · 78 views
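The figures in the quoted excerpt reduce to simple return and drawdown arithmetic; a sketch using the numbers from the excerpt itself (in millions of dollars, with the mid-2022 low of roughly $45M implied by a ~$60M loss from the $105M peak):

```python
def pct_return(start, end):
    """Simple percentage return from start to end."""
    return (end - start) / start * 100

def drawdown(peak, trough):
    """Peak-to-trough drawdown as a (negative) percentage."""
    return (trough - peak) / peak * 100

# 2020: $3.5M -> $36M, "a thousand percent year"
print(round(pct_return(3.5, 36)))   # 929

# Nov 2021 peak $105M, ~$60M lost -> implied ~$45M low
print(round(drawdown(105, 45)))     # -57
```

Worth noting the asymmetry the story turns on: a 57% drawdown needs roughly a 133% gain just to reclaim the old peak, which is why these streak-and-bust cycles are so hard to recover from.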
Hustler D @_hustlerdoc
$ALMU: that last monthly candle was incredible, bouncing off the 21 EMA as expected.
Hustler D tweet media
0 replies · 0 retweets · 1 like · 267 views
Hustler D @_hustlerdoc
This concept extrapolates to today with China vs the US. Re the long war: China builds 200x the ships, its grid power capacity is multiples larger, etc. The States has recognised this, especially with AI as the tailwind to ramp up this energy infrastructure.
0 replies · 0 retweets · 0 likes · 20 views
Hustler D @_hustlerdoc
Blitzkrieg was an accident, he argued; Germany's hand was forced, playing the short war versus the long war. Had they been drawn into the long war, they couldn't have competed industrially. It just went really well.
1 reply · 0 retweets · 1 like · 25 views
Hustler D @_hustlerdoc
youtu.be/zdbVtZIn9IM?si… In this last segment he just goes berserk and drops absolute pearls so casually. Hindsight is 20/20, but still, incredible.
YouTube video
1 reply · 0 retweets · 0 likes · 49 views