Hikari Blue
@HikariBlue
1.5K posts

The discipline between AI adoption and AI mastery. Intelligence, engineered forward.

Austin, TX · Joined January 2018
272 Following · 1.5K Followers
Hikari Blue @HikariBlue·
Our visions converge on the same idea: AI is not simply a technology. It is a test of individual, entrepreneurial, and civilizational sovereignty.
1. It amplifies intention.
2. It accelerates learning.
3. It rewards initiative.
4. It penalizes passivity.
5. It transforms the allocation of capital, time, and intelligence.
0 replies · 1 repost · 1 like · 60 views
Hikari Blue @HikariBlue·
Reliable AI is never integrated into a slow or poorly structured organization by simple addition. It demands robust data, a deep redesign of workflows, rare talent, flawless governance, and the financial and managerial capacity to absorb an often brutal J-curve. Yet most companies have precisely neither that architecture nor that execution discipline.
[image attached]
0 replies · 0 reposts · 0 likes · 31 views
Hikari Blue @HikariBlue·
Zuckerberg just described exactly what we’ve been building for the past two years: “OpenAI and Google are building AI. I believe we’ll have many different AI systems. Every company, just like it has a website, a phone number, and an email address, will also have an AI that interacts with its customers.” The real battle is no longer about foundation models. It’s about the proprietary operational layer: the one that encodes products, policies, customer history, and the way a company works. Owning a state-of-the-art model is no longer the point. Owning the integration, alignment, and governance layer that transforms a generic model into a system that thinks like your company: that is where the value is shifting. That is precisely the founding thesis of HIKARI BLUE. Intelligence, engineered forward.
0 replies · 0 reposts · 0 likes · 44 views
Hikari Blue reposted
alphaXiv @askalphaxiv·
Google's new KV-cache optimization broke the DRAM stocks, but how does it work? Let's take a quick look.

"TurboQuant: Online Vector Quantization with Near-optimal Distortion Rate"

TurboQuant combines two ideas from two earlier lines of work: PolarQuant and Quantized Johnson-Lindenstrauss (QJL).

PolarQuant shows that switching from Cartesian to polar-style coordinates can kill a lot of the usual quantization overhead, because the transformed variables have a much more structured, concentrated distribution. So you can use fixed scalar quantizers instead of learning lots of extra per-block quantization constants. In TurboQuant, the same core intuition shows up in a slightly different form: they instead randomly rotate the vector so it looks like a random point on the sphere. Each coordinate then follows a known Beta distribution and is nearly independent in high dimension, so coordinate-wise scalar quantization becomes nearly optimal.

From QJL they take the clever 1-bit trick. While plain MSE-optimal quantization reconstructs vectors well, it still gives biased inner products, which is bad for KV-cache use cases. So TurboQuant spends most of the bit budget on the main near-optimal scalar quantizer, then uses the final 1 bit for a QJL sign sketch of the residual to remove inner-product bias. That final 1-bit residual sketch acts as a bias-corrector for dot products, giving an unbiased inner-product estimator while keeping variance low.

Together, these are what let TurboQuant reduce LLM key-value cache memory by 6x with an 8x speedup.
[image attached]
21 replies · 82 reposts · 458 likes · 39.7K views
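The rotate-then-quantize idea plus the 1-bit residual sign sketch described above can be sketched in a few lines of NumPy. This is an illustrative toy, not TurboQuant itself: the uniform quantizer, the per-key `mean_abs` scalar, and the residual corrector below are simplifying assumptions (the real QJL sketch and the paper's unbiased estimator are more refined).

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64  # head dimension of the KV vectors (illustrative)

# Random orthogonal rotation: after rotating, a key vector behaves like a
# random point on the sphere, so every coordinate has the same concentrated
# distribution and one fixed scalar quantizer fits all coordinates.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

def quantize_key(key, bits=4):
    """Rotate the key, scalar-quantize each coordinate, and keep a 1-bit
    sign sketch of the residual plus one scalar (its mean magnitude)."""
    r = Q @ key
    levels = 2 ** (bits - 1)
    scale = np.abs(r).max() / levels
    codes = np.clip(np.round(r / scale), -levels, levels - 1)
    q = codes * scale                      # dequantized coordinates
    residual = r - q
    signs = np.sign(residual)              # the extra 1 bit per coordinate
    mean_abs = np.abs(residual).mean()     # one scalar for the whole key
    return q, signs, mean_abs

def approx_dot(query, q, signs, mean_abs):
    """Estimate <query, key>: the main quantizer's dot product plus a
    sign-sketch correction approximating the residual's contribution."""
    rq = Q @ query                         # rotation preserves dot products
    return rq @ q + rq @ (signs * mean_abs)
```

One can check that `signs * mean_abs` is the best constant-magnitude, sign-matched approximation of the residual, so the correction strictly shrinks the reconstruction error; the actual paper goes further and makes the resulting inner-product estimator provably unbiased.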
Hikari Blue reposted
Franck Ohrel @FranckOhrel·
Yes!! The example is all the more striking given how many mediocre people spend their time attacking @elonmusk. Whether we appreciate him or not, he belongs to that very rare category of men capable of thinking at the scale of the century and executing at the scale of industry. Where many comment, he builds. Where many theorize, he deploys. Where many dream small, he transforms extraordinary visions into technological, industrial, and operational realities.
Michel de Guilhermier @mitchdeg:

Yep, French society as a whole doesn't dream big. I'd even say it has fallen into a kind of Malthusian depression. There are exceptions, of course, including many entrepreneurs I know, but unfortunately that's the general rule. This is particularly striking among young people, who are supposed to be big dreamers; that's supposed to be the privilege of youth. I think that's a significant problem going forward. I'd suggest two antidotes: reading as many sci-fi books as possible, or learning to appreciate what Elon Musk is doing, how he dreams big and then turns those dreams into reality.
0 replies · 1 repost · 5 likes · 105 views
Hikari Blue reposted
Franck Ohrel @FranckOhrel·
The TERAFAB, to be known as the Advanced Technology Fab, will be built in Austin. 🤘
[image attached]
1 reply · 1 repost · 5 likes · 87 views
Hikari Blue reposted
Google Research @GoogleResearch·
Introducing TurboQuant: Our new compression algorithm that reduces LLM key-value cache memory by at least 6x and delivers up to 8x speedup, all with zero accuracy loss, redefining AI efficiency. Read the blog to learn how it achieves these results: goo.gle/4bsq2qI
[GIF attached]
1K replies · 5.8K reposts · 39K likes · 19.3M views
Hikari Blue reposted
CyrilPaglino @cyrilpaglino·
(French) I just published “Blockchain, protocoles décentralisés et crypto-actifs, ou la plus grande révolution technologique…” medium.com/@cyrilpaglino/blockchain-protocoles-d%C3%A9centralis%C3%A9s-et-crypto-actifs-ou-la-plus-grande-r%C3%A9volution-technologique-72cc9599a32d
4 replies · 13 reposts · 40 likes
Hikari Blue @HikariBlue·
@BenBelais Congratulations on the excellent article in Les Échos. Looking forward to talking.
0 replies · 0 reposts · 0 likes