heave
@heave448

70 posts

Joined March 2024
61 Following · 0 Followers
kyle yu
kyle yu@brrrkyle·
this is how i wish i learned GPU fundamentals: not a lengthy textbook. not a static image. every concept is an interactive visualization. covering the SM architecture, memory coalescing, synchronization, and more. what concepts do you want to see next? brrrviz.com
3 replies · 8 reposts · 150 likes · 25.3K views
Georgi Gerganov
Georgi Gerganov@ggerganov·
llama-server -hf ggml-org/Qwen3.6-27B-GGUF --spec-default
20 replies · 55 reposts · 674 likes · 74.5K views
Dmitrii Kovanikov
Dmitrii Kovanikov@ChShersh·
Also, I've been playing video games for 28 years, long before Steam. Generational bangers and classics are: HOMM 3, Civilization 4, Prince of Persia, Serious Sam, NFS: U2 & MW, GTA 3, Max Payne, Heavy Metal F.A.K.K. 2, Warhammer 40K: Dawn of War, SpellForce, The Bard's Tale
6 replies · 0 reposts · 24 likes · 3.9K views
Dmitrii Kovanikov
Dmitrii Kovanikov@ChShersh·
I rank all my video games on Steam. These are my absolute favourite S-tier games. Spent at least 50 hours in each.
Dmitrii Kovanikov tweet media
40 replies · 1 repost · 135 likes · 12.2K views
heave
heave@heave448·
Trying Nemotron 3 Super in @opencode
heave tweet media
0 replies · 0 reposts · 0 likes · 41 views
heave
heave@heave448·
@ThePrimeagen I'll only say: don't use Dupont wires if you don't want to spend hours debugging a problem that ends up being a bad cable connection.
0 replies · 0 reposts · 1 like · 235 views
Lukasz Olejnik
Lukasz Olejnik@lukOlejnik·
A physicist has written a fascinating big beautiful paper. Let’s not be afraid to call it what it is - groundbreaking. For hundreds of years, mathematics had dozens of “basic” functions: sine, cosine, logarithm, square root, exponential. You know these from school. Everyone does. Now it turns out that all of it is one single operator: E(x, y) = exp(x) - ln(y), and the constant 1. Sin, cos, π - everything follows from this neatly, just nest it properly. Nature hid the simplest possible description of reality. And it has just been found. The whole thing is beautiful and remarkable; here the word “groundbreaking” is not a marketing buzzword. For instance, instead of writing π or 3.14, one can now elegantly write E(E(E(1,E(E(1,E(1,E(E(1,E(E(1,E(E(1,E(1,E(E(1,1),1))),1)),E(E(E(E(E(1,E(E(1,E(1,E(E(1,E(E(E(1,E(E(1,E(1,E(E(1,1),1))),1)),E(E(1,E(E(1,E(E(1,E(E(1,1),1)),E(E(E(1,E(E(1,E(1,E(E(1,1),1))),1)),E(1,1)),1))),1)),1)),1)),1))),1)),E(E(E(1,E(E(1,E(1,E(E(1,1),1))),1)),E(E(1,E(E(1,E(1,E(E(1,E(E(1,E(E(1,E(1,E(E(1,1),1))),1)),E(1,1))),1))),1)),1)),1)),1),1),1))),1))),1)),E(E(E(1,E(E(1,E(1,E(E(1,1),1))),1)),E(E(1,E(E(1,E(1,E(E(1,E(E(1,E(E(1,E(1,E(E(1,1),1))),1)),E(1,1))),1))),1)),1)),1)),1) arxiv.org/abs/2603.21852
Lukasz Olejnik tweet media (×2)
169 replies · 547 reposts · 4K likes · 1.1M views
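For the curious: the basic reductions in Olejnik's post do check out under the stated definition. A minimal worked sketch follows; only the operator E(x, y) = exp(x) - ln(y) and the constant 1 come from the post, the algebra is routine, and the deeper nestings needed for sin, cos, and π are exactly what the monster expression above spells out.

```latex
% Everything here uses only the post's operator E(x,y) = exp(x) - ln(y)
% and the constant 1.
\begin{align*}
E(x, 1)       &= e^{x} - \ln 1 = e^{x} && \text{the exponential, recovered directly} \\
E(1, 1)       &= e^{1} - \ln 1 = e     && \text{the constant } e \\
E(1, E(x, 1)) &= e - \ln e^{x} = e - x && \text{subtraction from } e \text{, the first nesting step}
\end{align*}
```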
heave
heave@heave448·
@ChShersh Best course I've done on in-memory DBs is "In-Memory Data Management" from @HPI_DE, but that was during the golden age of MOOCs (2012)
0 replies · 0 reposts · 0 likes · 70 views
Dmitrii Kovanikov
Dmitrii Kovanikov@ChShersh·
Wanna become a C++ expert? Good news, everyone! Join CodeCrafters and build your own low-latency in-memory DB. And because you endured all my shitposts, you get a generous 40% discount. I reward your loyalty. app.codecrafters.io/join?via=chshe… It's time to git gud. #ad
Dmitrii Kovanikov tweet media (×2)
19 replies · 22 reposts · 398 likes · 66K views
Logan Kilpatrick
Logan Kilpatrick@OfficialLoganK·
Introducing Tab Tab Tab, our new prompt autocomplete engine in @GoogleAIStudio's vibe coding experience. Now when you show up with your fuzzy ideas, you can rely on Gemini to fill in the blanks : )
Logan Kilpatrick tweet media
98 replies · 94 reposts · 1.3K likes · 63.5K views
Dmitrii Kovanikov
Dmitrii Kovanikov@ChShersh·
So, when Brad Pitt wins an Oscar, does he become a bradwinner?
6 replies · 0 reposts · 28 likes · 3.5K views
heave
heave@heave448·
@CUDAHandbook I understand your point but "when the finger points at the moon, the imbecile looks at the finger"
1 reply · 0 reposts · 0 likes · 27 views
Nicholas Wilt
Nicholas Wilt@CUDAHandbook·
These types of experiments are inherently dangerous because they exacerbate the very problem they seek to highlight.
Hedgie@HedgieMarkets

🦔 A researcher invented a fake eye condition called bixonimania, uploaded two obviously fraudulent papers about it to an academic server, and watched major AI systems present it as real medicine within weeks. The fake papers thanked Starfleet Academy, cited funding from the Professor Sideshow Bob Foundation and the University of Fellowship of the Ring, and stated mid-paper that the entire thing was made up. Google's Gemini told users it was caused by blue light. Perplexity cited its prevalence at one in 90,000 people. ChatGPT advised users whether their symptoms matched. The fake research was then cited in a peer-reviewed journal that only retracted it after Nature contacted the publisher.

My Take

The researcher made the papers as obviously fake as possible on purpose. The AI systems didn't catch it. Neither did the human researchers who cited it in real journals, which means people are feeding AI-generated references into their work without reading what they're actually citing.

I've covered the FDA using AI for drug review, the NYC hospital CEO ready to replace radiologists, and ChatGPT Health launching this year. All of that is happening in the same environment where a condition funded by a Simpsons character and endorsed by the crew of the Enterprise was being presented as emerging medical consensus.

The people making these deployment decisions seem to believe the pipeline from research to AI to patient is more supervised than it actually is. This experiment suggests it isn't supervised much at all.

Hedgie🤗 nature.com/articles/d4158…

1 reply · 0 reposts · 5 likes · 1.1K views
Jonathan Blow
Jonathan Blow@Jonathan_Blow·
@lefticus Wow, you seriously *cannot* be this obnoxious. Yet here we are.
9 replies · 3 reposts · 555 likes · 28.4K views
heave
heave@heave448·
@lemire Maybe a better solution would be some kind of intermediate representation, something between LLVM IR and NVIDIA PTX.
0 replies · 0 reposts · 0 likes · 255 views
Daniel Lemire
Daniel Lemire@lemire·
When Apple moved from Intel processors to its own ARM processors, we did not know how they would handle all the existing Intel software. Then Apple shocked me with its software solution (Rosetta) that could transparently translate x64 binaries to ARM binaries. You just picked your old program, compiled years ago for an old CPU, and it just ran at high speed on a totally different CPU. It seemed to have inspired Intel.

One problem when deploying software binaries is that you do not know anything about the processors your clients are using. They could be old CPUs taken from a trash can or the very latest Intel CPU. Thus, when you compile your code, you often target a generic CPU. The net result is that you are not using the fancy features of the newest CPUs. This is especially true under Windows where people have a wide range of systems. That’s frustrating if you are Intel or AMD: you have these new CPUs with features that most software will not use. This is an advantage for systems like game consoles: if you know from the get-go which processor to target, you can optimize better.

There are ways around this issue for developers: you can check at runtime for the processor type and then select optimal code. Compilers provide some of this functionality by default. For example, they may have different memory copy functions and switch at runtime depending on the detected system. But compilers can only do so much, and developers do not have a strong incentive to optimize their software for specific CPUs. Doing such runtime dispatching is a lot of work and it complicates testing, thus increasing costs.

To make matters worse, nobody will tune their software for processors that are not yet available. Thus, old software may not benefit from more advanced features on newer CPUs. Sure, the developer could recompile the code, but it takes time and money. A secondary but important issue is that compilers are often not great at optimizing even when you tell them which processor to target specifically. It is a matter of incentives: why should Microsoft put a lot of effort into making a family of Intel processors shine?

So Intel created something called iBOT (Intel Binary Optimization Tool). It optimizes x64 binaries on the fly. For now, it only works on a few popular games and only for some specific processors. @tomshardware has a great article on the topic where they report an 8% performance boost on average, which is quite impressive given that it comes for free if you are the user. Of course, Intel picked the few games where their techniques worked. How this scales is unclear. Intel keeps making new processors and there is a lot of software around. It would have been more impressive had Intel boosted the performance of software generally. Still: the idea is intriguing.
Daniel Lemire tweet media (×3)
49 replies · 92 reposts · 1K likes · 126.6K views
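The runtime dispatching Lemire describes has a compiler-assisted form that is easy to show. Below is a minimal sketch, assuming GCC or Clang on x86 with glibc; the function name, the clone list, and the workload are mine, purely for illustration. This is not Intel's iBOT, which per the post rewrites already-compiled binaries rather than source code.

```c
#include <stdio.h>

/* One source-level function, cloned for several CPU targets.
 * The compiler emits a resolver that runs once at load time and
 * picks the best clone for the CPU it finds itself running on. */
__attribute__((target_clones("avx2", "sse4.2", "default")))
long sum(const int *v, long n) {
    long s = 0;
    for (long i = 0; i < n; i++)
        s += v[i];  /* auto-vectorized differently per clone */
    return s;
}

int main(void) {
    int v[1024];
    for (int i = 0; i < 1024; i++)
        v[i] = i;
    /* The call site is ordinary; the dispatch is transparent. */
    printf("%ld\n", sum(v, 1024));
    return 0;
}
```

This is the "check at runtime and select optimal code" path done for you by the compiler; the cost Lemire points out still applies, since every clone has to be built and tested.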
Taelin
Taelin@VictorTaelin·
come on, among all the cool tech posts I write, is this really the one that's about to go viral? is this what I'm supposed to be, in this world? cheap entertainment? engagement bait? a living benchmark for your next shiny model, like a mere pawn on this 4d chess board, played and moved by the big labs, as they desperately attempt to justify their inflated valuations and colossal rounds, so I can at least have a place to let my voice be heard and sneak in some cool lambda calculus posts here and there, before it is all irrelevant and obsolete anyway? I guess so

anyway, is there any Pi extension that lets me call 2 models at once?
13 replies · 1 repost · 154 likes · 15.9K views
Taelin
Taelin@VictorTaelin·
GPT-5.4: trustworthy math genius, autistic
Opus-4.6: charismatic, gets things done, cheats on you
Gemini-3.1: walking encyclopedia, licks your boots
pick your poison
145 replies · 138 reposts · 3.5K likes · 200.1K views
Andrej Karpathy
Andrej Karpathy@karpathy·
Thank you Jensen and NVIDIA! She’s a real beauty! I was told I’d be getting a secret gift, with a hint that it requires 20 amps. (So I knew it had to be good). She’ll make for a beautiful, spacious home for my Dobby the House Elf claw, among lots of other tinkering, thank you!!
NVIDIA AI Developer@NVIDIAAIDev

🙌 Andrej Karpathy’s lab has received the first DGX Station GB300 -- a Dell Pro Max with GB300. 💚 We can't wait to see what you’ll create @karpathy! 🔗 blogs.nvidia.com/blog/gtc-2026-… @DellTech

531 replies · 830 reposts · 19.2K likes · 1.1M views
heave
heave@heave448·
heave tweet media (GIF)
0 replies · 0 reposts · 0 likes · 52 views
heave
heave@heave448·
@sebkrier They even have a picture of the dog's cancer
heave tweet media
0 replies · 0 reposts · 1 like · 418 views
heave
heave@heave448·
@ThePrimeagen Paying for an .ai domain for a joke is nextLEVEL
0 replies · 0 reposts · 0 likes · 32 views
ThePrimeagen
ThePrimeagen@ThePrimeagen·
If you want to stay out of the drama and migrate to a bug-free framework try NextTXT! My proprietary one-click solution will transfer your entire Vercel OR Cloudflare application to NextTXT, the future of DX. nexttxt.ai
56 replies · 10 reposts · 541 likes · 59K views