
Kyle 'esSOBi' Stone
@essobi
Hyperlexic Polymath Savant GenTech / AI Consultant CTO @ https://t.co/aRlDQqW7Y3 EX-Heroku T&S Language Model Expert. #TrillionaireTokenClub #RunLocal

It was great to see our name amongst the other “AI Native” companies during @Nvidia’s #GTC keynote. NVIDIA Isaac™ Lab helps us train reinforcement learning policies that enable the UMV to drive, jump, flip, and hop like a pro!

💡 It's still amazing to me that you can run an Unsloth version of Qwen3.5-27B on a $2K AMD Ryzen AI Max+ 395 w/64GB of unified memory @ 10 tok/s at home. Nearly the same quality as Claude Opus 4 (May 2025 release).
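A quick sanity check on that 10 tok/s number. This is a sketch under stated assumptions, not a benchmark: I assume an Unsloth-style dynamic 4-bit quant at roughly 4.5 bits/weight, decode speed bounded by streaming all weights once per token, and ~256 GB/s memory bandwidth for the Ryzen AI Max+ 395 (LPDDR5X over a 256-bit bus); none of these figures come from the tweet itself.

```python
# Back-of-envelope: is ~10 tok/s plausible for a 27B model on unified memory?
# Assumptions (not from the tweet): ~4.5 bits/weight quantization, decode
# throughput limited by reading every weight once per generated token.
PARAMS = 27e9                 # 27B parameters
BITS_PER_WEIGHT = 4.5         # rough average for a dynamic 4-bit quant
MEM_BANDWIDTH_GBS = 256.0     # approx. peak bandwidth of the Ryzen AI Max+ 395

weight_gb = PARAMS * BITS_PER_WEIGHT / 8 / 1e9   # quantized model size in GB
tps_upper_bound = MEM_BANDWIDTH_GBS / weight_gb  # bandwidth-bound decode ceiling

print(f"quantized weights: {weight_gb:.1f} GB")
print(f"decode ceiling: {tps_upper_bound:.1f} tok/s")
```

The weights come out around 15 GB (fits comfortably in 64GB of unified memory with room for KV cache), and the bandwidth-bound ceiling lands in the mid-teens of tok/s, so 10 tok/s observed is right in the expected range.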


Apple is urging users to update their iPhones after the discovery of new spyware that can take over phones running older versions of the iOS operating system 🔗 Read more trib.al/VU3qtnH



Today, at the @DARPA expMath kickoff, we launched 𝗢𝗽𝗲𝗻𝗚𝗮𝘂𝘀𝘀, an open-source, state-of-the-art autoformalization agent harness for developers and practitioners to accelerate progress at the frontier. It is stronger, faster, and more cost-efficient than off-the-shelf alternatives.

On FormalQualBench, running with a 4-hour timeout, it beats @HarmonicMath's Aristotle agent with no time limit.

Users of OpenGauss can interact with it as much or as little as they want, can easily manage many subagents working in parallel, and can extend, modify, and introspect OpenGauss because it is permissively open-source. OpenGauss was developed in close collaboration with maintainers of leading open-source AI tooling for Lean.

Read the report and try it out:


Solid mathematical ideas almost always outperform contrived engineering tricks.

For years deep learning has been dominated by increasingly complex architectural hacks: CNN blocks, attention layers, channel mixers, residual pathways, normalization stacks. Every few years a new architecture is announced as if it were a revolution. One of the most famous examples was Kaiming He and Residual Networks (ResNet). At the time he was paraded around the AI world like a celebrity because residual connections supposedly “solved” deep learning. But these were largely engineering patches.

Now something much more interesting has appeared. A new architecture called CliffordNet returns to mathematics, specifically Clifford algebra, developed in the 19th century by William Kingdon Clifford. Instead of stacking arbitrary modules, the model is built around the geometric product

uv = u·v + u∧v

a single algebraic operation that simultaneously captures inner-product structure and geometric interactions. In other words: the math already contains the interaction mechanism. No attention blocks. No mixer layers. No architectural spaghetti.

The result:
• 77.82% accuracy on CIFAR-100 with only 1.4M parameters
• roughly 8× fewer parameters than ResNet-18

And with strict O(N) complexity.

The paper even suggests that once geometric interactions are modeled correctly, feed-forward networks become largely redundant.

A good reminder for the AI community: engineering tricks can dominate for years, but eventually mathematics shows up and deletes half the architecture.

Paper: arxiv.org/pdf/2601.06793…

19th century geometry just walked into computer vision.
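The geometric product above can be made concrete in a few lines. This is a minimal illustrative sketch for 2D vectors, not the paper's code: the product of two vectors splits into a symmetric scalar part (the inner product u·v) and an antisymmetric bivector part (the wedge u∧v, the oriented area on the e1∧e2 plane).

```python
# Geometric product of two 2D vectors: uv = u·v + (u∧v) e12.
# A vector u = u1*e1 + u2*e2 is represented as a plain (u1, u2) tuple.
def geometric_product(u, v):
    """Return (scalar, bivector) = (u·v, u∧v) for 2D vectors u, v."""
    dot = u[0] * v[0] + u[1] * v[1]    # symmetric part: inner product
    wedge = u[0] * v[1] - u[1] * v[0]  # antisymmetric part: wedge product
    return dot, wedge

# Orthogonal unit vectors e1, e2: zero inner product, unit oriented area.
s, b = geometric_product((1.0, 0.0), (0.0, 1.0))
print(s, b)  # 0.0 1.0
```

Note the classic identity this encodes: swapping the arguments flips the sign of the bivector part but not the scalar part, so uv + vu = 2(u·v). One operation carries both the "how aligned" and "how rotated" information that separate attention/mixing modules usually have to supply.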
