Kyle 'esSOBi' Stone

90.6K posts

@essobi

Hyperlexic Polymath Savant · GenTech / AI Consultant · CTO @ https://t.co/aRlDQqW7Y3 · Ex-Heroku T&S · Language Model Expert · #TrillionaireTokenClub #RunLocal

Louisville, Kentucky · Joined March 2008
3K Following · 6.1K Followers
stanzi @stanziirl ·
had to code at work today hey claude remind me how dumb i am 🤡
Kyle 'esSOBi' Stone retweeted
pantless papple @pantless_papple ·
I asked copilot to generate an exe that would crash the servers at work, mf didn't even hesitate
David Hendrickson @TeksEdge ·
@essobi My assumption too, but I think the challenge goes deeper. So far LMStudio doesn't support it with Qwen3.5
Kyle 'esSOBi' Stone
@rekdt Let me know when you get your hands on that kit… I’m looking for it too
Kyle 'esSOBi' Stone retweeted
Jyoti Mann @jyoti_mann1 ·
🚨Scoop: A rogue AI agent recently triggered a major security alert at Meta, by taking action without approval that led to the exposure of sensitive company and user data to Meta employees who didn't have authorization to access the data.
[image]
Kyle 'esSOBi' Stone retweeted
uwo's lab | the funny science man
IN 90 MINUTES (6PM EST) we're gonna be testing a theoretical speaker build from the gigabrains on the DIYaudio subreddit, the Shitwoofer
[images]
Teknium (e/λ) @Teknium ·
Was Math Inc launching a fork of hermes agent for autoformalization to DARPA on your bingo card? 🤗
Math, Inc. @mathematics_inc

Today, at the @DARPA expMath kickoff, we launched 𝗢𝗽𝗲𝗻𝗚𝗮𝘂𝘀𝘀, an open-source, state-of-the-art autoformalization agent harness for developers and practitioners to accelerate progress at the frontier. It is stronger, faster, and more cost-efficient than off-the-shelf alternatives. On FormalQualBench, running with a 4-hour timeout, it beats @HarmonicMath's Aristotle agent with no time limit.

Users of OpenGauss can interact with it as much or as little as they want, can easily manage many subagents working in parallel, and can extend, modify, and introspect OpenGauss because it is permissively open-source. OpenGauss was developed in close collaboration with maintainers of leading open-source AI tooling for Lean. Read the report and try it out:

0xSero @0xSero ·
@essobi 100% Stated in there :p along with over-fitting.
Kyle 'esSOBi' Stone
This looks like a winner for computer vision.
Valeriy M., PhD, MBA, CQF @predict_addict

Solid mathematical ideas almost always outperform contrived engineering tricks. For years deep learning has been dominated by increasingly complex architectural hacks: CNN blocks, attention layers, channel mixers, residual pathways, normalization stacks. Every few years a new architecture is announced as if it were a revolution. One of the most famous examples was Kaiming He and Residual Networks (ResNet). At the time he was paraded around the AI world like a celebrity because residual connections supposedly "solved" deep learning. But these were largely engineering patches.

Now something much more interesting has appeared. A new architecture called CliffordNet returns to mathematics, specifically Clifford algebra, developed in the 19th century by William Kingdon Clifford. Instead of stacking arbitrary modules, the model is built around the geometric product

uv = u·v + u∧v

a single algebraic operation that simultaneously captures inner-product structure and geometric interactions. In other words: the math already contains the interaction mechanism. No attention blocks. No mixer layers. No architectural spaghetti.

The result:
• 77.82% accuracy on CIFAR-100 with only 1.4M parameters
• roughly 8× fewer parameters than ResNet-18
• strict O(N) complexity

The paper even suggests that once geometric interactions are modeled correctly, feed-forward networks become largely redundant. A good reminder for the AI community: engineering tricks can dominate for years, but eventually mathematics shows up and deletes half the architecture.

Paper: arxiv.org/pdf/2601.06793…

19th century geometry just walked into computer vision.

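The geometric product in the quoted thread can be illustrated concretely. For two 2-D vectors it decomposes into a symmetric inner part (the dot product) and an antisymmetric wedge part (a signed area, the bivector coefficient). This is a minimal sketch of that identity only; the function name is my own and it does not reflect any actual CliffordNet code.

```python
def geometric_product_2d(u, v):
    """uv = u·v + u∧v for 2-D vectors: returns (scalar part, bivector part)."""
    dot = u[0] * v[0] + u[1] * v[1]    # symmetric inner part: alignment
    wedge = u[0] * v[1] - u[1] * v[0]  # antisymmetric wedge part: signed area
    return dot, wedge

# Orthogonal unit vectors: zero inner part, unit bivector part
print(geometric_product_2d((1.0, 0.0), (0.0, 1.0)))  # (0.0, 1.0)

# Parallel vectors: pure inner part, zero wedge
print(geometric_product_2d((2.0, 0.0), (3.0, 0.0)))  # (6.0, 0.0)
```

The single operation returning both components is the point the tweet is making: one algebraic product carries both the similarity signal and the interaction (orientation) signal that separate architectural modules are usually added to capture.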