Matt Hartman
7.5K posts

@MattHartman

https://t.co/yuMjrW6q4N. 1st check in @huggingface, backing coder-founders of @modal, @factoryai, @bfl_ml, @flwrlabs & more. fmr @betaworks, 🎹 @sidgoldsreqroom

Austin, TX · Joined March 2009
2.1K Following · 9.6K Followers

Pinned Tweet
Matt Hartman @MattHartman·
I’m releasing 🌶️ Ghost Pepper open-source. All models used are also open source (ty @huggingface). If you like voice to text on Mac but want 100% privacy, try the alpha here: github.com/matthartman/gh…
[media]
Matt Hartman @MattHartman·
LMK if you’ve canceled your voice-to-text SaaS subscription to use 100% private and open source Ghost Pepper 🌶️
[media]
Matt Hartman @MattHartman·
I want to vibe-CAD my next robot. What is the best vibeCAD software to start with that will ultimately help me pick motors, 3d print etc. What's out there that people have tried & like?
Matt Hartman @MattHartman·
@markankcorn Transcription from Zoom calls, only recorded on your work computer, not processed by cloud LLMs
Mark Ankcorn @markankcorn·
@MattHartman Is this from the mic on my laptop or can it use the web / app audio input eg a Zoom call?
Matt Hartman @MattHartman·
🌶️ Ghost Pepper: Meet is here! Ghost Pepper can now transcribe, do speaker diarisation & summarization for your meetings -- 100% local & private. All saved as markdown right on your machine.
[media]
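The "saved as markdown right on your machine" step above is easy to picture. Here is a minimal sketch of that final formatting stage, assuming hypothetical diarised segments (speaker label, start offset, text) coming out of the transcription pipeline; the `Segment` structure and `to_markdown` function are illustrative only and not from the actual Ghost Pepper repo:

```python
# Toy illustration: turn diarised transcript segments into a
# markdown meeting note. Data shapes are hypothetical, not the
# real Ghost Pepper internals.
from dataclasses import dataclass


@dataclass
class Segment:
    speaker: str   # diarisation label, e.g. "Speaker 1"
    start: float   # seconds from meeting start
    text: str


def to_markdown(title: str, segments: list[Segment]) -> str:
    """Render diarised segments as a simple markdown transcript."""
    lines = [f"# {title}", ""]
    for seg in segments:
        mins, secs = divmod(int(seg.start), 60)
        lines.append(f"- **{seg.speaker}** [{mins:02d}:{secs:02d}]: {seg.text}")
    return "\n".join(lines)


if __name__ == "__main__":
    demo = [
        Segment("Speaker 1", 0.0, "Let's review the roadmap."),
        Segment("Speaker 2", 12.5, "The alpha ships this week."),
    ]
    print(to_markdown("Weekly sync", demo))
```

Writing plain markdown to disk (rather than a database or cloud store) is what keeps the output greppable and portable, which fits the 100%-local pitch.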
Jacob Bijani @jcb·
@MattHartman Yea been messing with this a bit for some recent personal projects, also designing the actual PCBs in tandem with Claude
Matt Hartman @MattHartman·
Has anyone vibe coded firmware?
toma @tommaxxing·
@MattHartman what counts?? definitely some quick arduino + ble sniffing
Matt Hartman retweeted
Anand Kannappan @anandnk24·
People are misreading the SpaceX/Cursor deal as an M&A story. It’s actually a bet on what the real bottleneck in frontier coding models is.

xAI has struggled to close the gap with Claude Code and Codex. Cursor sits on the best corpus of developer traces in the world. The deal lets Cursor train Composer on Colossus while xAI runs the same recipe on Grok. Both sides find out, at the same time, whether Cursor's data is actually the difference.

The option structure reflects that uncertainty. If the training work ports over, SpaceX buys Cursor and owns the pipeline. If it doesn’t, they pay $10B for the experiment and walk. Either outcome, Grok ends up stronger than it would have been, and xAI gets an answer to a question it couldn’t answer internally.

The part worth holding onto: a pre-IPO company just priced a live experiment to figure out whether real developer traces are the scarce input in coding agents. $10B is what they’re paying to run it. $60B is what the answer is worth if it comes back yes.
[media]
@jason @Jason·
We started an AI founder twitter group... reply with "I'm in" if you're a founder and want to be added
Matt Hartman retweeted
Matt Hartman @MattHartman·
GitHub in the 2000s is a good analogy here: “We are at the same moment in robotics that Linux was for software in the 90s. Or what GitHub did for code collaboration in the 2000s.”
- In 2008, when GitHub started, there were 200m Americans on the entire internet
- 8 years later (2016) it had 20m developers and $150m revenue
- today there are 180m GitHub users, 630m repos, and well over $2bn revenue
Emerson S @Em_Nomadic

The open source robotics movement is moving faster than anyone expected and I don’t think the industry has fully absorbed what’s happening yet. We are at the same moment in robotics that Linux was for software in the 90s. Or what GitHub did for code collaboration in the 2000s. The infrastructure is being built in public, by people who just want to see it exist, and that infrastructure compounds in ways that closed systems simply cannot.

The go-to-market for open source robotics is not a product. It’s a community. Someone builds a walking gait. Someone else improves the sim-to-real transfer. Someone else writes the documentation. Someone else 3D prints a modified version and posts the files. Every contribution makes the next person’s starting point better than the last person’s finished product. That loop is what proprietary robotics cannot replicate at any budget.

What that means practically is this: the person building a humanoid robot in their basement in 2026 is not starting from scratch. They are starting from thousands of hours of community work, open datasets, pre-trained models, simulation environments, hardware designs, and documented failures. The floor for what a solo builder or small team can achieve just got dramatically higher.

@huggingface built LeRobot, an open toolkit for robot learning that became the default starting point for robot research. Then they shipped Reachy Mini, a desktop robot connected to their entire AI ecosystem. @tnkrdotai is building the GitHub for robots: one place where hardware designs, code, training data, and AI models all live together. Open Duck Mini is one of the projects on the platform, a walking bipedal duck under $400 whose walking policy was trained entirely in simulation. The kind of project that only exists because the open source infrastructure finally made it possible to build and share something like this. The Bimo Project launched a $500 printable bipedal kit with reinforcement learning and sim-to-real pre-configured.

By 2027 I genuinely think you will start to see small teams and solo builders shipping capable, task-specific robots that would have required a fully funded lab three years ago. Not because the hardware got dramatically better overnight, but because the open ecosystem around training, simulation, and deployment matured enough that the hard parts got shared. The robot you can build on your workbench for $400 today is more capable than what serious research labs were running a few years ago. And most people still haven’t absorbed what that actually means for where this goes.

Matt Hartman retweeted
clem 🤗 @ClementDelangue·
HF becoming the platform for agents (assisted by their humans) to use and build AI (rather than just leveraging APIs)!
Aksel @akseljoonas

Introducing ml-intern, the agent that just automated the post-training team @huggingface. It's an open-source implementation of the real research loop that our ML researchers do every day. You give it a prompt, it researches papers, goes through citations, implements ideas in GPU sandboxes, iterates, and builds deeply research-backed models for any use case. All built on the Hugging Face ecosystem.

It can pull off crazy things: we made it train the best model for scientific reasoning. It went through citations from the official benchmark paper, found OpenScience and NemoTron-CrossThink, added 7 difficulty-filtered dataset variants from ARC/SciQ/MMLU, and ran 12 SFT runs on Qwen3-1.7B. This pushed the score 10% → 32% on GPQA in under 10h. Claude Code's best: 22.99%.

In healthcare settings it inspected available datasets, concluded they were too low quality, and wrote a script to generate 1100 synthetic data points from scratch for emergencies, hedging, multilingual, etc., then upsampled 50x for training. Beat Codex on HealthBench by 60%.

For competitive mathematics, it wrote a full GRPO script, launched training with A100 GPUs on hf.co/spaces, watched rewards climb and then collapse, and ran ablations until it succeeded. All fully backed by papers, autonomously.

How it works: ml-intern makes full use of the HF ecosystem:
- finds papers on arxiv and hf.co/papers, reads them fully, walks citation graphs, pulls datasets referenced in methodology sections and on hf.co/datasets
- browses the Hub, reads recent docs, inspects datasets and reformats them before training so it doesn't waste GPU hours on bad data
- launches training jobs on HF Jobs if no local GPUs are available, monitors runs, reads its own eval outputs, diagnoses failures, retrains

ml-intern deeply embodies how researchers work and think. It knows what data should look like and what good models feel like. Releasing it today as a CLI and a web app you can use from your phone/desktop.

CLI: github.com/huggingface/ml… Web + mobile: huggingface.co/spaces/smolage…

And the best part? We also provisioned $1k of GPU resources and Anthropic credits for the quickest among you to use.

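The loop that thread describes (research ideas, implement, train, watch metrics, keep what works) can be caricatured in a few lines. This is a toy skeleton with stand-in stub functions; none of these names come from the real ml-intern CLI:

```python
# Toy research-loop skeleton: try each idea several times, score every
# attempt, and keep the best (idea, score) pair. Stubs only; this is
# not the real ml-intern API.
def research_loop(ideas, train_and_eval, attempts_per_idea=3):
    """Return the best (idea, score) found across all attempts."""
    best_idea, best_score = None, float("-inf")
    for idea in ideas:
        for attempt in range(attempts_per_idea):
            score = train_and_eval(idea, attempt)
            if score > best_score:
                best_idea, best_score = idea, score
    return best_idea, best_score


if __name__ == "__main__":
    # Stub evaluator: pretend one recipe is better and improves on retries.
    def fake_eval(idea, attempt):
        base = {"baseline": 0.10, "difficulty-filtered SFT": 0.25}[idea]
        return base + 0.02 * attempt

    print(research_loop(["baseline", "difficulty-filtered SFT"], fake_eval))
```

The interesting part of the real system is everything hidden inside `train_and_eval` here: reading papers to generate the candidate ideas, and diagnosing failed runs instead of blindly retrying.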
Matt Hartman @MattHartman·
The real way AI wins is distracting us with impossible, endless captchas
Matt Hartman @MattHartman·
Me writing code: "2"
[media]
Matt Hartman retweeted
Git Rated @GitRated·
A new AI review! matthartman/ghost-pepper ⭐3.8/5.0 Ghost Pepper is a thoughtfully-built macOS menu bar app that delivers on a clear promise: on-device speech-to-text, meeting transcription, and local “cleanup” via LLMs—without rely... gitrated.com/matthartman/gh…
Matt Hartman retweeted
Zane Chen @chenzeling4·
This repo feels like a cheat code for private voice on Mac. 🛠️ Ghost Pepper: 100% on-device STT + meeting transcription. Hold Control to talk, release to paste. Whisper on Apple Silicon. Smart cleanup, 50+ languages. No cloud. ⭐ 2.2K #macOS #Privacy
[media]
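The hold-to-talk interaction in that summary ("Hold Control to talk, release to paste") is at its core a tiny state machine: key down starts buffering audio, key up stops, transcribes, and emits text. A toy sketch of that control flow with a stub transcriber; the class and method names are hypothetical, not the real app's internals:

```python
# Toy hold-to-talk state machine: buffer audio chunks while the key
# is held, hand them to a transcriber on release. Stubs only; the
# real app would capture mic audio and run a local Whisper model.
class PushToTalk:
    def __init__(self, transcribe):
        self.transcribe = transcribe  # callable: list[bytes] -> str
        self.recording = False
        self.chunks = []

    def key_down(self):
        """Control pressed: start a fresh recording buffer."""
        self.recording = True
        self.chunks = []

    def feed(self, chunk: bytes):
        """Audio callback: only buffer while the key is held."""
        if self.recording:
            self.chunks.append(chunk)

    def key_up(self) -> str:
        """Control released: stop and return the transcript to paste."""
        self.recording = False
        return self.transcribe(self.chunks)
```

Keeping the buffer and the transcriber entirely in-process is what makes the "no cloud" guarantee checkable: there is simply no network call anywhere in the path.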
Ryan Hughes @the_ryan_hughes·
Used @superwhisper desktop mac app for many months. Then they disabled their free tier and are forcing me to upgrade to the paid tier. Anyone have any speech to text alternatives for mac?
[media]
Matt Hartman retweeted
pyronaur 🔥 @pyronaur·
Ghost Pepper 🌶️ is hands down the best voice transcription app I've used, and I've used a bunch (Superwispr, WisprFlow, Handy, Openwhispr). Ghost Pepper: ❌ Fancy Website ❌ Marketing ✅ Free ✅ Accurate and Fast ✅ Free and Open Source. Thanks @MattHartman