David Cox
@neurobongo

21.7K posts

VP, AI Foundations @IBMResearch, IBM Director, @MITIBMLab. Former prof @Harvard, and serial/parallel entrepreneur.

Cambridge, MA · Joined March 2011
1.9K Following · 11.8K Followers
David Cox @neurobongo
I was telling my kids about how low-key terrifying commercials were when I was a kid, and my daughter made this:
David Cox @neurobongo
In which Gemini calls the Google AI Ultra product tier a "$230 mistake"
[image]
David Cox @neurobongo
@Google Well, at least your AI knows how your bullshit works, even if you have no customer service department.
[image]
David Cox @neurobongo
Good work, @Google. Even your AI knows how bad you guys are at this stuff.
David Cox @neurobongo
Gemini tells it like it is re: Google. (My primary Gmail account is stuck in some weird state where it thinks I am a business.)
[image]
David Cox reposted
Rameswar Panda @rpanda89
At IBM Research, we're always interested in connecting with researchers and engineers who are passionate about building open and efficient LLMs for enterprise. If this aligns with your interests, you can share your background with our team using: 👉 forms.gle/DFoBFTAS7Cb2ws…
David Cox @neurobongo
I wanted to send a broadcast message over our Google Home speakers, and the Google Home app told me I had to go to the Gemini app to do that. That didn't work, but I did get to have this unhinged conversation, so consider me a satisfied customer.
[image]
Jeremy Eder 🤘 @jeremyeder
The whole thing is Apache-2.0. The whole thing. With complete data lineage. It's free. Why would anyone do this, @neurobongo? 💪 research.ibm.com/blog/ibm-grani…

Quoting Red Hat @RedHat:
Red Hat CEO @MattHicksJ says open sourcing model weights is just the start: "Openness should begin with model weights, but be further supported by an open ecosystem of tools and platforms that prevent vendor lock-in." Read his argument for genuine open source AI via @Forbes: bit.ly/4pvTCjF
Alexander Doria @Dorialexander
Breaking: we release SYNTH, a fully synthetic generalist dataset for pretraining, and two new SOTA reasoning models trained exclusively on it. Despite having seen only 200 billion tokens, Baguettotron is currently best-in-class in its size range.
[image]
David Cox reposted
Sachin Desai @sach1n
This is Granite 4.0 Nano running on an iPhone 17 Pro with MLX Swift. This is the 350M parameter model, and the 1B runs equally well. @IBMDeveloper
David Cox reposted
Qubitium @qubitium
🏔️ Granite 4 Nano quantization support has been added to GPT-QModel! Both H-1B and H-350M W4A16 quantized models are now available on HF. 🤗 Download link in comments. Our eval scores also validate that they are indeed tier-one small models. 👇 @neurobongo

Quoting IBM Developer @IBMDeveloper:
Introducing Granite 4.0 Nano, compact and open-source models built for AI at the edge. Available in 350M and 1B sizes, for building AI on laptops and mobile devices: ibm.co/6013Bzpt7
David Cox @neurobongo
@senthazalravi @xenovacom They do; every Granite model is built via a process and management system that has been externally audited.
David Cox reposted
Xenova @xenovacom
IBM just released Granite-4.0 Nano, their smallest LLMs ever (350M & 1B)! 😍 The models demonstrate remarkable instruction following and tool calling capabilities, and can even run locally in-browser! This means they can interact with websites and call browser APIs for you! 🤯
David Cox reposted
Marktechpost AI Dev News ⚡ @Marktechpost
IBM AI Team Releases Granite 4.0 Nano Series: Compact and Open-Source Small Models Built for AI at the Edge

Small models are often blocked by poor instruction tuning, weak tool-use formats, and missing governance. The IBM AI team released Granite 4.0 Nano, a small model family that targets local and edge inference with enterprise controls and open licensing. The family includes 8 models in two sizes, 350M and about 1B, with both hybrid SSM and transformer variants, each in base and instruct versions. The Granite 4.0 Nano series models are released under the Apache 2.0 license, with native architecture support on popular runtimes like vLLM, llama.cpp, and MLX.

Full analysis: marktechpost.com/2025/10/29/ibm…
Model weights: huggingface.co/collections/ib…
@IBM @IBMwatsonx @IBMResearch @IBMData @IBMcloud @IBMDeveloper
[image]
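The post above notes that the Nano models ship with native support on common runtimes. As a minimal sketch (not from the thread itself), here is how one of the instruct models could be loaded locally with Hugging Face transformers; the model id is an assumption based on IBM's naming on the Hub, so check the linked collection for the exact identifier:

```python
# Minimal local-inference sketch for a Granite 4.0 Nano instruct model.
# ASSUMPTION: the model id below follows ibm-granite's Hub naming; verify
# the exact id in the Hugging Face collection linked in the post above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-4.0-350m"  # 1B and hybrid (-h-) variants also exist

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Instruct models expect their chat template, so format the prompt with it.
messages = [{"role": "user", "content": "List three uses for a small on-device LLM."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate and decode only the newly produced tokens.
outputs = model.generate(inputs, max_new_tokens=128)
reply = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(reply)
```

The same weights can also be served through vLLM or converted to GGUF for llama.cpp; the transformers path above is simply the most portable starting point.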
David Cox @neurobongo
We just dropped the "nano" versions of our Granite 4 language models (1B and 350M sizes, in both vanilla and hybrid SSM versions). I will let the Pareto curve speak for itself (full benchmark details in the link below). Try them out and let us know what you think.
[image]