Will Carkner
@CarknerWill · 253 posts
curiously optimistic abundance engineer
Dublin, Ireland · Joined November 2021
543 Following · 232 Followers
Will Carkner retweeted

HUGE session last night for the 55th edition of demosanon.com. 70+ ppl!
- @CarknerWill & Johnny talked about PAT testing & developing custom hardware for office power strips
- Rathe demo'd how Gemell procedurally generates photorealistic digital twins of textiles directly from production data & CAD files
- Leon & Charlie talked about using in-room radar and edge-ML to detect falls and long-lies in care homes
thank you to @AkashBajwa96 from @EarlybirdVC for making this session possible 🙌
if you'd like to join us next time, dm me :)
@adityajoi 👊 on to the next one





performative vibecoding on a €13 @Ryanair flight
if the landing is hard I'm pushing straight to prod
@cursor_ai

Will Carkner retweeted

Excited to release new repo: nanochat!
(it's among the most unhinged I've written).
Unlike my earlier similar repo nanoGPT, which only covered pretraining, nanochat is a minimal, from-scratch, full-stack training/inference pipeline for a simple ChatGPT clone in a single, dependency-minimal codebase. You boot up a cloud GPU box, run a single script, and as little as 4 hours later you can talk to your own LLM in a ChatGPT-like web UI.
It weighs ~8,000 lines of imo quite clean code to:
- Train the tokenizer using a new Rust implementation
- Pretrain a Transformer LLM on FineWeb, evaluate CORE score across a number of metrics
- Midtrain on user-assistant conversations from SmolTalk, multiple choice questions, tool use.
- SFT, evaluate the chat model on world knowledge multiple choice (ARC-E/C, MMLU), math (GSM8K), code (HumanEval)
- Optionally RL the model on GSM8K with "GRPO"
- Run efficient inference of the model in an Engine with KV cache, simple prefill/decode, and tool use (Python interpreter in a lightweight sandbox); talk to it over the CLI or a ChatGPT-like WebUI.
- Write a single markdown report card, summarizing and gamifying the whole thing.
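For anyone unfamiliar with the prefill/decode split mentioned in the inference bullet, here is a minimal single-head sketch of the KV-cache pattern in plain NumPy; it is not nanochat code, and the dimensions, weights, and token embeddings are arbitrary stand-ins.

import numpy as np

rng = np.random.default_rng(0)
d = 16                                    # head dimension (arbitrary for the sketch)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def attend(q, K, V):
    # one query attending over every cached key/value
    scores = (K @ q) / np.sqrt(d)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return V.T @ w

# prefill: encode the whole prompt once and cache K/V for every position
prompt = rng.normal(size=(5, d))          # stand-in for 5 prompt-token embeddings
K_cache = prompt @ Wk.T
V_cache = prompt @ Wv.T

# decode: each new token computes only its own q/k/v, appends them to the cache,
# and attends over the whole cache, so the prefix is never re-encoded
x = rng.normal(size=d)                    # stand-in for the next token's embedding
for _ in range(3):
    q, k, v = Wq @ x, Wk @ x, Wv @ x
    K_cache = np.vstack([K_cache, k])
    V_cache = np.vstack([V_cache, v])
    x = attend(q, K_cache, V_cache)       # stand-in for the rest of the model
print(K_cache.shape)                      # (8, 16): 5 prefill positions + 3 decode steps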
Even for as low as ~$100 in cost (~4 hours on an 8XH100 node), you can train a little ChatGPT clone that you can kind of talk to, and which can write stories/poems and answer simple questions. After about ~12 hours of training it surpasses the GPT-2 CORE score. As you scale further up towards ~$1000 (~41.6 hours of training), it quickly becomes a lot more coherent and can solve simple math/code problems and take multiple choice tests. E.g. a depth-30 model trained for 24 hours (roughly the FLOPs of GPT-3 Small 125M, and about 1/1000th of GPT-3) gets into the 40s on MMLU, the 70s on ARC-Easy, the 20s on GSM8K, etc.
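As a quick back-of-the-envelope check of those numbers (the hourly node rate below is implied by the figures in the post, not stated in it):

node_rate = 100 / 4                 # ~$100 for ~4 hours -> ~$25/hr for an 8XH100 node
hours_for_1000 = 1000 / node_rate   # ~40 hours, in the same ballpark as the ~41.6 h quoted
print(node_rate, hours_for_1000)    # 25.0 40.0 (the $1000 / 41.6 h figures imply ~$24/hr)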
My goal is to get the full "strong baseline" stack into one cohesive, minimal, readable, hackable, maximally forkable repo. nanochat will be the capstone project of LLM101n (which is still being developed). I think it also has potential to grow into a research harness, or a benchmark, similar to nanoGPT before it. It is by no means finished, tuned or optimized (actually I think there's likely quite a bit of low-hanging fruit), but I think it's at a place where the overall skeleton is ok enough that it can go up on GitHub where all the parts of it can be improved.
Link to repo and a detailed walkthrough of the nanochat speedrun is in the reply.


the bottom line: if you believe in more electricity, you must believe in better wires and smarter grids. transmission is the underrated chokepoint.
for some more in-depth thoughts, check out my recent post at willcarkner.com/blog/energy

@rath_core @notBrunoG the ELECTRICITY AGE is here. we need better storage and distribution, and special stuff is happening at Lumindt.
if what Lumindt is doing sounds interesting, i’d encourage checking them out. it was a super fun place to work and taught me loads.
lumindt.com

a few photos of my work:
wouldn’t have been possible without the team at Lumindt, s/o @rath_core and @notBrunoG for taking a chance on me




@jackoregankenny @GodelTerminal @openbb_finance tried godel terminal. very good but initial config with a new account could be improved

I’ve used a Bloomberg terminal in college maybe 3 or 4 times, and the experience of both @GodelTerminal and @openbb_finance is a lot nicer to the dumb user




