Colin Wright
66 posts
@CWrightswold
home.
Southwold, England · Joined June 2012
96 Following · 2 Followers
Colin Wright @CWrightswold
@NotebookLM Jesus, how about letting us multi-select and delete? Or download sources so they can be reused in another notebook?
1 reply · 0 reposts · 3 likes · 235 views
matt palmer @mattyp
vlog 001 - why I left replit
48 replies · 2 reposts · 453 likes · 42.8K views
Paul Graham @paulg
Stockholm is remarkably walkable. At one point we were walking somewhere and we needed to check a map. It was such a relief not to have to think about the phone being stolen. In London we always duck into a doorway before checking a phone on the street.
231 replies · 70 reposts · 3.4K likes · 425.3K views
Olivia Moore @omooretweets
@mil000 I missed the part where you said what your job was… or maybe spending all day on X attempting to dunk on people who are actually doing things counts
4 replies · 0 reposts · 59 likes · 6K views
Milo Smith @mil000
just remember, these are the A16Z partners closing major deals. The people who have emotional relationships to an OpenClaw
17 replies · 39 reposts · 1.1K likes · 79.6K views
Colin Wright @CWrightswold
@emollick This has been the issue with politics (and life in general) for the last, what, 20 years? X is poisonous in so many ways
0 replies · 0 reposts · 0 likes · 137 views
Ethan Mollick @emollick
Generally, I would say X is not real life, but I am surprised by how often I get asked by executives which AI lab is winning, or what is up with a particular model, in ways that indicate they clearly come from X discussions & rumors (often filtered through LinkedIn)
36 replies · 10 reposts · 266 likes · 20.2K views
mike @Michmue2
@BrianNorgard Apple will obviously soon bring their own improved version. Also, you can run your own local version of whisper flow at zero cost, just fyi
1 reply · 0 reposts · 1 like · 1.8K views
Norgard @BrianNorgard
This one screen has completely destroyed my WisprFlow user experience.
[image attached]
97 replies · 7 reposts · 603 likes · 173K views
Colin Wright @CWrightswold
@RobertJBye The way you have to go through 4+ clicks to get to projects is a bit annoying
0 replies · 0 reposts · 0 likes · 21 views
Robert Bye @RobertJBye
I’ve started reserving Friday afternoons to fix little UI/UX things that bug me in the Claude mobile app - like allowing people to customise shortcuts in our widget. LMK what bugs or UI gripes you have and I’ll try to fix them!
28 replies · 2 reposts · 83 likes · 7.1K views
Colin Wright @CWrightswold
@Teknium Will they be switchable from the dashboard, or do I still have to do it within the terminal? Can we switch provider and model, or just model?
0 replies · 0 reposts · 1 like · 24 views
Teknium 🪽 @Teknium
dashboard getting an upgrade for setting main agent and auxiliary models tonight ;)
[image attached]
37 replies · 15 reposts · 405 likes · 10.3K views
Josh Woodward @joshwoodward
New in Gemini: Generate files and export them. Tell Gemini what you want to create and the format, and it now does the work for you. Now supporting:
📄 Google Docs, Word (.docx) & PDFs
📊 Google Sheets, Excel (.xlsx) & CSV
🖥️ Google Slides
🛠️ Markdown, LaTeX, TXT, RTF
Available now on all surfaces globally!
151 replies · 275 reposts · 3K likes · 330.1K views
Colin Wright @CWrightswold
@BearNotesApp Have you built in any hashtag-level inclusion or exclusion logic? There may be notes that I'm happy to include (for example, professional stuff), but other, more personal stuff I might not want to.
1 reply · 0 reposts · 1 like · 80 views
Bear - Markdown Notes @BearNotesApp
@CWrightswold Unfortunately we can't control how third-party tools work, but we have made a lot of effort to prevent such a thing from happening. That being said, we still highly recommend making regular backups. At the same time, we've noted down your request for note versioning. Thank you! ❤️
1 reply · 0 reposts · 1 like · 492 views
Bear - Markdown Notes @BearNotesApp
#update Good news! Bear now has an official CLI, Claude Connector, and MCP Server. Build automations, connect your favorite AI tools, or just ask Claude what kind of person your notes say you are. The door is now open. Read the full story: blog.bear.app/2026/04/bear-2…
19 replies · 20 reposts · 253 likes · 24.1K views
Colin Wright @CWrightswold
@theo Chill out, you big baby. It happens within minutes.
0 replies · 0 reposts · 0 likes · 362 views
Theo - t3.gg @theo
In order to use LM Studio with my local models on my local network, I have to apply to use "LM Link" and hope they get me off the waitlist? Are you joking?
[image attached]
102 replies · 8 reposts · 850 likes · 92.9K views
Nous Research @NousResearch
Nous Portal offers everything you need to build with Hermes Agent in one easy subscription:
→ 300+ models from every frontier lab
→ Free models and discounts exclusive to the Portal
→ Bundled tools: web search, scraping, image gen, browser use, code execution, & voice
portal.nousresearch.com/manage-subscri…
[image attached]
56 replies · 54 reposts · 824 likes · 558.5K views
Colin Wright @CWrightswold
@joshwoodward Would be great if we could download the sources and also multi-delete.
0 replies · 0 reposts · 0 likes · 13 views
Colin Wright @CWrightswold
@felixrieseberg Hi Felix, it would be great if we could switch models mid-conversation in Cowork.
0 replies · 0 reposts · 0 likes · 76 views
Felix Rieseberg @felixrieseberg
Tiny feature, also a bit of a nerdy one: We've added deep links so you can pre-fill new sessions in Chat, Cowork, or Code with a claude:// link. I like using it for composition in my workflows or adding "Open in Claude" buttons to other apps. It does require the desktop app. support.claude.com/en/articles/14…
17 replies · 11 reposts · 163 likes · 13.7K views
Colin Wright @CWrightswold
@trq212 Hey - will the macOS version of CC ever get feature parity with the terminal version, or is that not technically possible?
0 replies · 0 reposts · 1 like · 72 views
Colin Wright @CWrightswold
@gkisokay It doesn't tell you what models will work for a given computer spec though, will it? There still seems to be an element of guesswork involved.
0 replies · 0 reposts · 0 likes · 34 views
Graeme @gkisokay
@CWrightswold Sure, will do. You can also look on HuggingFace to see what models you can run.
1 reply · 0 reposts · 0 likes · 717 views
Graeme @gkisokay
The Local LLM cheat sheet for your 16GB RAM device

I pulled together a lineup of small models that can run comfortably on a Mac Mini or personal laptop while still leaving room for context without melting your machine.

Models for Daily Use
Qwen3.5 9B / GGUF / Q4_K_M: Daily driver. General chat, drafting, research, translation. If you're keeping only one, keep this.
DeepSeek-R1 Distill Qwen 7B / GGUF / Q4_K_M: Reasoning engine. Math, logic, step-by-step problems. Slower, but worth it when you need actual thinking.

Models for Specialty Work
Qwen2.5 Coder 7B / GGUF / Q4_K_M: Code specialist. Completions, refactors, debugging, repo Q&A. Better than a generalist when the task is code.
Llama 3.1 8B / GGUF / Q4_K_M: Long context worker. RAG, doc chat, codebase Q&A. The output isn't top tier, but the context is strong for its size.
Phi-4 Mini Reasoning / GGUF / Q4_K_M: Compact thinker. Logic, structured answers, math, and short coding bursts. Smaller context is the catch.

Models for Efficiency
Gemma 4 E4B / GGUF / Q4_K_M: Light all-rounder. Writing, chat, light agents, structured output.
Phi-3.5 Mini / GGUF / Q5_K_M: Pocket sidekick. Summaries, extraction, background doc chat. Easy to pair with a bigger model.
Qwen3.5 2B / GGUF / Q4_K_M: Useful for summaries, tagging, rewrites, and lightweight sidekick work.

Micro Models
Qwen3.5 0.8B / GGUF / Q5_K_M: Classification, keyword routing, binary decisions, triage.
Gemma 4 E2B-it / GGUF / Q4_K_M: Lightweight chat, quick Q&A, summaries, tiny agents.

My personal choice for a single model is Qwen3.5 9B. For two models, use Qwen3.5 9B + Qwen2.5 Coder 7B for code, or Qwen3.5 9B + Phi-3.5 Mini for support tasks.

Let me know in the comments your experience with these models, or any I have left out.
[image attached]
98 replies · 344 reposts · 2.3K likes · 411.4K views
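The question raised earlier in the thread (which models will actually work on a given spec?) can be roughed out arithmetically rather than guessed. The sketch below is a back-of-envelope estimator, not a measurement: the ~4.85 bits/weight for Q4_K_M, ~5.5 for Q5_K_M, and the 1.5 GB overhead allowance for KV cache and runtime are assumptions; real memory use varies with context length and runtime.

```python
# Back-of-envelope check: will a GGUF-quantized model fit in RAM?
# Assumed effective bits per weight for common GGUF quant levels
# (approximate, includes quantization block overhead).
BITS_PER_WEIGHT = {"Q4_K_M": 4.85, "Q5_K_M": 5.5, "Q8_0": 8.5, "F16": 16.0}

def est_model_gb(params_billions: float, quant: str) -> float:
    """Estimated in-memory size of the weights, in GB."""
    bits = BITS_PER_WEIGHT[quant]
    return params_billions * 1e9 * bits / 8 / 1e9

def fits_in_ram(params_billions: float, quant: str, ram_gb: float,
                overhead_gb: float = 1.5) -> bool:
    """True if weights plus an assumed KV-cache/runtime overhead fit."""
    return est_model_gb(params_billions, quant) + overhead_gb <= ram_gb

# Spot-check a few entries from the cheat sheet against a 16 GB machine.
for name, params, quant in [
    ("Qwen3.5 9B", 9, "Q4_K_M"),
    ("Llama 3.1 8B", 8, "Q4_K_M"),
    ("Phi-3.5 Mini", 3.8, "Q5_K_M"),
]:
    gb = est_model_gb(params, quant)
    print(f"{name}: ~{gb:.1f} GB weights, fits in 16 GB: "
          f"{fits_in_ram(params, quant, 16)}")
```

By this estimate every model on the list leaves several GB free on a 16 GB machine, which matches the cheat sheet's "room for context" framing; a 70B model at Q4_K_M (~42 GB of weights) clearly would not.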