Pinned Tweet
Uncle OJ
33.9K posts

Uncle OJ
@uncooloj
✈️ | Building boring things that work on the internet… 🚢
Joined September 2011
6.6K Following · 6.3K Followers

@uncooloj @cursor_ai So I had a long-running task that was supposed to look at a bunch of existing frontend code and create a new UX for another feature, taking elements from them.
Codex did really sloppy work, but composer actually figured things out.
code quality is better with codex sha

@cursor_ai composer 1.5 is surprisingly really good.
It’s done better than the gpt codex model at some tasks and I’m actually wowed
Uncle OJ reposted

Not an easy thing to do, sharing your personal story but quite inspiring for others. This is why I love Micah, he's genuine with it. Props to Henry for giving him the space to do so.
CBS Sports Golazo ⚽️ @CBSSportsGolazo
A really honest and vulnerable moment from the #UCLToday crew to open today's show 🥹❤️
Uncle OJ reposted


codex has gotten very good!
Garry Tan @garrytan
OK Codex is GOAT at finding bugs and finding plan errors
Uncle OJ reposted

GPT-5.4 mini matters for subagents because it changes what feels worth handing off.
The parent thread should hold the architecture, plan, and progress narrative.
Fast subagents can explore the repo, check hypotheses, and preserve the parent thread’s limited attention.
OpenAI Developers @OpenAIDevs
We’re introducing GPT-5.4 mini and nano, our most capable small models yet. GPT-5.4 mini is more than 2x faster than GPT-5 mini. Optimized for coding, computer use, multimodal understanding, and subagents. For lighter-weight tasks, GPT-5.4 nano is our smallest and cheapest version of GPT-5.4. openai.com/index/introduc…
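The parent/subagent split described in the tweet above can be sketched in a few lines. This is a minimal illustration, not a real agent API: `run_subagent` is a hypothetical stand-in for a cheap, fast model call (the kind of lightweight task GPT-5.4 mini is pitched at), and the parent only ever receives short summaries, so its own context stays focused on the plan.

```python
# Sketch of the parent-thread / subagent pattern from the tweet above.
# run_subagent is a hypothetical stand-in for a fast small-model call;
# a real setup would dispatch each scoped question to a cheap model
# and hand back a compact summary, never raw repo contents.
from concurrent.futures import ThreadPoolExecutor

def run_subagent(task: str) -> str:
    """Explore one scoped question and return a compact summary."""
    return f"summary({task})"

def parent_agent(plan: list[str]) -> dict[str, str]:
    """The parent holds the plan; subagents run scoped lookups in
    parallel so the parent's limited attention is preserved."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        summaries = pool.map(run_subagent, plan)  # order matches plan
    # The parent only sees one short summary per subtask.
    return dict(zip(plan, summaries))

results = parent_agent(["locate auth middleware", "check test coverage"])
print(results["locate auth middleware"])  # a summary, not a file dump
```

The design point is the interface, not the threading: whatever runs the subtasks, each one returns a summary sized for the parent's context rather than everything it read along the way.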