Nico @itmos__
449 posts · Joined January 2025
2.4K Following · 124 Followers
Nico @itmos__:
@_ueaj you may get called a dumb, uncultured pleb if you say this out loud, but a lot of the "western canon"/classic books are a boring chore because of this. I got nothing from reading 1984. Many ancient books are just generic hero's journeys. I'm very picky about what I read nowadays!
ueaj @_ueaj:
oh I know. I took an ethics class in my undergrad as well that had an enormous non-linear influence on my philosophy. I remember all of those texts, and will certainly cite them. I also watched some philosophy content on yt when I was younger, and I remember all those influences as well. And of course I live in a society downstream of their thoughts. Also, I am an ~idle thinker and very introspective. Even small amounts of information have an outsized non-linear impact on who I am today. I'm just less confident about how much additional value I would get from reading the non-distilled source texts, considering I've likely already recovered most of their information from synthesizing snippets and life experience. This is mostly me questioning whether it's worth reading non-selected excerpts or distillations.
ueaj @_ueaj:
I've started transcribing my world model, ethics, aesthetics, etc. in an Obsidian notebook before I start reading, just to see if anyone else has actually ever said anything of worth, or if I was actually completely justified in not reading any books since middle school
Nico reposted
Ai2 @allen_ai:
Grounding lets vision-language models do more than describe: they can point to where a robot should grasp, which button to click, or which object to track across video frames. Today we're releasing MolmoPoint, a better way for models to point. 🧵
Nico @itmos__:
@dillon_mulroy damn, are we really shifting back this hard? how come everyone broke out of the spell at the same time? I think we might be overcorrecting a bit lol
Dillon Mulroy @dillon_mulroy:
i think i'm back to wanting a really good tab model - any progress here outside of cursor (i don't have access to supermaven), and for nvim?
Nico @itmos__:
@rogeriomarquest @katech0n was about to ask the same thing. Idk how anyone could seriously believe this after reading Zhuangzi or just a few sutras. The Anattalakkhaṇa Sutta presupposes a rich inner world and a strong sense of self, since that's the very "illusion" it's trying to break down.
Roger Marques @rogeriomarquest:
@katech0n Did you read anything written more than 1,400 years ago?
Aditya @katech0n:
I hate to be that guy, but it is prob both true that:
- the ancients did not have an "inner world" in the way that we think they did
- they had better lives as a result
and retardmaxxing is the wrong solution to the right diagnosis.

Try this 10-second exercise, you'll see what I mean. Look at your phone/laptop and answer the question "what is this object" WITHOUT using any words. You should start to feel a tingling at the back of your head, maybe even a sense of anxiety. Do this for a day* and you will understand:
- Plato
- Genesis chapter 2 / John chapter 1
- early Wittgenstein
- basic angelology
- Lana Del Rey
- why Guénon/Evola hated fascism
- why psychedelics are stupid
- why Kant is a bastard who should be put on trial for the murder of philosophy and every patriot should spit on his grave

There's a lot of evidence that this is the default state of the ancients, unmediated by the post-Industrial excess of information/categories. So when Marcus Aurelius tells you he's sad, he's probably experiencing something very different from you and me.

* heads up: if you do this for > 1 week you will genuinely go insane. speaking from partial experience. would not recommend

Quoting roon @tszzl:
@pmarca an entire book where the guy is introspecting
Nico @itmos__:
@corsaren if you take the standalone route, lmk which text editing library you end up choosing! I tend to overthink a lot, and I got stuck choosing between Lexical, Tiptap (a ProseMirror wrapper), or just ProseMirror directly
corsaren @corsaren:
@itmos__ lol this was also one of the first questions that came to my mind when considering it. the Obsidian plugin route seems potentially convenient, but I also kinda just want the full freedom to do whatever. also, what was the other thing you were thinking about?
corsaren @corsaren:
I would like a writing app where I can simultaneously store 10 different iterations of the same phrase, sentence, paragraph, section, etc. and hot-swap them in the editor.

This would be useful to maybe 5 other xNTPs who also write like picky schizos, but fuck it, it's 2026, TAM is fake
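The data structure behind this idea is small: each editable span keeps a list of iterations plus a pointer to the active one, and rendering just picks the active variant. Here is a minimal sketch, assuming nothing beyond the tweet — `VariantSlot` and `render_doc` are hypothetical names, not anything corsaren has actually built:

```python
from dataclasses import dataclass, field

@dataclass
class VariantSlot:
    """One span of text with several stored iterations; exactly one is active."""
    variants: list[str] = field(default_factory=list)
    active: int = 0

    def add(self, text: str) -> int:
        """Store a new iteration and return its index."""
        self.variants.append(text)
        return len(self.variants) - 1

    def swap(self, index: int) -> None:
        """Hot-swap which iteration renders, without discarding the others."""
        if not 0 <= index < len(self.variants):
            raise IndexError(f"no variant {index}")
        self.active = index

    def render(self) -> str:
        return self.variants[self.active]

def render_doc(parts: list) -> str:
    """A document is plain strings interleaved with VariantSlots;
    rendering concatenates text and each slot's active variant."""
    return "".join(p.render() if isinstance(p, VariantSlot) else p for p in parts)

# Two iterations of an opening line, hot-swapped without losing either.
opening = VariantSlot()
opening.add("It was a dark and stormy night.")
opening.add("The night was dark, and stormy besides.")
doc = ["Chapter 1. ", opening]
opening.swap(1)
```

Because swapping only moves a pointer, every rejected iteration stays recoverable, which is the whole point of the app as described.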
Nico @itmos__:
@corsaren it was a while ago, can't remember the details, but you were thinking about ways to have more control over context management and had super similar ideas to mine
Nico @itmos__:
"I care not just about all the Dyson spheres that AI's gonna build in an autonomous way - I care about what happens to humans. I want humans to be well-off in this future. And I feel like that's where I can uniquely add value [more] than, like, an incremental improvement in a frontier lab"
sarah guo @saranormous:
excited to have sensei @karpathy back on @nopriorspod tomorrow. what wisdom shall we glean? (yes, yes, wth is going on with coding agents)
Nico @itmos__:
posts like these give me more anxiety and angst than any doompost. Just pure, empty FOMO. They never offer any actionable advice, and they always seem to come from people who do nothing but post on twitter all day and hang out at SF parties. They never have anything to show for inspiration.
Nico @itmos__:
@Impish_Bunny @zeta_globin all stimulants do this, even caffeine. imo it's probably fine at low doses. I'd never recommend vaping though, no matter the dose. Makes it way too easy to overdo it and get addicted.
ylareia @Impish_Bunny:
@zeta_globin nicotine!!!! smoking is ofc the worst, but all nicotine
ylareia @Impish_Bunny:
reading about how nicotine is one of the most potent skin agers out there..... throwing away my vape..........
Nico reposted
will brown @willccbb:
you can create complex agentic environments and launch RL training runs with a single prompt. deploy trained inference endpoints with a single click. no GPUs, no SSH, no vLLM. just `prime`. guide: docs.primeintellect.ai/guides/rl-trai…
Nico reposted
Chen Liang @crazydonkey200:
@karpathy Very inspiring as always! We are also open-sourcing part of our infra for automated research, for Gemini to evolve itself, at github.com/google-deepmin… More complex than the nanochat setup, but closer to SOTA LLM pre/post-training while staying as minimal as possible. More on the way.
Nico reposted
Han Xiao @hxiao:
Given an embedding vector, you can tell which model produced it. I trained a 0.8M-parameter transformer that fingerprints embedding models by reading raw float digits (vocab size: 15). Fully end-to-end, zero feature engineering.
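The tweet doesn't spell out what the 15-symbol vocabulary is, but "reading raw float digits" suggests one token per character needed to print a float. A plausible sketch, purely as an assumption: the digits 0-9, plus '.', '-', 'e', '+', and a space separating coordinates — exactly 15 symbols when each value is rendered in scientific notation. The function names below are hypothetical, not from Han Xiao's code:

```python
# Hypothetical input encoding for the fingerprinting setup described above.
# Assumed 15-symbol vocabulary: digits 0-9, '.', '-', 'e', '+', and a space
# used to separate the embedding's coordinates.
VOCAB = "0123456789.-e+ "
TOKEN = {ch: i for i, ch in enumerate(VOCAB)}

def tokenize_embedding(vec, precision=4):
    """Print each coordinate in scientific notation, then map every
    character to a token id, so a transformer sees the literal digits."""
    text = " ".join(f"{x:.{precision}e}" for x in vec)
    return [TOKEN[ch] for ch in text]

def detokenize(ids):
    """Invert the mapping back to the printed text."""
    return "".join(VOCAB[i] for i in ids)

ids = tokenize_embedding([0.1234, -5.6e-3])
```

A classifier trained on such sequences could pick up model-specific regularities in digit distributions (e.g. quantization or normalization artifacts), which is consistent with the "zero feature engineering" claim: the only preprocessing is printing the floats.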
Tim Culverhouse @rockorager:
What makes a language good for agents?
Nico reposted
StepFun @StepFun_ai:
"can we get the base model?" sure. here's two.
"can we get the code?" sure. here's SteptronOSS.
"what about the SFT data?" coming soon.
maximum sincerity, minimum barriers.
- Step 3.5 Flash Base — pretrained foundation
- Step 3.5 Flash Base-Midtrain — code, agents & long-context
- SteptronOSS — open-sourced, ready for your custom workflows
- SFT Data — coming soon for reference
not just the final checkpoint — a customizable pipeline.
🤗 huggingface.co/stepfun-ai/Ste…
🤗 huggingface.co/stepfun-ai/Ste…
💻 github.com/stepfun-ai/Ste…
Nico @itmos__:
@teortaxesTex we will never hear the end of it lmao