jessie@Fat-Garage 🦇🔊

3.2K posts


@JESSCATE93

https://t.co/lkQuFy0vkJ (WeChat official account: 胖车库) / student of intelligence / Bitcoin / roam / activate for life 🏃‍♀️

somewhr · Joined April 2016
1.3K Following · 938 Followers

Pinned Tweet
jessie@Fat-Garage 🦇🔊 @JESSCATE93 ·
My life goal is to bridge tools for thought and the blockchain; the purpose is to make human thoughts valuable. But something is missing between people's thoughts and the pure blockchain. I'm always thinking about this: what is it? First, let's talk about the roles of each.
13 replies · 9 reposts · 63 likes
James @Jamesward_eth ·
@JESSCATE93 Hi jessie, your DMs are closed
1 reply · 17 views
James @Jamesward_eth ·
@JESSCATE93 Hey jessie, I'm looking to make a deal for your ENS name 'conaw.eth'. Can you follow me, please? Thanks a lot!
1 reply · 15 views
jessie@Fat-Garage 🦇🔊 reposted
Andrej Karpathy @karpathy ·
New 3h31m video on YouTube: "Deep Dive into LLMs like ChatGPT". This is a general-audience deep dive into the Large Language Model (LLM) AI technology that powers ChatGPT and related products. It covers the full training stack of how the models are developed, along with mental models of how to think about their "psychology", and how to get the best use out of them in practical applications. We cover all the major stages:
1. Pretraining: data, tokenization, Transformer neural network I/O and internals, inference, GPT-2 training example, Llama 3.1 base inference examples
2. Supervised finetuning: conversations data, "LLM psychology": hallucinations, tool use, knowledge/working memory, knowledge of self, models need tokens to think, spelling, jagged intelligence
3. Reinforcement learning: practice makes perfect, DeepSeek-R1, AlphaGo, RLHF
I designed this video for the "general audience" track of my videos, which I believe is accessible to most people, even without a technical background. It should give you an intuitive understanding of the full training pipeline of LLMs like ChatGPT, with many examples along the way, and maybe some ways of thinking about current capabilities, where we are, and what's coming. (Also, I have one "Intro to LLMs" video already from ~a year ago, but that is just a re-recording of a random talk, so I wanted to loop around and do a much more comprehensive version of this topic. They can still be combined, as the talk goes a lot deeper into other topics, e.g. LLM OS and LLM security.) Hope it's fun & useful! youtube.com/watch?v=7xTGNN…
770 replies · 2.9K reposts · 20.2K likes · 2.4M views
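The pretraining stage in Karpathy's list boils down to one objective: predict the next token from the ones before it. A toy sketch can make that concrete; here a character-level bigram count model stands in for the Transformer (the corpus, and the names `train_bigram` and `predict_next`, are illustrative, not from the video):

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """'Pretraining' in miniature: for each character, count how
    often every other character immediately follows it."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(text, text[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, ch):
    """Greedy 'inference': return the most frequent continuation
    of ch seen during training, or None if ch never appeared."""
    return counts[ch].most_common(1)[0][0] if ch in counts else None

model = train_bigram("hello hello world")
print(predict_next(model, "h"))  # 'h' was always followed by 'e'
```

A real LLM replaces the count table with a Transformer that generalizes across contexts, and conditions on many preceding tokens rather than one character, but the training signal is this same next-token prediction.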
jessie@Fat-Garage 🦇🔊 reposted
Felix Hill @FelixHill84 ·
The more I research, the more I'm convinced that Wittgenstein, Eleanor Rosch and David Rumelhart could have, collectively, designed the Transformer. 🧵
9 replies · 28 reposts · 291 likes · 55.9K views
jessie@Fat-Garage 🦇🔊 reposted
🌿 lithos @lithos_graphein ·
Everyone associates ASML, a relatively obscure Dutch company, with EUV Lithography, but here's a good list of the other companies also supporting this emerging market that made the AI era possible.
12 replies · 98 reposts · 420 likes · 159K views
jessie@Fat-Garage 🦇🔊 @JESSCATE93 ·
Hinton, G. E. (1992). How neural networks learn from experience. Scientific American.
1 repost · 2 likes · 272 views
jessie@Fat-Garage 🦇🔊 @JESSCATE93 ·
One of the most wonderful contributions of language AI is transforming jargon into cats and dogs
77 views
juand4bot @juand4bot ·
I’ve been creating startups and trying things for about 7 years with no success, and I feel frustrated. But I'll keep going. It hurts, but I fear the pain of not trying even more. Maybe I haven't been consistent enough. So no plan B, just improve.
3 replies · 2 likes · 283 views