kobikadosh.eth

919 posts

@KobiKadosh

Code Artisan, AI First, Web 2.0 solutionist & FCTO, Web3 novice 🚶‍♂️ Ξ kobikadosh.eth ⧫ Building @CaroDaShellShib | https://t.co/hl5KBYmSAA

🇨🇦 Vancouver, BC · Joined April 2011
2.7K Following · 215 Followers
kobikadosh.eth @KobiKadosh
How fast do you type? Or better yet, dictate 🚀 Choose your dictation tool today. Don't lose your flow while typing to your LLM or sharing your daily status. wisprflow.ai/r?KOBI2
kobikadosh.eth @KobiKadosh
Check out Caro - a safe shell command assistant powered by local AI! caro.sh
kobikadosh.eth retweeted
Boris Cherny @bcherny
13/ A final tip: probably the most important thing to get great results out of Claude Code -- give Claude a way to verify its work. If Claude has that feedback loop, it will 2-3x the quality of the final result.

Claude tests every single change I land to claude.ai/code using the Claude Chrome extension. It opens a browser, tests the UI, and iterates until the code works and the UX feels good.

Verification looks different for each domain. It might be as simple as running a bash command, or running a test suite, or testing the app in a browser or phone simulator. Make sure to invest in making this rock-solid. code.claude.com/docs/en/chrome
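The tip generalizes beyond Claude Code: any agent (or human) can loop edit → verify → repeat as long as there is a command whose exit code says pass or fail. A minimal sketch of such a feedback loop in Python, assuming a pytest-based project; the command and loop structure are illustrative assumptions, not Claude Code's actual mechanism:

```python
import subprocess

def verify(cmd: list[str]) -> bool:
    """Run a verification command; exit code 0 means the change passes."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        # Surface the tail of the output so the next edit can react to the failure
        print(result.stdout[-2000:] + result.stderr[-2000:])
    return result.returncode == 0

# The loop the thread describes: edit -> verify -> iterate until green
if verify(["pytest", "-q"]):
    print("Verified: test suite passes.")
```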
kobikadosh.eth retweeted
anshuman @athleticKoder
She dumped me last night.

Not because I don't listen. Not because I'm always on my phone. Not even because I forgot our anniversary (twice).

But because, in her exact words: "You only pay attention to the parts of what I say that you think are important."

I stared at her for a moment and realized... she just perfectly described the attention mechanism in transformers. Turns out I wasn't being a bad boyfriend. I was being mathematically optimal.

See, in conversations (and transformers), you don't give equal weight to every word. Some words matter more for understanding context. Attention figures out exactly HOW important each word should be.

Here's the beautiful math:

Attention(Q, K, V) = softmax(QK^T / √d_k)V

Breaking it down:
Q (Query): "What am I looking for?"
K (Key): "What info is available?"
V (Value): "What is that info?"
d_k: key dimension (for scaling)

Think of a library analogy: you have a question (Query). Books have titles (Keys) and content (Values). Attention finds which books are most relevant.

Step-by-step with "The cat sat on the mat":

Step 1: Create Q, K, V
Each word → three vectors via learned matrices W_Q, W_K, W_V.
For "cat":
Query: "What should I attend to when processing 'cat'?"
Key: "I am 'cat'"
Value: "Here's cat info"

Step 2: Calculate scores
QK^T = how much each word should attend to the others.
Processing "sat"? High similarity with "cat" (cats sit) and "mat" (where sitting happens).

Step 3: Scale by √d_k
Prevents the dot products from getting too large and keeps the softmax balanced.

Step 4: Softmax
Converts scores to probabilities that sum to 1:
"cat": 0.40 (subject)
"sat": 0.30 (action)
"mat": 0.20 (location)
"on": 0.05 (preposition)
"the": 0.05 (article)

Step 5: Weight values
Multiply each word's value vector by its attention weight and sum. Now "sat" knows it's most related to "cat" and "mat".

Multi-Head Magic:
Transformers do this multiple times in parallel:
Head 1: subject-verb relationships
Head 2: spatial ("on", "in", "under")
Head 3: temporal ("before", "after")
Head 4: semantic similarity
Each head learns a different type of relationship.

Why This Changed Everything:
Before: RNNs = reading with a flashlight (one word at a time, forgetting the beginning).
After: attention = floodlights on the entire sentence, with dimmer switches.

This is why ChatGPT can:
Remember 50 messages ago
Know "it" refers to something specific
Understand "bank" = money vs. river based on context

The Kicker:
Models learn these patterns from data alone. Nobody programmed grammar rules. The model figured out language structure just by predicting next words.

Attention is how AI learned to read between the lines.

Just like my therapist helped me understand my focus patterns, maybe understanding transformers helps us see how we decide what matters.

Now if only I could implement multi-head attention in dating... Still waiting for "scaled dot-product listening" to be invented.
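The formula above maps directly to a few lines of NumPy. A minimal sketch of steps 1-5: the embeddings and the projection matrices W_Q, W_K, W_V are random stand-ins here (a trained model learns them), and the dimensions are illustrative assumptions:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # Steps 2-3: similarity scores, scaled
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # Step 4: softmax, each row sums to 1
    return weights @ V, weights                     # Step 5: weighted sum of values

rng = np.random.default_rng(0)
tokens = ["The", "cat", "sat", "on", "the", "mat"]
d_model, d_k = 8, 4
X = rng.normal(size=(len(tokens), d_model))         # toy embeddings, one row per word

# Step 1: project embeddings into queries, keys, values (random stand-ins for learned weights)
W_Q, W_K, W_V = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out, weights = scaled_dot_product_attention(X @ W_Q, X @ W_K, X @ W_V)

# How much "sat" attends to every word in the sentence
print(dict(zip(tokens, weights[tokens.index("sat")].round(2))))
```

Multi-head attention then just runs this same routine several times in parallel with independent W_Q, W_K, W_V sets and concatenates the per-head outputs.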
kobikadosh.eth retweeted
Chatwoot @chatwootapp
Did you see the Chatwoot truck in SF today?
kobikadosh.eth @KobiKadosh
For years I hesitated to start investing because it felt complex and expensive. What made the difference for me was finding a platform that keeps things simple and educational while offering serious tools for anyone in Canada who wants to take investing into their own hands.
kobikadosh.eth @KobiKadosh
🌈🌈🌈 I just had 10099 Rainbow Points dropped into my wallet — plus an extra 9 Points as a bonus for migrating my MetaMask wallet into Rainbow 🦊 🔫 Everybody has at least 100 points waiting for them, but you might have more! Claim your drop: rainbow.me/points?ref=7Z8… 🌈🌈🌈