Marucho
2.4K posts

Marucho retweeted

Frieren timelapse I made recently.
Check my bio if you're interested in commissions!
#frieren #FrierenBeyondJourneysEnd
Marucho retweeted

@apollozen When I was younger, I was obsessed with a rhythm game. I got pretty good, top 1%. At some point I developed tendinitis and stopped playing as a result.
Since then, I don't think I've been obsessed with anything. Always tempering myself.
How do I regain that obsession?

The harsh truth is that 99% of people will never amount to anything in this life. Most will stay average, broke, unsuccessful, and controlled by the systems at play. Losers at best. Stuck in victim mode.
Discipline is what gets you moving. Obsession is what keeps you from stopping. Discipline teaches you to stay calm when everything’s falling apart, but obsession doesn’t care about calm. It eats at you. It makes you push when there’s nothing left to give.
Every successful person I’ve ever met had that look in their eyes, the one that only obsession gives you. I know a lot of disciplined people who are still broke. They show up, they work hard, they stay consistent, but they never cross that line into obsession. The good news is those people are close. They’re on their way to greatness; they just haven’t gone all in yet.
Discipline keeps you grounded, but obsession is what breaks you through. The 1% who make it have both.
Will you break through, anon?
ₕₐₘₚₜₒₙ@hamptonism
you have to be delusional. you have to be delusional. you have to be delusional. you have to be delusional. you have to be delusional. you have to be delusional. you have to be delusional. you have to be delusional. you have to be delusional.
Marucho retweeted

She dumped me last night.
Not because I don't listen.
Not because I'm always on my phone.
Not even because I forgot our anniversary (twice).
But because,
in her exact words:
"You only pay attention to the parts of what I say that you think are important."
I stared at her for a moment and realized...
She just perfectly described the attention mechanism in transformers.
Turns out I wasn't being a bad boyfriend. I was being mathematically optimal.
See, in conversations (and transformers), you don't give equal weight to every word. Some words matter more for understanding context. Attention figures out exactly HOW important each word should be.
Here's the beautiful math:
Attention(Q, K, V) = softmax(QK^T / √d_k)V
Breaking it down:
Q (Query): "What am I looking for?"
K (Key): "What info is available?"
V (Value): "What is that info?"
d_k: Key dimension (for scaling)
Think library analogy:
You have a question (Query). Books have titles (Keys) and content (Values). Attention finds which books are most relevant.
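If you like code better than symbols, here's a rough numpy sketch of that exact formula (the function name and shapes are mine for illustration, not from any particular library):

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) arrays produced by the learned projections W_Q, W_K, W_V
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # how well each query matches each key
    scores = scores - scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax: each row sums to 1
    return weights @ V                                        # weighted sum of the value vectors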
Step-by-step with "The cat sat on the mat":
Step 1: Create Q, K, V
Each word → three vectors via learned matrices W_Q, W_K, W_V
For "cat":
Query: "What should I attend to when processing 'cat'?"
Key: "I am 'cat'"
Value: "Here's cat info"
Step 2: Calculate scores
QK^T = how much each word should attend to others
Processing "sat"? High similarity with "cat" (cats sit) and "mat" (where sitting happens).
Step 3: Scale by √d_k
Prevents dot products from getting too large, keeps softmax balanced.
Step 4: Softmax
Converts scores to probabilities:
"cat": 0.40 (subject)
"sat": 0.25 (action)
"mat": 0.20 (location)
"on": 0.10 (preposition)
"the": 0.05 (article)
Step 5: Weight values
Multiply each word's value by its attention weight, sum up. Now "sat" knows it's most related to "cat" and "mat".
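To make steps 3-5 concrete, here's a tiny sketch for the query "sat". The raw scores are invented, chosen only so the softmax lands near the weights above:

import numpy as np

words  = ["cat", "sat", "mat", "on", "the"]
scores = np.array([2.1, 1.6, 1.4, 0.7, 0.0])     # already scaled by √d_k, purely illustrative

weights = np.exp(scores) / np.exp(scores).sum()  # step 4: softmax
for word, w in zip(words, weights):
    print(f"{word}: {w:.2f}")                    # ≈ 0.40, 0.25, 0.20, 0.10, 0.05

# step 5 would multiply each word's value vector by its weight and add them up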
Multi-Head Magic:
Transformers do this multiple times in parallel:
Head 1: Subject-verb relationships
Head 2: Spatial ("on", "in", "under")
Head 3: Temporal ("before", "after")
Head 4: Semantic similarity
Each head learns different relationship types.
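In code, "multiple times in parallel" is just the same trick repeated with different learned projections, then glued back together. A rough sketch continuing from scaled_dot_product_attention above (shapes and names are assumptions, not any framework's real API):

def multi_head_attention(X, head_params, W_O):
    # X: (seq_len, d_model); head_params: one (W_Q, W_K, W_V) triple per head
    heads = []
    for W_Q, W_K, W_V in head_params:
        Q, K, V = X @ W_Q, X @ W_K, X @ W_V                 # each head gets its own view of the sentence
        heads.append(scaled_dot_product_attention(Q, K, V))
    concat = np.concatenate(heads, axis=-1)                 # stack head outputs side by side
    return concat @ W_O                                     # output projection mixes the heads together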
Why This Changed Everything:
Before: RNNs = reading with a flashlight (one word at a time, forgetting the beginning)
After: Attention = floodlights on the entire sentence, with dimmer switches
This is why ChatGPT can:
Remember 50 messages ago
Know "it" refers to something specific
Understand "bank" = money vs river based on context
The Kicker:
Models learn these patterns from data alone. Nobody programmed grammar rules. They figured out language structure just by predicting the next word.
Attention is how AI learned to read between the lines.
Just like my therapist helped me understand my focus patterns, maybe understanding transformers helps us see how we decide what matters.
Now if only I could implement multi-head attention in dating... 🤖
Still waiting for "scaled dot-product listening" to be invented.
Marucho retweeted

【Fate/Grand Order】
I was in charge of the Noble Phantasm animation (storyboards and key frames) for "U-Olga Marie".
With respect to every Master and Servant,
and to every Chaldea.
#FGO10周年 #FGO
(1/6)
[Official] Fate/Grand Order@fgoproject
[From the Chaldea PR Office] Introducing the Noble Phantasm animation of "★5 (SSR) U-Olga Marie", the new Servant currently featured in the limited-time "10th Anniversary U-Olga Marie Pickup Summon"! "★5 (SSR) U-Olga Marie" is a Servant whose features include an Extra Attack that hits all enemies, and Skills 1-3 whose effects partially change depending on her Ascension stage. This summon runs until August 17 (Sun), 12:59! Details → news.fate-go.jp/2025/10th_anni…
#FGO #FGO10周年
Marucho retweeted

@Emoji_Utopia @zeapliean @changanomaly That's literally it. I don't agree with the original premise, but if you did think Arcane had an issue, that would be it.

@Emoji_Utopia @changanomaly Almost all the people in positions of power are women, and the male characters tend to have overly tragic story arcs

@changanomaly Wait, everybody's saying Arcane, but I haven't watched it. Why is it bad?
