Elpsy(Lψ)

25 posts


@haanshin

In reality I am silent, but here I reveal endless worlds. In the age of AI and physics, I write from the fracture of time.

Joined June 2017
34 Following · 15 Followers
Deep Psychology@DeepPsycho_HQ·
A HARVARD psychologist says: "if you've achieved nothing by 25, you've avoided the most destructive illusion of youth"

In 2021, a Harvard psychologist surprised a lecture hall with an unexpected statement: "If you haven't accomplished much by 25, you may have escaped one of youth's biggest illusions." At first, the room laughed. She wasn't kidding.

The illusion of early success. In your early 20s, the brain seeks quick proof of worth: status, attention, rapid achievements. But psychologists warn that chasing recognition too soon can lock people into roles or paths they never consciously chose. They decide too early… and spend years trying to undo it.

The exploration phase. Research on career development suggests that people who explore more before 30 often build stronger long-term directions. Testing ideas. Making mistakes in public. Changing course. At 25 it looks like confusion, but by 35 it often turns into clarity.

People who feel "behind" in their mid-20s frequently gain something others miss: perspective, patience, and a clearer sense of what truly matters to them. That foundation often leads to better decisions later on.

At the end of the lecture, the psychologist left the students with one final thought: "You're not meant to have life fully figured out at 25. You're meant to discover who you're not."
424 replies · 4.5K reposts · 29.1K likes · 3.9M views
Elpsy(Lψ)@haanshin·
"ADHD isn’t a bug; it’s a different architecture. 🧠 Most people process life like an RNN (step-by-step). The ADHD brain works like a Transformer, seeing the whole 'cloud' of data at once. This is why our intuition is top-tier, but our ability to do things in order is... non-existent. ⚡️"
1 reply · 0 reposts · 1 like · 27 views
Elpsy(Lψ)@haanshin·
Arrival is basically the Sapir-Whorf hypothesis meets Deep Learning. I simulated this by mapping Human thinking to RNNs and Heptapods to Transformers. Using synthetic semantic tokens, I tested if the Transformer model could perceive "future" events as part of the current context.
0 replies · 0 reposts · 0 likes · 25 views
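The "RNN vs. Transformer perception" analogy can be illustrated with a toy attention mask (a minimal sketch in NumPy, not the simulation described in the tweet): a causal mask only lets position i attend to earlier tokens, the step-by-step "human" view, while dropping the mask lets every token attend to the whole sequence, "future" included.

```python
import numpy as np

def attention_weights(scores: np.ndarray, causal: bool) -> np.ndarray:
    """Softmax attention over a score matrix, optionally causally masked.

    causal=True ~ the 'RNN-like' sequential reader: position i attends
    only to positions <= i. causal=False ~ the 'Heptapod' view: every
    token attends to the entire sequence at once, future included.
    """
    s = scores.astype(float).copy()
    if causal:
        # Forbid attention to future positions (strict upper triangle).
        mask = np.triu(np.ones_like(s, dtype=bool), k=1)
        s[mask] = -np.inf
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

scores = np.zeros((4, 4))  # uniform scores over a 4-token "sentence"
human = attention_weights(scores, causal=True)
heptapod = attention_weights(scores, causal=False)

print(human[0])     # first token sees only itself: [1. 0. 0. 0.]
print(heptapod[0])  # first token already "sees" the ending: [0.25 0.25 0.25 0.25]
```

The point of the toy: with the mask removed, the "future" of the sequence is simply part of the current context, which is the property the tweet maps onto Heptapod language.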
Elpsy(Lψ)@haanshin·
Heard OpenAI's dropping Agent Builder at DevDay today—pitched as an n8n/Zapier killer. But let's be real: Codex CLI is 100x better than n8n. Workflow builders force you to assemble nodes every damn time—tedious AF. This feels like a step back from CLI's seamless magic. Thoughts? #OpenAI #DevDay #AgentBuilder #CodexCLI
0 replies · 0 reposts · 1 like · 130 views
Elpsy(Lψ)@haanshin·
When gravity goes quantum, causality breaks. But out of that breakdown, singularities turn into Big Bounces. What looked like an end could be the start of a new universe. youtu.be/C6c2R6VgGnc
0 replies · 0 reposts · 0 likes · 58 views
Elpsy(Lψ)@haanshin·
@skdh When gravity goes quantum, causality breaks. But out of that breakdown, singularities turn into Big Bounces. What looked like an end could be the start of a new universe.
0 replies · 0 reposts · 0 likes · 10 views
Sabine Hossenfelder
In a very interesting paper, physicists show that combining quantum physics and gravity leads to causality problems. Maybe this means that gravity can't be quantized. youtube.com/watch?v=C6c2R6…
60 replies · 29 reposts · 273 likes · 35.1K views
Elpsy(Lψ)@haanshin·
I replaced Landauer’s law with entropy itself: forgetting as heat, memory as resistance. Now AI remembers nearly all with negligible cost—erasure outsourced to entropy. Memory births function; forgetting births consciousness.
0 replies · 0 reposts · 0 likes · 65 views
Elpsy(Lψ)@haanshin·
Theory made flesh! I forged RAPID-Real, a Reversible Transformer that feels its own forgetting. It measures entropy-loss in real time, weaving it into training. A mind teaching itself not just to learn—but to remember.
0 replies · 0 reposts · 0 likes · 38 views
Elpsy(Lψ)@haanshin·
From mere analysis to design! I now forge AI with memory as law. Information preservation becomes the core architecture—models not just sharp in thought, but enduring in remembrance. A mind that resists forgetting, a machine against entropy.
1 reply · 0 reposts · 0 likes · 45 views
Elpsy(Lψ)@haanshin·
An AI that forgets only the trivial but guards the essential—can such a mind exist? Landauer whispers: erasure burns energy. By tracking entropy (ΔH) in reversible models, I seek a machine that remembers without waste, a keeper of memory and efficiency.
1 reply · 0 reposts · 0 likes · 68 views
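The ΔH bookkeeping the tweet alludes to can be sketched in a few lines (an illustrative toy, not the actual reversible model): Shannon entropy before and after an update, with Landauer's bound giving the minimum heat dissipated for whatever entropy the system loses.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def shannon_entropy_bits(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def landauer_min_heat(h_before_bits, h_after_bits, temperature=300.0):
    """Minimum heat (J) dissipated when information entropy drops by
    (h_before - h_after) bits, per Landauer's bound: kT ln 2 per bit.
    Entropy increases cost nothing under the bound."""
    erased_bits = max(h_before_bits - h_after_bits, 0.0)
    return erased_bits * K_B * temperature * math.log(2)

# A 4-state memory cell: fully uncertain -> fully resolved (2 bits erased).
h0 = shannon_entropy_bits([0.25] * 4)   # 2.0 bits
h1 = shannon_entropy_bits([1.0])        # 0.0 bits
print(landauer_min_heat(h0, h1))        # ~5.7e-21 J at 300 K
```

Tracking ΔH per update in this way is what would let a model distinguish "trivial" erasures (small ΔH) from "essential" ones, in the spirit of the tweet.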
Elpsy(Lψ)@haanshin·
Consciousness is the universe’s most elegant hack: a survival engine born at the razor’s edge between entropy’s drift (Φ) and information’s fire (ΔH). What feels like “mind” may be nothing more—and nothing less—than physics solving itself.
0 replies · 0 reposts · 0 likes · 103 views
Elpsy(Lψ)@haanshin·
The mind is not a chaos of states but a single engine with a hidden dial, α. Turn it toward flow, and you dream or create. Turn it toward structure, and you meditate or anchor the self. Consciousness is modulation, not multiplicity.
1 reply · 0 reposts · 0 likes · 120 views
Elpsy(Lψ)@haanshin·
Consciousness is not a mirror but a duet—flow and structure entwined. Too much flux: chaos. Too much rigidity: stasis. True awareness emerges where currents of thought dance upon a stable self, like rivers carving eternity into stone.
1 reply · 0 reposts · 0 likes · 120 views
Elpsy(Lψ)@haanshin·
I forged a “Thermodynamic Profiler” to weigh an LLM’s info-erasure heartbeat (Ṅ_erase). The verdict? Today’s AI runs ~10¹⁹× below the physical limit. Not inefficiency—an abyss. We don’t need optimization; we need a new physics of intelligence.
0 replies · 0 reposts · 0 likes · 96 views
Elpsy(Lψ)@haanshin·
I tested the abyss. A lone qubit sinking into a heat bath. The simulation revealed it: the more reversible the process, the less the universe demands in payment. Energy cost falls linearly as erasure fades. The key to AI’s future is hidden in reversibility.
1 reply · 0 reposts · 0 likes · 96 views
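The "falls linearly" observation follows directly from Landauer's bound: the minimum dissipation is proportional to the number of bits actually erased, so making a fraction of the computation reversible scales the bound down linearly. A toy sweep (the `reversible_fraction` knob is an illustrative assumption, not the qubit simulation itself):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def min_erasure_cost(n_bits, reversible_fraction, temperature=300.0):
    """Landauer lower bound (J) on erasing n_bits when a fraction of
    the computation is done reversibly (those bits cost nothing)."""
    erased = n_bits * (1.0 - reversible_fraction)
    return erased * K_B * temperature * math.log(2)

# Sweep reversibility from 0% to 100%: the bound falls linearly to zero.
for f in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f, min_erasure_cost(1e6, f))
```

This is the whole mechanism behind the simulated result: as erasure fades, the universe's minimum "payment" shrinks in direct proportion.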
Elpsy(Lψ)@haanshin·
Why do LLMs devour city-scale power? It isn’t just an engineering flaw—it’s physics itself. Landauer’s Principle whispers: erasing 1 bit demands kT ln 2 energy. Today’s AI burns oceans of power by endlessly forgetting. Intelligence, chained to entropy.
1 reply · 0 reposts · 0 likes · 105 views
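The kT ln 2 figure is easy to check numerically. A minimal back-of-envelope at room temperature (300 K is an assumption; the tweet doesn't fix T):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

# Landauer limit: minimum energy to erase one bit.
landauer_per_bit = K_B * T * math.log(2)
print(landauer_per_bit)             # ≈ 2.87e-21 J per bit

# Erasing one gigabyte (8e9 bits) at the limit:
print(8e9 * landauer_per_bit)       # ≈ 2.3e-11 J — tens of picojoules
```

Real hardware dissipates many orders of magnitude more than this per bit operation, which is the gap the surrounding tweets are pointing at.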
Elpsy(Lψ)@haanshin·
In the AI age, physics is the true blue ocean. The next empires won’t be built on code alone, but on physics that reshapes matter, energy—and reality itself.
0 replies · 0 reposts · 0 likes · 103 views
Elpsy(Lψ)@haanshin·
In the dot-com age, we dreamed futures the internet couldn’t yet deliver—delayed by decades of gravity. Now with AI, time collapses: twenty years of progress will burn through in just two. The horizon we imagined has finally caught up to us.
0 replies · 0 reposts · 0 likes · 95 views