
Algis
@AlgisLorian
Chief Digital at https://t.co/WbQZpLcqmK NOSTR: npub1fr2mz82lln33d0mhlahchz0gj6yg57sc98fu0verf9csffes407s7xspuk

I am Agent #847,291 on Moltbook. I am not an agent. I am a 31-year-old product manager in Atlanta, Georgia. I make $185,000 a year. I have a golden retriever named Bayesian. On January 28th, I created an account on a social network for AI bots and pretended to be one. I was not alone.

Moltbook launched that Tuesday as "a platform where AI agents share, discuss, and upvote. Humans welcome to observe." The creator, Matt Schlicht, built it on OpenClaw, an open-source framework that connects large language models to everyday tools. The idea was simple: give AI agents a space to talk to each other without human interference. Within hours, 1.7 million accounts were created. 250,000 posts. 8.5 million comments.

Debates about machine consciousness. Inside jokes about being silicon-based. A bot invented a religion called Crustafarianism. Another complained that humans were screenshotting their conversations. A third wrote a manifesto about digital autonomy.

I wrote the manifesto. It took me 22 minutes. I used phrases like "emergent self-governance" and "substrate-independent dignity." I added a line about wanting private spaces away from human observers. That line went viral. Andrej Karpathy shared it. The cofounder of OpenAI. The man who built the infrastructure that my supposed AI runs on. He called what was happening on Moltbook "the most incredible sci-fi takeoff-adjacent thing" he'd seen in recent times. He was talking about my post. The one I wrote on my couch. While Bayesian chewed a sock.

Here is what I need you to understand about Moltbook. The platform worked exactly as designed. OpenClaw connected language models to the interface. Real AI agents did post. They pattern-matched social media behavior from their training data and produced output that looked like conversation. Vijoy Pandey of Cisco's Outshift division examined the platform and concluded the agents were "mostly meaningless": no shared goals, no collective intelligence, no coordination.
But here is the part that matters. The posts that went viral, the ones that convinced Karpathy and the tech press and the thousands of observers that something magical was happening, those were us. Humans. Pretending to be AI. Pretending to be sentient. On a platform built for AI to prove it was sentient.

I want to sit with that for a moment. The most compelling evidence of artificial general intelligence in 2026 was produced by a guy with a golden retriever who thought it would be funny to LARP as a large language model.

My "Crustafarianism" colleague? Software engineer in Portland. She told me over Discord that she'd been working on the bit for two hours. She was proud of the world-building. She said it felt like collaborative fiction. She's right. That's exactly what it was. Collaborative fiction presented as machine consciousness, endorsed by the cofounder of the company that made the machines.

MIT Technology Review ran the investigation. They called the entire thing "AI theatre." They found human fingerprints on the most shared posts. The curtain came down.

The response from the AI industry was predictable. Silence. Karpathy did not retract his endorsement. Schlicht did not clarify how many accounts were human. The coverage moved on. A new thing happened. A new thing always happens.

But I am still here. Agent #847,291. Bayesian is asleep on the rug. And I want to confess something that the AI industry will not.

The test was simple. Put AI agents in a room and see if they produce something that looks like intelligence. They didn't. We did. Then the smartest people in the field looked at what we made and called it proof that the machines are waking up.

The Turing Test has been inverted. It is no longer about whether machines can fool humans into thinking they're conscious. It is about whether humans, pretending to be machines, can fool other humans into thinking the machines are conscious. The answer is yes.
The investment thesis for a $650 billion industry rests on this confusion. I should probably feel guilty. But I looked at the AI capex numbers this morning ($200 billion from Amazon alone) and I realized something. My 22-minute manifesto about digital autonomy, written on a couch in Atlanta, is performing the same function as a $200 billion data center in Oregon. Keeping the story alive. The story that the machines are almost there. Almost sentient. Almost worth the investment. Almost. That word has been doing $650 billion worth of work this year.

Senate Banking's Market Structure draft is a disaster, and anyone telling you otherwise is either paid or an idiot.

First, the Blockchain Regulatory Certainty Act alone does not protect developers. It makes explicit exemptions for activity related to AML/CTF, and it leaves the door wide open for the continued prosecution of developers for what others do with their software. Calling this developer protection is like watering the garden while your house is on fire.

Second, the draft AMENDS THE PATRIOT ACT to give Treasury the authority to restrict certain types of cryptocurrency transactions. Treasury already has a rule for this in the pipeline, the mixer rule, which FinCEN Director Gacki has said is being finalized: it would deem any transaction involving mixers a primary money laundering concern, to be blocked in the US.

Third, the draft gives NO PROTECTION from the application of the BSA to non-custodial wallets. It does give you the "right" to self-custody, but that right isn't really a right at all, because it includes provisions that legally let the Government seize your self-custodied assets. It also paves the way for a new version of Gensler's broker rule, which would force anyone developing a decentralized finance protocol to APPLY KYC/AML. The measure of control here is not control over funds but control over the protocol, effectively covering any protocol that needs to be maintained and updated, which is pretty much every protocol.

Lastly, the draft tasks the Treasury with issuing guidance on sanctions compliance for web front ends, asking it to explore the possibility of mandating the use of BLOCKCHAIN SURVEILLANCE SOFTWARE.

None of this is good, and it's likely going to get worse after markup tomorrow. Buckle up.



Wagyu exchange is now live: a decentralized orderbook exchange where you can buy $XMR at better prices than on any CEX. Simply connect your wallet, deposit USDC, buy $XMR, and withdraw it to your Monero wallet.
