teaconf

233 posts


@teaconf

Triple Entry Accounting Conference 2026 · Malta “A New Accounting Paradigm for the Information Age” A https://t.co/X4jfAg1xSd event

Malta · Joined February 2023
170 Following · 113 Followers
teaconf @teaconf
wishing @ProfSteveKeen a great 73rd birthday celebration & many more years of voluminous insightful productivity in Real world Economics! 🍾🎉🖖👏🙏
0 replies · 0 reposts · 0 likes · 12 views
teaconf retweeted
Dr. Steve Keen @ProfSteveKeen
9/ P.S. If you want to build your own economic radar instead of trusting people who never saw 2008 coming, this is the workshop for that. Closes tonight: learn.stevekeen.com
0 replies · 3 reposts · 12 likes · 1.6K views
teaconf @teaconf
Congratulations to @CostaSga and Massimiliano Ferrara on their latest publication. A timely contribution on dataset integrity, AI deterioration, and the role of blockchain and #tripleentryaccounting in building more trustworthy AI systems.
Konstantinos Sgantzos@CostaSga

We are pleased to announce the publication of our new article with Prof. Massimiliano Ferrara, entitled: “Mitigating Big Data Pollution and AI Model Deterioration: A Dataset Core Approach with Blockchain-Based Verification” Article is freely available at: lnkd.in/gQCt7Xc4

0 replies · 0 reposts · 2 likes · 13 views
teaconf retweeted
Exponential Science @Exponential_Sci
Exponential Science's Juan Ignacio Ibañez will speak at the @xrpl_commons Meetup London on institutional DeFi and real-world use cases on the XRP Ledger. Exponential Science and @MiCA_Alliance will share a regulatory perspective alongside Ripple and VS1. 📍18 Feb | London Register: luma.com/xshnm19t
Exponential Science tweet media
1 reply · 5 reposts · 13 likes · 311 views
teaconf retweeted
iang @iang_fc
The Message

In a recent reprise of an old debate, @VitalikButerin said [1] in x.com/vitalikbuterin… :

> I no longer agree with this previous tweet of mine - since 2017, I have become a much more willing connoisseur of mountains. It's worth explaining why.
>
> The idea of average users personally validating the entire history of the system is a weird mountain man fantasy. There, I said it.
> First, the original context. That tweet was in a debate with Ian Grigg, who argued that blockchains should track the order of transactions, but not the state (eg. user balances, smart contract code and storage):
>
> [Iang] The messages are logged, but the state (e.g., UTXO) is implied, which means it is constructed by the computer internally, and then (can be) thrown away.

This is a difficult topic, it seems, and I’ve been all over the world on it. I’m no prof of CS, so can only explain anecdotally. For the more serious, this is the work of Lamport, and the topic of state machines, databases, protocols and The Two Generals Problem. It’s also the foundation of blockchain, if you need a hint.

My story starts in Australia during the 1980s rise of Unix. I was a systems programmer, which meant C, shell, and the like. At the time, riding the wave of Unix becoming the operating system for all new bigger computers and many old and/or small ones, porting was the name of the game. I ported Informix and Uniplex to many new Unix machines. Informix can be seen as an early competitor to Oracle, better for a while until it faltered. No matter, I got good at it, I got known for it.

And then the calls started. Several times I had to drop everything and go driving or flying to some customer whose database had squawked bad. And fix it. In each case, the DB had given up on the situation. The backups, if they existed, hadn’t worked. Backups weren’t a big thing in the database world, it seemed.
I thought a lot about this, and came to the conclusion that the solution was to get rid of the problem - make the backups an integral part of the system (which they weren’t in any known software; backups were an afterthought at best in those days), and prove the backups on a regular basis. That is, the backups had to be the data, and the data had to be the backups.

I got my chance to prove it in 1995 in Amsterdam when our fledgling digital cash system started dropping payments down the cracks. On examination, it turned out that Gary Howland [2] had implemented single entry accounting, when he needed double entry. Not using double entry bookkeeping was a common failure mode - nobody much in the emerging financial cryptography scene knew what accounting was about, and I only knew it by an accident of fate: computers couldn’t work with single entry, they needed full double entry accounting to handle the errors and crashes that occurred at unfortunate moments. Which later led on to triple entry accounting, but I digress; the number of entries might be an entirely other story. Or might not be - come to Malta to find out [3].

Back to 1995. I volunteered to write the digital cash backend that we needed - a proper double entry accounting database. And at the same time [4], put my theories into practice - no backups! Or, maybe better said, all backups. It took a month of perl hacking, half time, the other half on b-school in London.

The system faithfully recorded every PGP-signed message coming in onto an append-only file. Only once safely recorded, the machine acted to perform the message, which in the case of digital cash was a double entry transfer of value from one account to another, where accounts were nyms identified by PGP keys. Anytime there was any trouble, you just killed it hard, which is a “kill -9” in Unix lingo. And then restarted it. It would read the entire log from genesis, as it is known now, to the current time, and re-enact all message performances.
Then open the doors for new messages.

What’s going on here? Well, this pattern is actually emulating the notion of state machines. The messages are inputs into the machine, and the machine operates, and perhaps sends some messages out (TEA, t’other story!). But this state machine always records its messages in sequence onto an append-only log, and happily starts again from the beginning, any time the operator wants, and cheerily recovers its current state. It’s a persistent and restartable state machine!

Back to Vitalik’s message at top. In his terms, I argue that “the blockchain should track the order of the transactions!” Yes, it should, and indeed store them, in that order, in order to recover state. “But not the state!” Exactly. The state is calculated on demand, as is everything else in the quote above. We’re on the same page, at least.

Back to my experience, or my anecdote. The design worked. It worked so well it only ever barfed once in 20 years, a Vienna experience I wrote up in The Twilight Zone financialcryptography.com/mt/archives/00… No matter the scary title and experience, such disaster is very rare in transactional software. And so it should be - the world runs on accounting, and accounting is worthless if it ever loses a penny. And if accounting is worthless we’re in a very dark place; if you’re not sure about that, read up on the history of reading, writing, arithmetic, trade, city-states, war, tax and anything in between - they all started with humble accounting. You are reading this because first there was accounting, and only then there was writing and reading!

And then came Bitcoin. It used the same design, but in a different place. Through the groundbreaking mechanism of Proof of Work (PoW), paid for by the coin of the new realm, Bitcoin solved an open entry, decentralised version of the Byzantine Generals' Problem, creating a virtual append-only log of transactions shared across many computers that would ordinarily be adversarial.
This wasn't just a neat trick; it was a paradigm shift, but for space, we'll gloss over the full details. The point was that again, Bitcoin stored the transaction (message) and not the state. This was not a classical database! It was up to each client to scan back through the history and (a) calculate its own useful state, (b) prove the intermediary transactions that supported that state, and (c) act on that state. Just like my design, you could kill your software any time, then start it up again, and it would be correct. Eventually.

And here is where we reach the point of disagreement. Ethereum came along and didn’t, in this respect, follow the strong design limitation of Bitcoin and my software by praying at the church of the message. Instead, Ethereum in some sense preferred to pray at the church of the state.

> I was heavily against this philosophy, because it would imply that users have no way to get the state other than either (i) running a node that processed every transaction in all of history, or (ii) trusting someone else.

Quite reasonable thinking, indeed, as anyone with any knowledge about software will be scratching their heads about the above and asking, “but how does it scale? how long to count up all those messages?” This is the moment where theory collides with practicality.

The theory of this is quite simple: we can agree on what a message is, because it’s right there in front of us, in bytes. Either we have the same bytes or we don’t, and we have new-fangled cryptography tools like hashes and digsigs to make it easy to agree. But the users want state. They want to know how much they have! Don’t tell me about messages, just tell me how to get the dopamine hit of buying that hot new thing on Temu, Now, This Minute!

> The idea of average users personally validating the entire history of the system is a weird mountain man fantasy. There, I said it.

Not to mention the notion that no user would ever have the patience to do this… Anyways.
How do we agree on state? It’s a lot harder. Which is why I left it out. As did Bitcoin.

> In blockchains that commit to the state in the block header (like Ethereum), you can simply prove any value in the state with a Merkle branch.

Err, really? Let’s take the state of, say, (2+2). Most software would say that’s 4. But now switch to floating point [5], choose some more interesting numbers like e & pi, 2 different lengths of floating point, 16 different language implementations, small and large clients, add in Intel’s FP bug, replace + with / and add an occasional blooper in the standard(s). Inevitably, we’re going to end up with a situation where the majority are saying (2+2) equals 5, and a vocal minority are tearing their hair out. In blockchain, this is called a fork.

On the other hand, let’s go back to the message. We cannot have this argument, because the message is the message. It’s binary data, bits and bytes in a sequence, and it can be hashed & digsig’d to our heart's content. There’s one thing we can all agree on - the message.

Walking back from a failure of state is very hard. Walking back from a failure of message is very easy. The bytes do not lie. When the messages get in a mess, this is something the software can sort out - we do not need to “CALL THE DEVS”, we just ignore the bad messages. When state breaks, the only thing that can be done is “CALL THE DEVS!” And send them into the twilight zone.

This is why The Two Generals Problem of Lamportian fame is about the message that was sent, not about what the Generals did with that message. This is a fundamental problem of computer science, sometimes called The Coordination Problem, and it’s a bit tricky [6]. An assumption with Lamport’s problem was that the message either got through or it did not - there was no in between. Luckily, protocols and, later on, hashes and digsigs showed us how to do exactly that - make the message perfectly delivered or perfectly disappeared.
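The "bytes do not lie" point can be made concrete with a small sketch (illustrative Python, not from the thread): two parties agree on a message by hashing identical bytes, while derived state computed in floating point can quietly diverge.

```python
import hashlib

def message_id(raw: bytes) -> str:
    """Content-address a message: same bytes, same id, on every machine."""
    return hashlib.sha256(raw).hexdigest()

# Two parties holding identical bytes always agree on the message id.
msg = b'PAY alice -> bob 100'
assert message_id(msg) == message_id(b'PAY alice -> bob 100')

# A single changed byte is always detected -- nothing to argue over.
assert message_id(msg) != message_id(b'PAY alice -> bob 101')

# Derived state, by contrast, can disagree with naive expectations:
# accumulating 0.1 ten times in IEEE-754 binary floating point is not 1.0.
assert sum([0.1] * 10) != 1.0
```

The hypothetical `PAY` payload is only a stand-in for whatever signed message format a real system would use; the point is that agreement on bytes is mechanical, while agreement on computed state is not.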
The blockchain that is solid is the message-oriented blockchain. Praying also at the church of state is a step too far, one that software isn’t ready to support. For that you need to call the devs. And that’s not a good thing. The software is supposed to work without the devs.

To achieve this, Bitcoin does: 1. receive a message, 2. check if the message is good, and 3. add it to the block. That’s it. Notice how there isn’t any value in the above, as a message only gets added if it passes the check in 2. This also shows a bit more of the genius of Bitcoin, in that the verification step in 2. was expanded from classical format thinking to say that the message was a small program that, when run, returned TRUE or FALSE. If TRUE, it’s good. If not, ignore.

In essence, Bitcoin abstracted the quality check in 2. up to a higher layer. Bitcoin doesn’t even do money, at the base layer at least! Instead it abstracts all that away to a higher layer of small programs called (oddly) smart contracts, one of which can also act as a money. And while we’re thinking of this second layer of programmed truth, note also that each smart contract maintains its own state. Bitcoin didn’t need to commit to the state because that was the job of the smart contract.

Back to Vitalik. Obviously, the scaling issue is unanswered. Right. But before any thinking on the scalability issue is possible, we first have to set what the foundation is. You can build a house of cards, you can build a sand castle, but neither of these can you live in. I built something to live in, and I built it on the rock of messages. Not the sand of state. Then, later on, we scaled it [7].

And this might be the reason for Vitalik’s digression into ZK-SNARKs. As soon as you obsess about the scaling, that verification step looms large. How then to do those verification steps each time you start up fresh? Fast ZK-SNARKs might be that answer. I don’t know if they are; they’re above my paygrade.
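The log-then-perform, kill-and-replay design the thread describes can be sketched in a few lines (a toy in Python; the account names and in-memory log are illustrative stand-ins for PGP-signed messages on an append-only file):

```python
from collections import defaultdict

class ReplayableLedger:
    """Toy double-entry ledger where the append-only log IS the data (and the backup)."""

    def __init__(self):
        self.log = []                      # append-only message log
        self.balances = defaultdict(int)   # derived state, disposable

    def transfer(self, src, dst, amount):
        # Record the message first; only once recorded, perform it.
        self.log.append((src, dst, amount))
        self._apply((src, dst, amount))

    def _apply(self, msg):
        src, dst, amount = msg
        self.balances[src] -= amount       # debit one account...
        self.balances[dst] += amount       # ...credit the other: entries always balance

    def crash_and_restart(self):
        # "kill -9": throw the state away, then re-enact the whole log from genesis.
        self.balances = defaultdict(int)
        for msg in self.log:
            self._apply(msg)

ledger = ReplayableLedger()
ledger.transfer('alice', 'bob', 100)
ledger.transfer('bob', 'carol', 40)
before = dict(ledger.balances)
ledger.crash_and_restart()
assert dict(ledger.balances) == before      # state fully recovered from the log
assert sum(ledger.balances.values()) == 0   # double entry: never loses a penny
```

The design choice is exactly the one argued above: the messages are authoritative and durable, the balances are a cache that can be discarded and rebuilt at will.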
iang tweet media
vitalik.eth@VitalikButerin

I no longer agree with this previous tweet of mine - since 2017, I have become a much more willing connoisseur of mountains. It's worth explaining why. x.com/VitalikButerin…

First, the original context. That tweet was in a debate with Ian Grigg, who argued that blockchains should track the order of transactions, but not the state (eg. user balances, smart contract code and storage):

> The messages are logged, but the state (e.g., UTXO) is implied, which means it is constructed by the computer internally, and then (can be) thrown away.

I was heavily against this philosophy, because it would imply that users have no way to get the state other than either (i) running a node that processed every transaction in all of history, or (ii) trusting someone else. In blockchains that commit to the state in the block header (like Ethereum), you can simply prove any value in the state with a Merkle branch. This is conditional on the honest majority assumption: if >= 50% of the consensus participants are honest, then the chain with the most PoW (or PoS) support will be valid, and so the state root will be correct.

Trusting an honest majority is far better than trusting a single RPC provider. Not trusting at all (by personally verifying every transaction in the chain) is theoretically ideal, but it's a computation load infeasible for regular users, unless we take the (even worse) tradeoff of keeping blockchain capacity so low that most people cannot even use the chain.

Now, what has changed since then? The biggest thing is of course ZK-SNARKs. We now have a technology that lets you verify the correctness of the chain, without literally re-executing every transaction. WE INVENTED THE THING THAT GETS YOU THE BENEFITS WITHOUT THE COSTS! This is like if someone from the future teleported back into US healthcare debates in 2008, and demonstrated a clearly working pill that anyone could make for $15 that cured all diseases.
Like, yes, if we have that pill, we should get the government fully out of healthcare, let people make the pill and sell it at Walgreens, and healthcare becomes super affordable so everyone is happy. ZK-SNARKs are literally like that but for the block size war. (With two asterisks for block building centralization and data bandwidth, but that's a separate topic)

With better technology, we should raise our expectations, and revisit tradeoffs that we made grudgingly in a previous era. But also, I have actually changed my mind on some of the underlying issues. In 2017, I was thinking about blockchains in terms of academic assumptions - what is okay to rely on honest majority for, when we are ok with 1-of-N trust assumption, etc. If a construction gave better properties under known-acceptable assumptions, I would eagerly embrace it. On a raw subconscious level, I don't think I was sufficiently appreciative of the fact that _in the real world, lots of things break_.

Sometimes the p2p network goes down. Sometimes the p2p network has 20x the latency you expected - anyone who has played WoW can attest to long spans of time when the latency spiked up from its usual ~200ms to 1000-5000ms. Sometimes a third party service you've been relying on for years shuts down, and there isn't a good alternative. If the alternative is that you personally go through a github repo and figure out how to PERSONALLY RUN A SERVER, lots of people will give up and never figure it out and end up permanently losing access to their money. Sometimes mining or staking gets concentrated to the point where 51% attacks are very easy to imagine, and you almost have to game-theoretically analyze consensus security as though 75% of miners or stakers are controlled by one single agent. Sometimes, as we saw with tornado cash, intermediaries all start censoring some application, and your *only* option becomes to directly use the chain.
If we are making a self-sovereign blockchain to last through the ages, THE ANSWER TO THE ABOVE CONUNDRUMS CANNOT ALWAYS BE "CALL THE DEVS". If it is, the devs themselves become the point of centralization - they become DEVS in the ancient Roman sense, where the letter V was used to represent the U sound.

The Mountain Man's cabin is not meant as the replacement lifestyle for everyone. It is meant as the safe place to retreat to when things go wrong. It is also meant as the universal BATNA ("Best Alternative to a Negotiated Agreement") - the alternative option that improves your well-being not just in the case when you end up needing it, but also because knowledge of it existing motivates third parties to give you better terms. This is like how Bittorrent existing is an important check on the power of music and video streaming platforms, driving them to offer customers better terms.

We do not need to start living every day in the Mountain Man's cabin. But part of maintaining the infinite garden of Ethereum is certainly keeping the cabin well-maintained.
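Vitalik's "prove any value in the state with a Merkle branch" can be illustrated with a minimal sketch (toy Python over SHA-256; real chains commit to a state root inside a block header, which this stand-alone root merely imitates):

```python
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    """Fold a level of hashes pairwise until one root remains (duplicating an odd tail)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_branch(leaves, index):
    """Collect the sibling hash at each level for the leaf at `index`."""
    level = [h(leaf) for leaf in leaves]
    branch = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index + 1 if index % 2 == 0 else index - 1
        branch.append((level[sibling], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return branch

def verify_branch(leaf, branch, root):
    """A light client re-hashes leaf-to-root; no other leaf need be seen."""
    node = h(leaf)
    for sibling, leaf_was_left in branch:
        node = h(node + sibling) if leaf_was_left else h(sibling + node)
    return node == root

# Hypothetical "state" leaves: account balances committed into one root.
leaves = [b'alice:60', b'bob:40', b'carol:0', b'dave:0']
root = merkle_root(leaves)
proof = merkle_branch(leaves, 1)
assert verify_branch(b'bob:40', proof, root)       # committed value proven
assert not verify_branch(b'bob:41', proof, root)   # tampered value rejected
```

The proof is logarithmic in the number of leaves, which is why a state commitment lets a user check one balance without replaying history, at the cost of trusting whoever produced the root.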

3 replies · 4 reposts · 7 likes · 633 views
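The thread's three-step base loop (receive a message, run its check, append only on TRUE) can be sketched as follows (illustrative Python; the lambda "scripts" are toy stand-ins for Bitcoin's Script programs):

```python
def make_block(candidate_messages):
    """Bitcoin-style base loop (toy): a message is (payload, predicate);
    the payload enters the block only if its predicate returns True."""
    block = []
    for payload, predicate in candidate_messages:
        if predicate(payload):     # step 2: the check is itself a small program
            block.append(payload)  # step 3: append; bad messages are simply ignored
    return block

# Hypothetical messages carrying their own validation programs.
msgs = [
    (b'spend:10', lambda p: p.startswith(b'spend:')),    # well-formed: passes
    (b'garbage',  lambda p: p.startswith(b'spend:')),    # fails its own check: ignored
    (b'spend:25', lambda p: int(p.split(b':')[1]) > 0),  # positive amount: passes
]
block = make_block(msgs)
assert block == [b'spend:10', b'spend:25']
```

Note there is no notion of value or balance in the loop itself; as the thread argues, that semantics lives entirely in the predicate layer.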
teaconf retweeted
M1K4 @mksala
Stumbled upon this great article from Ian about triple-entry bookkeeping - a famous trait of Bitcoin and blockchains as public ledgers - in which a cryptographically sealed record leads to “the receipt is the transaction” and the phenomenon: “I know that what you see is what I see.”
iang@iang_fc

Thoughts on momentum accounting financialcryptography.com/mt/archives/00… I see a connection between Ijiri’s momentum accounting and cryptographic receipts, both called triple entry, so I'll try and draw it out. 1/10

0 replies · 3 reposts · 3 likes · 205 views
teaconf retweeted
Matthew Prince 🌥 @eastdakota
Yesterday a quasi-judicial body in Italy fined @Cloudflare $17 million for failing to go along with their scheme to censor the Internet. The scheme, which even the EU has called concerning, required us within a mere 30 minutes of notification to fully censor from the Internet any sites a shadowy cabal of European media elites deemed against their interests. No judicial oversight. No due process. No appeal. No transparency.

It required us to not just remove customers, but also censor our 1.1.1.1 DNS resolver, meaning it risked blacking out any site on the Internet. And it required us not just to censor the content in Italy but globally. In other words, Italy insists a shadowy, European media cabal should be able to dictate what is and is not allowed online. That, of course, is DISGUSTING, and even before yesterday’s fine we had multiple legal challenges pending against the underlying scheme. We, of course, will now fight the unjust fine. Not just because it’s wrong for us but because it is wrong for democratic values.

In addition, we are considering the following actions:
1) discontinuing the millions of dollars in pro bono cyber security services we are providing the upcoming Milano-Cortina Olympics;
2) discontinuing Cloudflare’s Free cyber security services for any Italy-based users;
3) removing all servers from Italian cities; and
4) terminating all plans to build an Italian Cloudflare office or make any investments in the country.

Play stupid games, win stupid prizes.

While there are things I would handle differently than the current U.S. administration, I appreciate @JDVance taking a leadership role in recognizing this type of regulation is a fundamentally unfair trade issue that also threatens democratic values. And in this case @ElonMusk is right: #FreeSpeech is critical and under attack from an out-of-touch cabal of very disturbed European policy makers. I will be in DC first thing next week to discuss this with U.S. administration officials, and I’ll be meeting with the IOC in Lausanne shortly after to outline the risk to the Olympic Games if @Cloudflare withdraws our cyber security protection.

In the meantime, we remain happy to discuss this with Italian government officials who, so far, have been unwilling to engage beyond issuing fines. We believe Italy, like all countries, has a right to regulate the content on networks inside its borders. But they must do so following the Rule of Law and principles of Due Process. And Italy certainly has no right to regulate what is and is not allowed on the Internet in the United States, the United Kingdom, Canada, China, Brazil, India or anywhere outside its borders. THIS IS AN IMPORTANT FIGHT AND WE WILL WIN!!!
Matthew Prince 🌥 tweet media
2.4K replies · 7.3K reposts · 34.5K likes · 9.5M views
teaconf retweeted
Simon Taylor @sytaylor
WOW: Rain just raised $250M at a $1.95B valuation 🤯 17x increase in 10 months. 30x card growth. 38x payment volume growth. Stablecoin linked cards are on fire

---

What is Rain actually building?

- Rain is a Visa Principal Member.
- That's the same direct Visa relationship as JPMorgan Chase — not a BIN sponsor, not a program manager. Direct settlement. Direct product development. Direct control.

For a company founded in 2021, that's almost unheard of.

---

Why does that matter?

- Because Rain doesn't compete with stablecoin issuers.
- It makes stablecoins spendable.
- Anywhere Visa is accepted. 150 million merchants. 150+ countries.

The user sees a dollar balance. The merchant sees a normal Visa transaction. The stablecoin never surfaces.

---

Visa can authorize 65,000 transactions per second, 24/7. But most bank settlement still follows bank hours. Authorize on Friday, holiday on Monday, settle on Tuesday. Credit risk builds up in those gaps.

Rain settles in USDC (which Visa has enabled). That means seven-day-a-week settlement. No bank holiday delays. Reduced collateral requirements. Working capital that actually works.

As @raincards CEO put it: "It's difficult to imagine a world in 2030 where robots and spaceships coexist with bank holidays and wire cutoff times."

---

The Western Union partnership tells you where this is headed. A 174-year-old remittance company just launched their own stablecoin (USDPT on Solana) and picked Rain as the first infrastructure partner for their Digital Asset Network. That's 400,000+ retail locations. Stablecoin to cash conversion at the corner store.

---

The founders' history matters. Farooq Malik was CIO at the North American Development Bank. He saw cross-border payments break constantly and looked for a better rail. Charles Naut is Stanford CS, founded a company acquired by Intuit, then was principal engineer at QuickBooks. He's built infrastructure at scale before.

---

Rain is processing $3B+ in annualized volume across 200+ partners:
- Western Union,
- Nuvei,
- KAST.

And something I keep hearing? "This is one of the fastest growing companies in our portfolio" From multiple investors.

Watch out for Rain making it Rain.
Simon Taylor tweet media
36 replies · 54 reposts · 674 likes · 91.3K views
teaconf retweeted
CoinGeek @RealCoinGeek
January 9th is the day Satoshi Nakamoto released Bitcoin v0.1 to the world, marking a major turning point in electronic cash. It opened the door to real utility and global use. Happy Bitcoin Birthday! 🥳
CoinGeek tweet media
3 replies · 16 reposts · 41 likes · 1.3K views
teaconf retweeted
Konstantinos Sgantzos @CostaSga
Very few people understand the importance of Triple-Entry Accounting (TEA) in AI and other fields of technology. Highlights from the 2nd international TEA conference that took place this year by @RebeccaLiggero Many, many thanks 🙏 youtube.com/watch?v=VbaH6M…
YouTube video
3 replies · 4 reposts · 15 likes · 621 views
teaconf retweeted
Dr Clare Craig @ClareCraigPath
@alex_prompter This misses three key points. Feeding AI high quality material (like a judgement) will produce a good edit - not the same as it producing high quality material. Judges are meant to bring humanity and morality to work. It is demeaning to put your case to an algorithm.
22 replies · 8 reposts · 188 likes · 14.5K views
teaconf @teaconf
@lucasgonzalez @mbauwens 3/3 I am sad to be a pessimist, but fear now rules greed and neither is rational ... I see no safe transition from 8 billion wannabe Californians to 3 billion Costa Ricans, so I expect Armageddon. May Fortune favour you. 🖖
0 replies · 0 reposts · 1 like · 27 views
teaconf @teaconf
@lucasgonzalez @mbauwens We had homeostasis and beauty and complexity .. but pls recall that Nature was still "red in tooth and claw" ... every single day. Further, DNA has many of the characteristics of distributed ledgers & blockchains ... The third chimpanzee is a flawed evolute headed for .. 1/n
2 replies · 0 reposts · 1 like · 24 views
FluSCIM Lucas Gonzalez @lucasgonzalez
"The question now is whether we can evolve it from spreading inflammation and misinformation to enabling respectful coordination." Thread!
Michel Bauwens@mbauwens

Important: * The Technological Synapses of Planetary Intelligence

Richard David Hames: "I’m often asked what specific technologies offer the greatest potential for planetary self-regulation. The question reveals a fundamental misunderstanding of how complex systems achieve homeostasis. We’re not looking for silver bullets but for interconnected capabilities that, when woven together, create emergent regulatory properties. Think less about individual technologies and more about technological ecosystems that mirror and enhance Earth’s existing feedback mechanisms. This clarification also gives me a chance to be optimistic for a change, rather than continue in the pessimistic mode most readers have come to know me for. But optimism with caveats, as you will see...

The most transformative potential lies in “sensing-response architectures”—integrated systems that monitor planetary vital signs and trigger adaptive responses across multiple scales simultaneously. Earth observation satellites coupled with artificial intelligence don’t just collect data; they’re evolving into a planetary nervous system capable of detecting perturbations in real-time. When deforestation accelerates in the Amazon, when methane plumes erupt from Siberian permafrost, when ocean currents shift their ancient patterns, these systems increasingly enable a rapid response rather than delayed recognition.

For example, distributed sensor networks are fundamentally altering our relationship with atmospheric chemistry. We’re moving from sparse, delayed measurements to dense, continuous monitoring that can track carbon flows at the resolution of individual facilities, forests, and even fields. This granularity transforms accountability from abstraction to precision. When every emission source becomes visible and every sink is quantified, the atmosphere shifts from commons to managed system. The technology exists; what’s emerging is the institutional capacity to act on what we see.
Critics will say that sensing without response is just sophisticated observation of catastrophe, and they are correct. The genuinely revolutionary technologies are those that close feedback loops at planetary scale. Direct air capture and carbon mineralisation don’t just remove CO₂; they create the possibility of actively managing atmospheric composition. We’re developing the capability to dial greenhouse gas concentrations up or down, and to engineer the climate with a precision we once reserved for indoor environments. This is profound—we’re evolving from climate victims to climate operators.

The energy transition technologies—solar, wind, batteries—matter less for their specific capabilities than for what they represent: humanity’s first attempt to align its metabolic processes with planetary flows. When civilisation runs on current solar income rather than fossil geological deposits, we synchronise with Earth’s natural rhythms rather than disrupting them. Advanced geothermal and fusion represent the next phase—tapping effectively infinite energy sources that decouple human flourishing from ecological destruction.

I happen to believe synthetic biology offers the most profound intervention potential in this regard. We’re not just engineering organisms; we’re designing new biogeochemical cycles. Bacteria that eat plastic and excrete useful chemicals. Algae that capture carbon while producing proteins. Corals engineered for heat resistance. Forests optimised for carbon sequestration. We’re acquiring the ability to reprogram the biosphere’s fundamental operating system, and to enhance Earth’s natural regulatory mechanisms rather than simply disrupting them.

The convergence of AI with Earth system science is also creating unprecedented anticipatory abilities. Machine learning models trained on decades of satellite data can now forecast deforestation, predict crop failures, anticipate extreme weather events with increasing precision.
But prediction alone doesn’t constitute regulation. The breakthrough comes when these predictive models are incorporated into response systems—when the forecast of drought automatically triggers water conservation protocols, when predicted deforestation alerts generate immediate economic sanctions, when anticipated crop failures initiate food system adaptations.

Blockchain and distributed ledger technologies, despite their hype-clouded reputation, offer crucial infrastructure for planetary coordination. They enable transparent, verifiable tracking of carbon credits, biodiversity offsets, and resource flows without centralised authority. This matters because planetary regulation cannot depend on any single government or institution. It requires coordination mechanisms that function across borders, ideologies, and timescales—exactly what distributed consensus systems provide.

The materials revolution—graphene, metamaterials, programmable matter—seems distant from planetary regulation until you recognise that civilisation’s physical substrate determines its ecological footprint. Buildings that capture more energy than they consume, materials that self-repair rather than requiring replacement, infrastructure that enhances rather than disrupts ecosystem services—these technologies transform human presence from extractive to regenerative."

1 reply · 1 repost · 2 likes · 258 views
teaconf @teaconf
@lucasgonzalez @mbauwens 2/2 for self-extermination & possible near-total ecocide. Goodwill & clever design has never overcome the fear/greed dynamic in the mammal hindbrain .. AI, satellites & BTC won't change that... People learn faster than they evolved & I sometimes wish it was the other way round ..
0 replies · 0 reposts · 1 like · 23 views
teaconf retweeted
sysxplore @sysxplore
No disrespect to Linus Torvalds, but this guy is the greatest geek alive 🫡

Co-created UNIX at Bell Labs starting in 1969 (first released in 1971), when he was in his late twenties. Co-created Go in 2009, at 66 years old. He also developed the B programming language (which led to C), co-designed UTF-8 encoding (making international text possible online), and wrote essential tools like grep that developers still rely on daily. He also helped with the development of Multics (which led to UNIX), Plan 9 from Bell Labs and the Inferno operating system. That's 4 operating systems in total... Most people haven't even used that many operating systems.

Pretty impressive resume, right? And it's a shame that many people, even ones in the IT and tech industry, don't know him.

Ken Thompson... Remember the name 🙏
sysxplore tweet media
258 replies · 1.6K reposts · 13.1K likes · 603.8K views