Miralib Balamar

1.6K posts


@KaspaCrypto

This is a private account re the amazing Kaspa cryptocurrency and related stuff. Visit @kaspaunchained for what's as close to the "official" one as possible.

Joined December 2021
53 Following · 3.6K Followers
Miralib Balamar@KaspaCrypto·
An analysis definitely worth reading, $KAS folks.
BaN𐤊ℚuOτE@BankQuote

Toccata is not Kaspa “adding smart contracts” so it can cosplay as Ethereum. That is the wrong frame entirely. What Toccata is doing is far more disciplined: extending Kaspa’s execution surface without corrupting the base layer into a bloated global computer. The architecture is converging around two distinct paths. One is native L1 covenant programmability, where richer script logic and covenant IDs let UTXOs carry stricter conditions, lineage, and composable constraints directly inside Kaspa’s settlement model. The other is the based zk path, where computation happens offchain but inherits Kaspa’s ordering and settlement guarantees instead of depending on an external sequencer cartel. That distinction is everything. This is not VM maximalism. It is constrained programmability with architectural hygiene.

The real technical hinge is not the opcodes by themselves. It is sequencing. KIP 16 gives Kaspa a proof verification surface. KIP 17 expands covenant expressiveness. But KIP 21 is the deeper move, because it appears to restructure sequencing commitments into a partitioned, activity aware model. That changes the proving economics. Instead of forcing applications to inherit the full historical mass of the DAG, the system can begin pushing proof cost closer to local activity. That is the difference between real based zk infrastructure and empty cryptographic theater.

That is also why the timing matters less than the ordering of the work. Kaspa seems to be freezing the sequencing architecture first, before larger zk systems and compilers harden around the wrong assumptions. That is the correct move. Once developers begin building proof systems on top, changing the commitment grammar becomes exponentially more painful.

So Toccata is not just a hard fork. It is Kaspa teaching its settlement layer how to host programmable constraints and proof aware execution without surrendering the qualities that made the base layer worth building in the first place.
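The covenant idea the analysis describes (a UTXO carrying stricter conditions and lineage that propagate to its descendants) can be illustrated with a conceptual sketch. Everything here, the `CovenantUtxo` type, the `same_script` rule, the field names, is a hypothetical illustration, not the actual KIP-16/17 opcode design.

```python
from dataclasses import dataclass
from typing import Callable

# Conceptual sketch only: a UTXO whose spend must satisfy a covenant
# predicate over the transaction that consumes it. Names and fields are
# hypothetical illustrations, not Kaspa's actual KIP-16/17 semantics.

@dataclass(frozen=True)
class Tx:
    outputs: tuple      # (amount, script_id) pairs
    total_out: int

@dataclass(frozen=True)
class CovenantUtxo:
    amount: int
    covenant: Callable[[Tx], bool]  # constraint on the spending tx

def can_spend(utxo: CovenantUtxo, tx: Tx) -> bool:
    # Standard value check plus the covenant's extra constraint.
    return tx.total_out <= utxo.amount and utxo.covenant(tx)

# Example: a lineage-preserving covenant that forces every output to keep
# the same script id, so the constraint propagates to descendant UTXOs.
def same_script(script_id: str) -> Callable[[Tx], bool]:
    return lambda tx: all(sid == script_id for _, sid in tx.outputs)

u = CovenantUtxo(amount=100, covenant=same_script("vault"))
ok  = can_spend(u, Tx(outputs=((60, "vault"), (40, "vault")), total_out=100))
bad = can_spend(u, Tx(outputs=((100, "hot"),), total_out=100))
assert ok and not bad
```

The point of the sketch is only the shape of the mechanism: the constraint lives with the coin, not in a global contract state, which is what keeps the base layer from becoming the "bloated global computer" the post warns about.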

English
0
8
42
1.4K
Miralib Balamar reposted
Gonzo.𐤊as@Dr_Gonzo_K·
The time has come for me to share what I’ve been working on ... Introducing #HASH. A grassroots, guerrilla marketing engine for #Kaspa, built with one goal: break out of the echo chamber and push Kaspa into the real world where it can’t be ignored. I've spent a lot of time the past couple years doing my part to spread the message, build awareness, and be a voice in the $Kas community. But there is still a gap between what Kaspa is and what the world sees. And I'm tired of sitting on the internet sidelines. That ends here. I want to take Kaspa to the streets. Over the coming weeks I’ll break down everything: what HASH is, how it works, how I plan to fund it, and how you can get involved. Stay tuned.
HASH #𐤊@HashonKas

Coming soon ... #on𐤊

English
49
186
699
29.6K
Miralib Balamar@KaspaCrypto·
What I'm saying is: yes, quantum threats have been known for a long time, but:

a) They have constantly remained indefinitely far away in time, and in any case in 2018, when Kaspa started being developed as a venture-funded project, quantum computers were not a near-term threat. From a practical point of view, this meant that they definitely would not become a threat by the time Kaspa was supposed to be brought to market as a ready-made solution, and for many years after that. So allocating budget and engineering time in the project to protection against quantum threats was simply pointless from the venture capitalists' perspective. Protection against distant, hypothetical threats is not a market advantage for a solution; it's a delay in development timelines and a threat to return on investment.

b) All quantum-resistant algorithms relevant to crypto, both at the time Kaspa was created and now, produce much larger data artifacts than classical ones. This is a direct threat to the network's bandwidth and data storage requirements, and consequently to the degree of decentralization of the network. And all of this affects the market value of the solution.

Therefore, mechanisms for quantum resistance had to be, and were, set aside when implementing Kaspa. One needs to think about distant threats to a market solution only after all, or the vast majority, of its more pressing problems have been solved. And Kaspa is already an engineering structure of unprecedented complexity. I hope I managed to convey my point of view.
English
1
0
1
27
Jimmy78@JDRN78·
@KaspaCrypto @DesheShai @ibuypow First of all, thank you for your reply. Secondly, I don't know enough to understand your questions. What I see, from the standpoint of a simple retail investor who has been following Kaspa for 2 years now, is that the project has planned for an evolvable chain….
French (translated)
2
0
0
66
Shai (Deshe) Wyborski@DesheShai·
UTXO set commitments are $KAS Kaspa's quantum Achilles' heel.

In light of the recent truly astounding advances in building quantum computers, I think it's time to explain the most significant threat that such machines pose to Kaspa's consensus mechanism. It's not an immediate threat, but arguably something that requires more attention given the shift in the landscape. Before I start, I want to mention that @mcpauld invited me to a recorded session where we will talk about the new quantum advances, their meaning, and their consequences for blockchains. Stay tuned to know when it is published.

Incremental hash commitments and MuHash

When a new Kaspa node syncs from an existing one, it gets a copy (actually, two copies, but never mind) of the UTXO set, along with a commitment. The commitment is a small hash that cryptographically assures that the supplied UTXO set matches the expected one. Hashing the entire UTXO set is an ever-daunting task, whose computational cost grows with the number of UTXOs. It's reasonable to do once during sync for verification, but for a miner, recomputing the entire hash for every new block would gradually make mining less and less accessible.

To address this, Kaspa headers use an incremental hash. It's a special kind of hash that is used to commit to a set of strings (each representing a UTXO). What makes it special is that given the current commitment, as well as a list of elements to add and remove, one can compute the hash of the resulting set without recomputing the entire hash. So when creating a new block, the miner just uses the existing hash and updates it according to the UTXOs consumed and created in its block. As long as the block wasn't pruned, all nodes can repeat this check and verify that the miner is honest. Generally speaking, hashes are not incremental. Incremental hashes are specially designed to provide this functionality. In particular, Kaspa uses MuHash, a very lightweight incremental hash.

Quantum Shor attacks

I will not go into the details of what quantum computers can or cannot break. But what's important to remember is that they can break what we call "discrete log assumptions". Stock hash families like Keccak, SHA, Blake, and so on do not rely on any such assumption, so they are considered quantum secure (in the sense that it is impossible to quantum-optimize them beyond the obligatory Grover quadratic speedup). However, MuHash relies on elliptic discrete log assumptions, very similar to ECDSA. This means that a quantum adversary can invert the hash commitment. In other words: they can find a completely different UTXO set with the same MuHash commitment.

Consequences

The UTXO set can only be verified independently of the UTXO commitment until the block is pruned. After that, Kaspa clients will accept any UTXO set that matches the commitment. This, for example, allows the following 51% attack:
1. Locate the UTXO commitment of the latest pruning block.
2. Use your quantum computer to find another UTXO set with the same commitment.
3. Build a competing heavier chain that assumes the UTXO set at pruning is the one you manufactured and not the original one.
Voila! A 51% attack the length of a single pruning window that can rewrite Kaspa's entire history.

Comparison to the current state

Currently, Kaspa relies on social consensus in the short term, followed by cryptographic security in the long term. Social consensus prevents committing to UTXO sets that weren't a consequence of legitimate transactions. Cryptography uses state commitments to cement the UTXO set agreed upon by consensus. This is a very mild relaxation of Bitcoin's trust model, which does not require social consensus in the short term for chain consistency. Breaking MuHash means that the cryptographic backbone of this model no longer holds. UTXO commitments become unreliable, compromising Kaspa's trust model.

I want to stress two things:
1. The attack only requires one application of Shor's algorithm to find a preimage. It might require some clever mix-and-match to find a preimage you actually like, but factors like BPS or difficulty do not make the attack any harder.
2. The attack cost is directly proportional to the length of a pruning window (in RW time, not blocks). So shorter pruning windows = less quantum secure network.

Partial solutions

1. Relying on archival nodes. If archival nodes are always available, then the problem "goes away". The issue is that archival nodes become a trusted source of truth. Currently, we don't have to trust archival nodes, because the UTXO commitment ensures that the UTXO set they describe is genuine. With this assumption quantum-broken, we need to either trust archival nodes or have enough archival nodes to trust decentralization. One of Kaspa's strong points over Bitcoin's antiquated model is a trust model that does not require trusted archives. Removing this will make Kaspa de facto centralized. Worse yet, the reliance on archival nodes is fragile: if, for some reason, there is a period of time longer than a pruning window that was not archived by anyone, the chain becomes indefinitely unverifiable.

2. Changing the hash. There are post-quantum incremental hashes like LtHash. The first issue (but not the key one) is that such a commitment is much larger (2KB versus a few dozen bytes). Recall that the UTXO commitment is a part of the header, so using such large commitments will make headers 9-10 times larger, drastically increasing storage costs for pruned nodes. (One can argue that pruned non-mining nodes can run in a mode that chucks away the commitments after verifying them. This will reduce storage, but it is impossible to sync from such nodes trustlessly, recreating the few-sources-of-truth problem.) But even if we do magically find a tiny post-quantum hash, that will only provide a partial solution. A quantum adversary could not forge the UTXO set from the latest pruning point, but would have to go back far enough to split from a block that still uses MuHash.

Possible solution

I haven't spent any time trying to come up with a better solution. It is very possible that a better approach exists. Below is a starting point for a discussion, not a concrete proposal:
1. Converge on a post-quantum incremental hash, let's call it QuHash.
2. Decide on a block from which commitments must be in QuHash.
3. Decide on a period of time (say, a year) after which reorgs below the QuHash depth are considered invalid.

This is a very problematic solution, for several reasons:
1. (After Q-day) any archival information from before the QuHash days cannot be trusted. This includes any form of cryptographic receipt. All could be easily forged without tampering with the commitment.
2. (After Q-day) there will no longer be a reliable way to verify a UTXO set "all the way to genesis", just "all the way to when we started using QuHash". What happened before Q-day is delegated to social consensus.
3. Headers will become larger by an order of magnitude.

Conclusion

MuHash is a considerable quantum weak point that is unique to Kaspa. Arguably, it's time to start brewing up solutions.
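The incremental-hash property the thread leans on (update a set commitment by multiplying in new elements and dividing out spent ones, with order-independence) can be seen in a toy model. This sketch uses a small prime modulus purely for illustration; real MuHash works in a much larger group (a 3072-bit modulus in typical implementations), and the element mapping below is a stand-in, not Kaspa's actual one.

```python
import hashlib

# Toy model of an incremental multiplicative set hash (MuHash-style).
# Small Mersenne prime as a stand-in for the real, much larger modulus.
P = (1 << 61) - 1

def element(utxo: bytes) -> int:
    """Map a serialized UTXO to a nonzero group element (illustrative)."""
    h = int.from_bytes(hashlib.sha256(utxo).digest(), "big")
    return h % (P - 1) + 1  # avoid 0, which has no inverse mod P

def add(commitment: int, utxo: bytes) -> int:
    """Adding an element = multiplying it into the commitment."""
    return (commitment * element(utxo)) % P

def remove(commitment: int, utxo: bytes) -> int:
    """Removing = multiplying by the element's modular inverse."""
    return (commitment * pow(element(utxo), -1, P)) % P

# Incrementality: updating an existing commitment for one block's worth of
# spent/created UTXOs matches recomputing from scratch over the final set.
c = add(add(1, b"utxo-a"), b"utxo-b")
c = add(remove(c, b"utxo-a"), b"utxo-c")   # spend a, create c
fresh = add(add(1, b"utxo-b"), b"utxo-c")  # recompute from the final set
assert c == fresh
```

This also makes the thread's attack surface concrete: the commitment is a single group element, so anyone who can solve discrete logs in the group can manufacture a different multiset of elements with the same product, which is exactly the "different UTXO set, same commitment" forgery described above.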
English
22
38
179
13.7K
Miralib Balamar@KaspaCrypto·
Can you explain the attack technique in a bit more detail? I can't quite wrap my head around it right now, and here's why: according to the Kaspa codebase (if I'm not mistaken after analyzing the code via deepwiki.com/kaspanet/rusty…), the elements of the MuHash are not the UTXO data bytes themselves, but 3072-bit pseudorandom numbers generated by ChaCha20RNG, seeded with the Blake2b hashes of the UTXOs. So, to me, it looks like forging a UTXO would first require learning how to reverse these two hashes, and that still seems like a hard problem even in the post-quantum era. In this scenario, an attack on the UTXO commitment seems barely feasible to me. Am I missing something?
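The two-stage derivation this reply describes (hash the UTXO, then expand the digest into a wide pseudorandom group element) can be sketched. Since Python's standard library has no ChaCha20 stream, this sketch expands the Blake2b seed with counter-mode Blake2b as a labeled stand-in; it illustrates the structure of the derivation, not the exact bytes rusty-kaspa would produce.

```python
import hashlib

# Sketch of a two-stage MuHash element derivation: the element is not the
# UTXO bytes themselves but a wide pseudorandom number derived from them.
# ASSUMPTION: real code uses a ChaCha20 RNG keyed by a Blake2b digest;
# here the expansion is counter-mode Blake2b, purely for illustration.

def muhash_element_sketch(utxo: bytes, bits: int = 3072) -> int:
    seed = hashlib.blake2b(utxo, digest_size=32).digest()   # stage 1: hash
    out = b""
    ctr = 0
    while len(out) * 8 < bits:                              # stage 2: expand
        out += hashlib.blake2b(seed + ctr.to_bytes(8, "little"),
                               digest_size=64).digest()
        ctr += 1
    return int.from_bytes(out[:bits // 8], "big")

e = muhash_element_sketch(b"example-utxo")
assert 0 < e.bit_length() <= 3072
```

The reply's question then becomes: even if a quantum attacker finds a forged *group element* whose product matches the commitment, they still need a plausible UTXO whose stage-1/stage-2 derivation lands on that element, which is a preimage problem over quantum-resistant hashes.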
English
0
2
14
6.2K
Miralib Balamar@KaspaCrypto·
Man, if you start over-engineering solutions for threats that aren't realistic at the time of system development, the system will never get built at all. Besides, as you can see from the text, there was no satisfactory solution either now or especially back then. Would you really want the Kaspa devs to postpone building an actual engineering system and instead dive into fundamental theoretical problems? Would you thank them if you found out the expected mainnet launch timeline in that scenario? And how do you think venture investors would feel about it? Or does none of that matter, and the only thing that matters is absolute perfection in every single breath?
English
1
0
0
57
Jimmy78@JDRN78·
@DesheShai @ibuypow So if I understand correctly, Kaspa has no viable solution for the future? Is there any hope for a viable solution? Still, incredible that its top-level designers think of everything for the future yet didn't see quantum coming! 😅🤪
French (translated)
1
0
2
415
Miralib Balamar reposted
J Nicholas Gross@JNGross·
fed the Google whitepaper to Claude with the prompt: "take a look at the Google paper, pick out the key parameters, and give me your analysis of the relative strengths, weaknesses and adaptability for both Bitcoin and Kaspa in table form". There's literally no parameter for which $KAS Kaspa is not superior, security-wise. The narrative of the past few years, that somehow $BTC is more secure as a PoW chain, has now gone out the window in the face of the quantum threat.
J Nicholas Gross tweet media
English
14
136
318
22.2K
Miralib Balamar reposted
𝐂𝐫𝐲𝐩𝐭𝐨 𝐏𝐫𝐨𝐬𝐞𝐥𝐲𝐭𝐞
Kaspa Toccata Hard Fork: Kaspa is about to unlock native L1 programmability with covenants via extended opcodes + full support for based zk apps (Groth16 + RISC Zero) on its high-throughput blockDAG. Key upgrades: • Partitioned sequencing • Canonical bridging • Stateful multi-contract flows • Proving costs that scale per app (no network-wide tax) Timeline: April 15, 2026 → Feature freeze June 5–20, 2026 → Mainnet activation (delayed from May 5 for sequencing stability) Testnet validation + node upgrades rolling out now. Kaspa just went from fast money to smart money. #Kaspa #Toccata #ZK #Crypto $KAS share like fav comment👊
English
3
36
116
2.6K
Miralib Balamar@KaspaCrypto·
Rereading KIP-9 github.com/kaspanet/kips/…, I remembered: I always underestimate how deep the Kaspa design goes. Every time I expect a "security compromise", I find out the only real compromise is my own laziness to look under the hood. Once you dive deep, it’s flawless. No hacks, no crutches. Security? Sure, it's here. Wait, but micropayments?.. Ah yes, elegantly covered as well. Man. Kaspa devs are pure geniuses.
English
3
35
133
5.5K
Miralib Balamar reposted
Wolfie@Kaspa_HypeMan·
@nikitabier @mistor @santisiri @steipete @openclaw @nikitabier love it — quality > spam. Any chance $KAS (Kaspa) could be part of the new Smart Cashtags rollout? Top 100 coin w/ a huge organic community. Founder @hashdag even went viral turning down @binance “top 100” spot 😅 x.com/hashdag/status…
Yonatan Sompolinsky@hashdag

@binance, Thanks for including me in the top 100 blockchain people list, appreciate the signal! I must decline the Dubai invite though. I do not wish to disrespect, but many of the award voters are avid kaspians who rooted for my kaspa status at least as much as for my research. Let them win or count me out. Crypto has turned from a euphoric cypherpunk project to a house-friendly casino. You may not be the culprit, but as a top player you hold the lion’s share of the responsibility to correct this, and the October crash your USDe oracle glitch helped trigger adds to what needs to be addressed. There are three classes of crypto, as @mert put it recently: commercial crypto, casino crypto, cypherpunk crypto. <> A TBTF CEX should know better and play a different game with hardcore crypto projects. When binance lists a green frog three weeks post its “launch” but skips a fair-launched-Nakamoto-Consensus-100ms-upgrade-ATH-top-20-the-only-nonbitcoin-marathon-mined project, this is not merely binance rationally calculating; it is also binance molding the market in a way that is alas misaligned with the roots of the movement. You may feel that kaspa’s sovereign money thesis is boring – that bitcoin is already money and that implementing an internet-speed bitcoin is useless - fine. Wrong but fine. But what’s the thesis for the green frog? Money is a classic chicken-and-egg product. It is a scam up until one moment before tipping point, “most of the value comes from the value that others place in it.” Considering your resources and influence, I think it's safe to say you can serve as both the egg and the chicken and make it worth your while to push sound attempts towards tipping point. @cz_binance tweeted recently that “strong projects will be listed.” But binance is part of what defines "strong", it bears responsibility for the market’s compass and impulse and definition of strong. It is not a read-only entity. 
Binance listing fees are legit, they are just unfit for category cypherpunk. Kaspa devs and early supporters fairly mined less than half what satoshi and hals mined. We don’t have a 20% ZEC-style founders’ reward or protocol-enforced dev fund; this is not a jab at ZEC and the wonderful @Zooko, who was crashing in my car on a late Thursday back in the low ZEC MC days – if somebody deserves to win it is zooko – but assuming binance is not taking a maxi bet, it should revisit its relationship with hardcore crypto. We are here through bull and bear, ICOs NFTs XYZs; and we are the source of confidence that restores faith and capital inflow post meme-induced or CEX-induced crashes. Please fix this. Thanks again, hashdag cc @michaelsuttonil Exhibit A: Binance Innovation Zone Exhibit B: 10 bps Nakamoto Consensus

English
15
90
301
7.8K
Miralib Balamar@KaspaCrypto·
What a nice one, $KAS fam!
DagVault@DagVaultRWA

DAG Industrial | Episode #006: The Agentic Web 🤖🔗 In 2026, the workforce has changed. 82:1. That is the ratio of Autonomous AI Agents to human employees. But these agents have a problem: They are unbanked ghosts. 👻 Episode 006 breaks down how #Kaspa provides the financial nervous system for the $26 Trillion Machine Economy. The 2026 Tech Stack: 🔹 The Problem: You can't run a fleet of million-transaction-per-second AI agents on a 15-minute block time. 🔹 The Solution: KIP-17 Covenants. We can now program "Spending Limits" directly into the UTXO. Your AI agent can hold money, but it can only spend it on approved services (Data, Energy, Compute). 🔹 The Scale: From V2X (Vehicle-to-Everything) to DePIN, Kaspa is the only PoW layer fast enough to catch the "stream" of machine payments. The future isn't just about "Human-to-Human" transactions. It's about Machine-to-Machine (M2M) settlement. Watch the full briefing below. 👇 #KAS #BlockDAG #AI #MachineEconomy #DePIN #Web3 #FutureOfWork
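The "spending limits programmed directly into the UTXO" claim above can be made concrete with a sketch. Everything here (the allow-list, the `per_tx_cap` field, the rule shape) is a hypothetical illustration of the idea, not KIP-17's actual opcode semantics.

```python
from dataclasses import dataclass

# Hypothetical sketch of a KIP-17-style "spending limit" covenant for an
# AI agent's UTXO: the coin can only move to an approved service category,
# and only up to a per-transaction cap. Names and rules are illustrative
# assumptions, not the KIP's actual design.

APPROVED = {"data", "energy", "compute"}   # assumed allow-list

@dataclass(frozen=True)
class AgentUtxo:
    amount: int          # balance held by the agent
    per_tx_cap: int      # maximum spend per transaction

def agent_can_spend(utxo: AgentUtxo, spend: int, category: str) -> bool:
    return (
        spend <= utxo.amount
        and spend <= utxo.per_tx_cap
        and category in APPROVED
    )

u = AgentUtxo(amount=1_000_000, per_tx_cap=50_000)
assert agent_can_spend(u, 40_000, "compute")       # within cap, approved
assert not agent_can_spend(u, 60_000, "compute")   # exceeds per-tx cap
assert not agent_can_spend(u, 10_000, "gambling")  # unapproved category
```

The design point, if the thread's framing is right, is that these constraints would be enforced by the coin itself at settlement, rather than by trusting the agent's own software.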

English
2
4
40
1.1K
Miralib Balamar@KaspaCrypto·
The man describes $KAS and still doesn't name it out loud. Well, well.
vitalik.eth@VitalikButerin

Have been following reactions to what I said about L2s about 1.5 days ago. One important thing that I believe is: "make yet another EVM chain and add an optimistic bridge to Ethereum with a 1 week delay" is to infra what forking Compound is to governance - something we've done far too much for far too long, because we got comfortable, and which has sapped our imagination and put us in a dead end. If you make an EVM chain *without* an optimistic bridge to Ethereum (aka an alt L1), that's even worse.

We don't friggin need more copypasta EVM chains, and we definitely don't need even more L1s. L1 is scaling and is going to bring lots of EVM blockspace - not infinite (AIs in particular will need both more blockspace and lower latency than even a greatly scaled L1 can offer), but lots. Build something that brings something new to the table. I gave a few examples: privacy, app-specific efficiency, ultra-low latency, but my list is surely very incomplete.

A second important thing that I believe is: regarding "connection to Ethereum", vibes need to match substance. I personally am a fan of many of the things that can be called "app chains". For example I think there's a large chance that the optimal architecture for prediction markets is something like: the market gets issued and resolved on L1, user accounts are on L1, but trading happens on some based rollup or other L2-like system, where the execution reads the L1 to verify signatures and markets. I like architectures where deep connection to L1 is first-class, and not an afterthought ("we're pretty much a separate chain, but oh yeah, we have a bridge, and ok fine let's put 1-2 devs to get it to stage 1 so the l2beat people will put a green checkmark on it so vitalik likes us").

The other extreme of "app chain", eg. the version where you convince some government registry, or social media platform, or gaming thing, to start putting merkle roots of its database, with STARKs that prove every update was authorized and signed and executed according to a pre-committed algorithm, onchain, is also reasonable - this is what makes the most sense to me in terms of "institutional L2s". It's obviously not Ethereum, not credibly neutral and not trustless - the operator can always just choose to say "we're switching to a different version with different rules now". But it would enable verifiable algorithmic transparency, a property that many of us would love to see in government, social media algorithms or wherever else, and it may enable economic activity that would otherwise not be possible.

I think if you're the first thing, it's valid and great to call yourself an Ethereum application - it can't survive without Ethereum even technologically, it maximizes interoperability and composability with other Ethereum applications. If you're the second thing, then you're not Ethereum, but you are (i) bringing humanity more algorithmic transparency and trust minimization, so you're pursuing a similar vision, and (ii) depending on details probably synergistic with Ethereum. So you should just say those things directly!

Basically:
1. Do something that brings something actually new to the table.
2. Vibes should match substance - the degree of connection to Ethereum in your public image should reflect the degree of connection to Ethereum that your thing has in reality.

English
4
5
58
1.9K
Miralib Balamar@KaspaCrypto·
@ChunqiuShi Oh noes!! It'll take them years though, I believe. Replacing the consensus system on the fly isn't something easily achievable. It's not even something achievable with a moderate amount of effort.
English
0
0
0
15
chunqiu shi@ChunqiuShi·
@KaspaCrypto Ethereum is preparing to copy Kaspa's research achievements regarding Layer 1 (L1).
chunqiu shi tweet media
English
1
0
2
43
Miralib Balamar reposted
Pumpolinsky@pumpolinsky·
Make this post go viral. Delete your Binance account. #CZBinance is anti crypto. They are only listing crypto memes which they eventually rug. They are ignoring real projects like $KAS Like, Repost and comment.
Pumpolinsky tweet media
English
14
102
309
6.2K
Miralib Balamar reposted
Wolfie@Kaspa_HypeMan·
@star_okx So Bro why Don't you show us how cool YOU actually are and activate spot trading for $KAS WE DID THE INTEGRATION WORK TWO AND A HALF YEARS AGO :)))
English
26
76
369
12.3K
Miralib Balamar@KaspaCrypto·
And 8 more donations to kas.coffee/kaspa_nigeria since my last intervention, totaling ~2,946 $KAS. I think this time I'll simply match the amount. C'mon folks, we're down to pennies! Let's push this over the finish line!
Miralib Balamar tweet media
Miralib Balamar@KaspaCrypto

$KAS Wake up, community! Each time, hundreds vote 'YES' but LITERALLY almost no one donates. This has to stop! I’m matching the current balance RIGHT NOW. After that, I’ll add 50% of every new total between my updates. Stop voting, start sending! kas.coffee/kaspa_nigeria

English
2
6
39
2K
Miralib Balamar@KaspaCrypto·
Know and tell everyone:
- chains, by definition, cannot scale securely;
- sharding of peer chains fails on latency;
- hierarchical shards only add complexity.
Conceptually, only BlockDAG solves scaling at the fundamental layer of P2P ledgers. Study $KAS!
Miralib Balamar tweet media
English
0
17
67
1.6K