Omar

2.4K posts


@Ozhar

Head of Business Development | Encrypted Capital Markets @zkSync

Joined June 2011
1.1K Following · 8K Followers
Omar reposted
vassilis (∎, ∆)@TziokasV·
It was great to meet @patrickjwitt and the President's Council of Advisors on Digital Assets team earlier today. As momentum builds around the CLARITY Act, the foundation for the next phase of the digital asset economy is taking shape. We at @zksync are building for a future where stablecoins and tokenized deposits coexist and thrive, enabling American banks of all sizes to stay competitive and build onchain securely and compliantly.
[image]

3 replies · 12 reposts · 51 likes · 4.2K views
Omar@Ozhar·
The past few days it looks like we're back to debating the purpose of blockchain. There is a reason the motto in the ZK universe has been: Don't Trust, Verify.

A common argument for permissioned ledgers is that transactions can be reverted when something goes wrong: a bug in a smart contract, a settlement error, a fat-finger trade. Fair enough. Operational controls can still be implemented even on Ethereum and ZKsync; the issuer fully controls the smart contracts, after all.

But let me ask a different question. When FTX collapsed, was the problem that someone needed to undo a transaction? Or was it that no counterparty could independently verify whether customer assets actually existed? When Archegos imploded, was the problem a software bug? Or was it that prime brokers had no way to see that the same concentrated risk was sitting across all of their books simultaneously? When Three Arrows Capital was weeks into insolvency before anyone noticed, was the missing capability a revert button? Or was it the ability to verify solvency claims without relying on the word of the person making them?

Revertibility solves for operational errors. That is a real category of problem, but it is not the category that produces systemic crises. The failures that actually destabilize markets are not mistakes someone wants to undo. They are information asymmetries, where one party's claims cannot be verified by anyone else until it is too late.

Zero-knowledge cryptography addresses this directly. A ZK proof of reserves allows a counterparty to confirm that assets back liabilities without trusting an attestation letter. A ZK proof of solvency lets a clearing network confirm a participant is not overleveraged without exposing their full position book. These are not governance decisions; they are mathematical guarantees.

There is a deeper issue worth considering as well. Revertibility requires someone with the authority to decide what gets reverted. In a software-bug scenario, that governance works fine. But in the scenarios that actually matter, the governing parties are often the ones who are compromised or conflicted. Governance that depends on trusted intermediaries fails precisely when trust has already broken down.
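The proof-of-reserves idea above can be sketched with a Merkle sum tree, a commitment structure commonly used for this purpose. This is a minimal illustrative sketch, not ZKsync's or any production scheme: every name here is hypothetical, and a real deployment would wrap the tree in ZK proofs so individual balances stay hidden from verifiers.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf(account_id: str, balance: int):
    # Each leaf commits to one account's balance.
    return (h(f"{account_id}:{balance}".encode()), balance)

def parent(left, right):
    # A parent commits to both child hashes AND the sum of their balances,
    # so the root pins total liabilities without listing any account.
    lh, lv = left
    rh, rv = right
    total = lv + rv
    return (h(lh + rh + total.to_bytes(16, "big")), total)

def build_root(leaves):
    nodes = list(leaves)
    while len(nodes) > 1:
        if len(nodes) % 2:
            nodes.append((h(b"empty"), 0))  # pad with a zero-balance leaf
        nodes = [parent(nodes[i], nodes[i + 1])
                 for i in range(0, len(nodes), 2)]
    return nodes[0]

# The operator publishes (root_hash, total_liabilities); a counterparty
# compares on-chain assets against the committed liabilities.
accounts = [("alice", 50), ("bob", 120), ("carol", 30)]
root_hash, total_liabilities = build_root([leaf(a, b) for a, b in accounts])
onchain_assets = 250
print(total_liabilities)                     # 200
print(onchain_assets >= total_liabilities)   # True: reserves cover liabilities
```

The key property is that the root binds the operator to one total: understating liabilities to fake solvency would change the root that account holders check their inclusion proofs against.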
12 replies · 16 reposts · 70 likes · 3.6K views
Brother Odin 🥷🏽@odin_free·
ZK will eat the world!
ALEX | ZK@gluk64

Canton founders claim ZK proofs are too risky for institutional finance. They have been making this argument to buyers and regulators, publicly and behind closed doors. It deserves a public answer. Let's see if the argument holds, and if Canton's infrastructure passes its own test.

The argument

Their case, stated fairly: ZKPs are complex. Bugs are inevitable in any sufficiently complex system. If a flaw exists in a proof system, it could go undetected because the underlying data is private. If it goes undetected, it spreads throughout the system. This creates systemic risk. Therefore, ZKPs cannot be used for critical financial infrastructure.

This is a real concern. Let's take it seriously and follow the logic.

The flaw in the logic

Strip away the ZKP-specific language and here is the story: Technology X can have implementation flaws. Technology X serves a mission-critical function. If it fails, the consequences are catastrophic. Therefore, Technology X can never be used.

Read it again. There is a hidden assumption doing all the work: that Technology X is your only line of defense. If this logic held, we would not have aviation. Fly-by-wire, engine controllers, autopilot: every one of these systems has bugs, is mission-critical, and can fail catastrophically. Nuclear reactor control systems, robotic surgery, radiation therapy dosing, implantable cardiac devices, and many other systems all run on software that can fail catastrophically. Yet they are still in use. How?

Redundancy and containment

These mission-critical systems are built on the explicit assumption that every component will eventually fail. They all rely on two things: redundancy and containment. Redundancy means multiple independent systems, each capable of catching a failure in the others. Containment means that when failure occurs, the blast radius is limited so it cannot become systemic.

This is the only question that matters for any mission-critical system: does your architecture have more than one line of defense?

Canton's architecture

Let's apply this test to Canton. Canton's privacy and integrity model relies on a single mechanism: trusted operators segregating data between participants. There is no cryptographic verification layer and no independent check. If a few operator keys in a validation domain are compromised, manipulated state propagates silently inside opaque chains of UTXOs with nothing watching. This is a real systemic risk, accelerated by the rise of AI-assisted cyberattacks. By Canton's own logic (a single point of failure with catastrophic consequences), this is the architecture that should concern regulators.

Prividium's architecture

Now look at how Prividium is built.

Redundancy. Prividium has three independent lines of defense. First, institutional partners operate Prividium nodes within their own security environments, the same infrastructure banks already trust and regulate. Second, zero-knowledge proofs provide cryptographic integrity verification as an independent layer on top, verifying operational security rather than replacing it. Third, as ZK proof systems standardize, multiple independent provers can verify the same computation; a flaw in one implementation gets caught by another.

Containment. Each Prividium instance is an individual chain operated by an individual institution. When institutions interact across chains, Prividium's interop layer implements inter-chain accounting mechanisms that are independently enforced by the participating institutions, asset issuers, or on-chain. Even an attacker who compromises a single institution's internal IT infrastructure and simultaneously finds a ZKP bug could only affect that one Prividium instance. The damage cannot propagate to the broader network.

The net balance: Canton has a single mechanism, no fallback, and silent failure propagation across the network. Prividium has layered defenses, independent verification, and a blast radius contained by design.

Importance of open standards

Multiple lines of defense only matter if each line is itself strong. What makes a technology strong? The depth of adversarial testing it has survived. Shaul points to a compiler-bug example in his post, and it actually illustrates this well. ZKsync embraced full EVM equivalence over a year ago, shaped precisely by the understanding that the more you deviate from an open standard, the larger your attack surface becomes.

And Ethereum is not battle-tested in some polite, academic sense. For over a decade, its smart contract infrastructure has been completely open to scrutiny by the most sophisticated adversarial actors in the world, with hundreds of billions of dollars at stake. Vulnerabilities and exploits fed directly back into the ecosystem: new audit standards, formal verification tools, compiler safeguards, and hardened design patterns. The EVM that exists today is the product of a decade of continuous adversarial stress testing at a scale no other smart contract platform has experienced.

Canton went the opposite direction. DAML is a proprietary smart contract language with a closed ecosystem and a fraction of the developer and security community. Every growing pain that Ethereum went through over the last ten years still lies ahead for DAML, except DAML will face it with orders of magnitude fewer eyes watching. Every maturity concern Canton raises about ZKPs applies to their own technology stack, with far less mitigation available. The safest technology is the one that has survived the longest under the harshest conditions. For smart contract infrastructure, that is Ethereum. It's not close.

So to answer the question directly: everyone agrees bugs exist. The question is whether your architecture has redundancy to catch them and containment to limit the damage when they slip through. Cryptographic verification provides both. Trust in operators provides neither.
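The redundancy point (a flaw in one implementation gets caught by another) can be illustrated with a toy sketch: two deliberately independent checkers for the same claim, accepted only on unanimity. All names here are hypothetical; real systems would run independently implemented ZK verifiers, not arithmetic checks.

```python
# Toy redundancy: the same claim ("claimed == x * y") is checked by two
# independently written code paths; any disagreement rejects the claim
# instead of silently accepting a bad result.

def checker_mul(x: int, y: int, claimed: int) -> bool:
    # First implementation: verify via multiplication.
    return x * y == claimed

def checker_div(x: int, y: int, claimed: int) -> bool:
    # Independent implementation: verify via division, so a bug in one
    # code path is unlikely to be mirrored in the other.
    return x != 0 and claimed % x == 0 and claimed // x == y

def verify(x: int, y: int, claimed: int) -> bool:
    votes = [checker_mul(x, y, claimed), checker_div(x, y, claimed)]
    return all(votes)  # unanimity required: one dissent is enough to reject

print(verify(6, 7, 42))   # True: both independent checkers agree
print(verify(6, 7, 43))   # False: the bad claim is rejected
```

The containment half of the argument is the same idea at the network level: each instance runs its own `verify`, so one compromised instance cannot make the others accept its state.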

5 replies · 2 reposts · 50 likes · 2.6K views
Omar reposted
Cyfrin Audits@cyfrin·
As of today, BattleChain testnet is LIVE. The pre-mainnet, post-testnet blockchain, where whitehats legally attack your smart contracts before they reach production. Deploy. Get attacked. Ship stronger. Here's why we built it, what it is, and how you can get involved 🧵
[GIF]

67 replies · 109 reposts · 472 likes · 110.4K views
Omar reposted
ALEX | ZK@gluk64·
Everything in this essay is correct, but take it one step further and you'll land at an unexpected conclusion. Consumer crypto solved "don't trust corporations." Guess what? Corporations don't trust corporations either.

Every major bank wants shared infrastructure for payments, settlement, and tokenized assets. None will accept a competitor controlling it. That's why consortium projects die: someone always ends up holding the keys. Public chains don't work directly because of compliance and privacy. Consortium chains don't work because every bank has seen this movie before. Early incumbents set the terms, late joiners absorb the cost. Nobody wants to join someone else's network.

The only remaining option is to execute in a controlled environment but anchor the results to a settlement layer that no participant, and no future coalition of participants, can ever capture. That's what Prividium does on Ethereum. An Ethereum account gives a consumer assets no corporation can seize. Prividium gives an institution a controlled environment on shared rails no counterparty can capture. Prividium is for enterprise users what an Ethereum account is for individuals.
Omid Malekan@malekanoms

x.com/i/article/2034…

7 replies · 15 reposts · 110 likes · 8.8K views
Omar reposted
ZKsync@zksync·
"Prividiums give banks their own private rails but they will interoperate with every other bank building directly on Ethereum." @gluk64 at #DCBlockchain Summit on why Ethereum is the only credibly neutral and global settlement layer for the next era of institutional finance.
24 replies · 36 reposts · 209 likes · 12.5K views
Omar reposted
Milk Road@MilkRoad·
This is HUGE and everyone slept on it! (Bookmark this before the algo buries it.)

Five U.S. banks just announced they're putting deposits onchain:
- Huntington Bancshares
- Old National Bancorp
- First Horizon
- M&T Bank
- KeyCorp

They're all piloting the Cari Network, a tokenized deposit platform built on ZKsync's Prividium. Not a selection of crypto startups. Not a DeFi protocol. Five regulated U.S. banks.

Here's what makes this different from every other blockchain-bank press release: tokenized deposits aren't stablecoins. They're actual bank deposits represented onchain. Your money stays insured. It stays inside the regulatory perimeter. The bank doesn't change, but the rails do. This is banks rebuilding their plumbing.

Why Prividium specifically? ZKsync's Prividium is a privacy-preserving layer built for institutions. Transactions are verified without exposing the underlying data. Regulators can still audit. Competitors can't see your positions. It's the piece that makes banks actually willing to show up.

Here's what this actually changes:
- Settlement that used to take days can happen in seconds: 24/7, including weekends and bank holidays.
- Bank-to-bank transfers become programmable. Smart contracts can move money automatically when conditions are met, no human approval required.
- Five regional banks means this isn't a one-off experiment. It's a coordinated network from day one.

The banks involved aren't giants. Huntington, M&T, KeyCorp: these are mid-tier regional banks with customer bases in the single-digit millions. But if this works, the big ones will be watching very closely. Onchain banking just went from theory to pilot.
ZKsync@zksync

Today marks a new chapter for U.S. banking. The Cari Network, developed alongside five regional banks, is building a new platform to bring tokenized deposits onchain. Secure. Private. Within the regulatory perimeter. Powered by ZKsync’s Prividium.

20 replies · 86 reposts · 349 likes · 38K views
Alejandro@ContraVibes·
@RyanSAdams Five regional banks tokenizing deposits sounds bullish until you realize the compliance stack alone will cost more than the deposits generate in yield. The real tell is whether they settle to L1 or stay siloed on app-chains that defeat the whole 'superchain' thesis.
2 replies · 0 reposts · 1 like · 457 views
RYAN SΞAN ADAMS - rsa.eth 🦄
There was a thesis that all banks are ledgers and will inevitably become chains connected to the global Ethereum superchain. Over $8 trillion in deposits at regional U.S. banks to bring onchain. Just getting started.
ZKsync@zksync

Today marks a new chapter for U.S. banking. The Cari Network, developed alongside five regional banks, is building a new platform to bring tokenized deposits onchain. Secure. Private. Within the regulatory perimeter. Powered by ZKsync’s Prividium.

30 replies · 54 reposts · 488 likes · 87.1K views
Omar@Ozhar·
Five US banks are now building a tokenized deposit network powered by ZKsync's Prividium, through the Cari Network led by former Comptroller of the Currency Gene Ludwig. M&T Bank, Huntington, KeyCorp, First Horizon, and Old National are designing infrastructure that enables banks to issue, transfer, and redeem tokenized deposits 24/7, while keeping them on the balance sheet as regulated bank liabilities, FDIC insured, and under existing supervisory oversight.

Banks have been exploring stablecoins to keep pace with demand for always-on money movement. But stablecoins, even under the GENIUS Act, are payment instruments, not banking instruments. They don't support credit creation through fractional-reserve lending. They carry issuer counterparty risk. And they don't receive the same balance-sheet treatment under Basel III as deposits do.

Until now, what prevented banks from moving to tokenized deposits was a fundamental infrastructure gap. Choose privacy and institutional control, and you lost interoperability. Choose interoperability on public chains, and you lost privacy and control. The combination of both, along with a neutral, open settlement layer, simply did not exist.

Prividium changes that. Anchored to Ethereum, it provides a permissioned environment with full data privacy, regulatory auditability, and interoperability with the broader digital asset ecosystem, without compromising institutional controls.

We see stablecoins and tokenized deposits playing complementary roles in every bank's onchain strategy. Tokenized deposits protect and grow the balance sheet within a fully private, bank-controlled environment. Stablecoins serve as the bridge when money needs to move beyond those rails. The Cari Network and its five design-partner banks represent the first institutions adopting this infrastructure to enable a new era of 24/7 digital money movement.
[image]

5 replies · 14 reposts · 100 likes · 5.5K views
Omar@Ozhar·
@andyyy It's because none of the systems they need to operate can integrate with onchain trading yet. Accounting, risk, compliance, ops: none of these functions can work onchain yet for the majority of institutions.
2 replies · 0 reposts · 3 likes · 370 views
Andy@andyyy·
This doesn't really make sense to me. Major exchanges like the NYSE are pushing into 24/7 trading, but institutions are pushing back. The sources cited claim resistance to instant settlement due to "prefunding requirements and increased operational costs" associated with enabling 24/7 trading. Retail will likely be the major adopters first, with institutions to follow. Why would you prefer T+1/2 settlement times???
[image]

34 replies · 0 reposts · 59 likes · 12.4K views
Eddie@DancingEddie_·
at some point I gotta delete these dating apps. 200+ matches across hinge & raya but none of it working out. only thing keeping me in this humiliation ritual is:
- irl is actually cooked. doesn't work
- bars are horrendous
- "maybe the next match will be it!" gross
111 replies · 4 reposts · 377 likes · 48.8K views
Omar@Ozhar·
@MikeIppolito_ sounds like this will be a boon for the audit industry
0 replies · 0 reposts · 0 likes · 90 views
Mippo 🟪@MikeIppolito_·
Everyone is assuming SaaS will convert into outcome based pricing. This sounds good, until you get into the weeds of how to calculate an outcome and assign economic value to it. Good luck.
3 replies · 0 reposts · 12 likes · 1.4K views
Omar@Ozhar·
We have recreated L2 rollups from first principles: batching transactions together and settling as one transaction to save on gas fees. They're even starting with optimistic batching first! At this rate they'll implement ZK in two years.
Blessing Adesiji@bleso_a

We just launched Circle Nanopayments on testnet, and I want to explain the core problem this solves.

Let's say you want to charge $0.01 for an API call, especially for agentic use cases. That seems completely reasonable. But the gas fee to process that payment costs $0.005. You just lost half your revenue to infrastructure. Now imagine you want to charge $0.001 per request. The gas fee is still $0.005. You are now losing money on every single transaction. The payment literally costs more than the service itself. This is why pay-per-use models have not worked at scale: the infrastructure economics prevent it.

The most natural way to price digital services is by usage:
- Pay per API call.
- Pay per computation.
- Pay per data query.

These models are what power agentic commerce, where developers and AI agents need a financial rail built for high-frequency, high-volume payments.

Nanopayments solves this through batched settlement. Instead of paying gas for every transaction, Circle's Gateway aggregates thousands of payments and settles them together in a single onchain transaction. The gas cost is distributed across all of those payments. Here is the result:
- 1,000 payments that would normally cost $10 in gas fees now cost $0.01 total.
- That brings the cost down to $0.00001 per payment instead of $0.01.

Gas is no longer the limiting factor.

The way it works is straightforward. You deposit USDC into a Gateway Wallet contract once. That is the only time you pay gas. After that, every payment is just a cryptographic signature. You sign a message authorizing the transfer, but nothing is immediately broadcast to the blockchain. The signature is verified instantly, you receive access to the service, and Gateway batches your payment with thousands of others for settlement later.

From a developer perspective, you make a request, the server responds with payment details, you sign the authorization, and you immediately receive the resource. The entire interaction happens in milliseconds. There is no need to wait for block confirmations, and there are no individual gas payments for each request.

What this enables is true usage-based pricing:
- You can charge $0.001 per API call and maintain margin.
- You can implement pay-per-token pricing for AI inference.
- You can charge per second for streaming services.

Pricing models that were economically impractical because of gas costs now become viable. This is particularly important for AI agents that need to autonomously pay for services. An agent cannot operate if every $0.01 payment carries $0.005 in overhead. When payments cost $0.00001 in overhead, the economics make sense. Agents can pay for compute, for data, and for API access, enabling an autonomous agent economy.

We are live on testnet today. Developers can start building with gas-free USDC transfers down to $0.000001. This is what makes usage-based pricing viable. developers.circle.com/gateway/nanopa… Reach out to me if you need help building with this.
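The batching economics described above can be sketched in a few lines. This is a hypothetical aggregator, not Circle's Gateway API: all names are invented, and it only shows how a flat per-settlement gas cost amortizes across off-chain payment authorizations.

```python
from dataclasses import dataclass, field

@dataclass
class BatchAggregator:
    """Hypothetical Gateway-style aggregator: payments are off-chain
    authorizations; gas is paid once per settled batch."""
    gas_per_settlement: float          # flat on-chain cost per batch
    pending: list = field(default_factory=list)

    def authorize(self, payer: str, amount: float) -> None:
        # In the real system this would be a verified signature,
        # not a bare tuple.
        self.pending.append((payer, amount))

    def settle(self):
        # One on-chain transaction settles every pending payment, so the
        # gas cost is split across the whole batch.
        n = len(self.pending)
        per_payment_gas = self.gas_per_settlement / n
        total = sum(amount for _, amount in self.pending)
        self.pending.clear()
        return total, per_payment_gas

# The thread's figures: $0.01 of gas spread across 1,000 payments.
agg = BatchAggregator(gas_per_settlement=0.01)
for i in range(1000):
    agg.authorize(f"agent-{i}", 0.001)       # 1,000 tenth-of-a-cent calls
total, per_payment_gas = agg.settle()
print(round(total, 6))              # 1.0 of revenue settled at once
print(round(per_payment_gas, 8))    # 1e-05, vs ~$0.005 per unbatched payment
```

Unbatched, each $0.001 call would pay roughly $0.005 in gas and lose money; batched, the per-payment overhead drops two orders of magnitude below the price being charged.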

4 replies · 3 reposts · 41 likes · 4.9K views
Omar@Ozhar·
@stacy_muur That's like saying you should only buy stocks if they have good dividends. You don't buy Tesla or Nvidia or Amazon because you're hoping to make money through quarterly dividends; you buy them because they're successful businesses that keep growing, so the price keeps growing.
1 reply · 0 reposts · 1 like · 418 views
Stacy Muur@stacy_muur·
Most crypto tokens are designed backwards. You make money by selling, not by holding. Which means every other holder is your competition from day one. Founders are timing their vesting unlock, investors are timing theirs, and retail is trying to front-run both. Nobody is actually aligned; everyone is just playing musical chairs.

The fix isn't complicated in theory: if holders earn by holding rather than selling, the incentive flips. You stop trying to outmaneuver other holders and start trying to grow the protocol. Your competition becomes other protocols, not your own community.

The reason it hasn't happened comes down to two things:
• Distributing revenue to holders looked too much like an unregistered security under existing law. That legal risk killed the idea before it started for most teams.
• The infrastructure to do it cheaply didn't exist. Gas costs on mainnet Ethereum made programmatic revenue distribution impractical.

L2s solved the second problem, and L1 is scaling. Regulation is close to solving the first. The teams paying attention to this now have a real head start. Worth reading the full piece ↓
brian flynn@Flynnjamm

x.com/i/article/2025…

47 replies · 25 reposts · 258 likes · 50.9K views
Omar reposted
Andrey Lazorenko@AndreyLazorenko·
When building @ADIChain_, we had 3 non-negotiable requirements:
1) institutional compliance
2) EVM compatibility
3) path to private L3s

@zkSync's Atlas checked every box. This stack combines a modular architecture with validity proofs fast enough for financial settlement. Institutions get app-specific L3s with sequencer control, a path to private execution, and high TPS with sub-second blocks, while builders get full EVM compatibility and native Ethereum interoperability. The architecture unlocks interconnected L3s that outsource proving, communicate natively, scale horizontally, and support private execution when needed. Another critical factor was a proven team with production experience. We now have a foundation to build what institutions need: scalable rails that connect to the broader Web3 ecosystem.
[image]

12 replies · 15 reposts · 65 likes · 15K views