
Ankur Banerjee
@ankurb
Senior Product Director @zama. Co-founder @cheqd_io. Co-chair of Technical SteerCo @DecentralizedID. Ex @FinTechLabLDN, @inside_r3, @Accenture

Blockscout now supports ERC-7984 confidential tokens built with Fully Homomorphic Encryption by @zama. Balances and transfer amounts stay encrypted, while activity remains verifiable onchain 😎 Because privacy and transparency don’t have to be opposites. Learn more 👇 blog.blockscout.com/zama-confident…

Data breach at FICOBA (the French national bank account registry) leads to 1.2 million records leaked. 🤯

“…the perpetrator (or perpetrators) obtained login credentials belonging to a civil servant authorized to use the database and then used those credentials to explore its contents.”

Why a single civil servant's account should be allowed to access or export 1.2 million records without triggering any data loss prevention alarms is a massive red flag. 🚩 If a private company with SOC 2 or ISO 27001 certification allowed a single admin to export 1.2 million PII records without triggering an immediate lockout or requiring multi-party authorization, it would fail its audit instantly. Why should government infosec standards be lower than those of most modern startups?!

We need to stop "protecting" the perimeter and start making the database blind.

- Stop acting as a custodian of data. Move to a model where the government verifies a proof provided by the resident, rather than holding the raw data in a central "honey pot."
- With Fully Homomorphic Encryption, e.g. @zama, the government can query the database without ever seeing the raw PII. The data remains encrypted even during computation.

Governments have shown time and again that they cannot be trusted with centralized databases. It’s time to move from "Trust us, we’re the government" to "Trust the cryptography." helpnetsecurity.com/2026/02/19/fic…
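The "compute on data while it stays encrypted" idea can be illustrated with a toy additively homomorphic scheme. This is a deliberately insecure mini-Paillier in pure Python (tiny primes, hypothetical balances), not Zama's FHE stack — Paillier only supports addition under encryption, whereas FHE supports arbitrary computation — but it shows the principle: the registry holds only ciphertexts, an analyst aggregates them without decrypting any individual record, and only the key holder can open the result.

```python
from math import gcd

# Toy Paillier cryptosystem — additively homomorphic, NOT secure
# (tiny primes, fixed randomness) and NOT Zama's FHE; illustration only.
p, q = 17, 19
n = p * q                                      # public modulus
n2 = n * n
g = n + 1                                      # standard generator choice
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1), secret
mu = pow(lam, -1, n)                           # lambda^-1 mod n, secret

def enc(m, r):
    """Encrypt message m (0 <= m < n) with randomness r coprime to n."""
    assert 0 <= m < n and gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    """Decrypt with the secret values lam and mu."""
    x = pow(c, lam, n2)
    return ((x - 1) // n) * mu % n

# Hypothetical registry: only ciphertexts are stored. Multiplying
# ciphertexts adds the plaintexts underneath, so an analyst can compute
# an aggregate without ever seeing a single record in the clear.
balances = [120, 45, 80]
cts = [enc(b, r) for b, r in zip(balances, [3, 5, 7])]
ct_sum = 1
for c in cts:
    ct_sum = (ct_sum * c) % n2

print(dec(ct_sum))  # 245 — the sum, computed over encrypted records
```

Only the holder of `lam`/`mu` (e.g. a resident, or a key split across parties) can decrypt; the analyst who ran the aggregation never saw raw PII.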

Breaking News: Dozens of former workers at Noma, one of the top-ranked restaurants in the world, say its chef physically abused employees. nyti.ms/3OWQksf

Your confidential app as your daily driver! Powered by the $ZAMA protocol.

🔐 Shield your ERC-20 tokens on your favourite chain
💸 Send confidential payments whose amounts nobody can track
📩 Request payments via QR codes or payment links
👁 Decrypt your balance only when you choose

This is how I’ve bundled the holy grail of cryptography, FHEVM, into an app to give you freedom and confidentiality! Follow @zpayyapp and stay tuned. DM if you want to be part of the closed testing on Android. iOS is undergoing submission, so a TestFlight build is coming soon as well.

I believe there’s so much unnecessary confusion, frustration, and debate around age verification because we're treating two very different problems as one:

1/ Stopping kids from seeing inappropriate content
2/ Stopping adults from pretending to be kids

I get why these are discussed together: too much of (2) increases the risk of (1). But not all platforms face both challenges, certainly not at the same level, and treating them as the same problem leads to the wrong solutions and the wrong tradeoffs.

I’m not a policy maker, but from our work at Persona, I’ve seen that the challenges, risks, and solutions for each of these problems are wildly different.

Keeping kids from inappropriate content is a household-level problem. I don’t want to downplay the risks of social media or exposure to adult content. However, sacrificing broad privacy to solve what is fundamentally a parental controls problem doesn’t feel like a great bargain.

Stopping adults from impersonating kids is a platform-level problem. It jeopardizes the safety and integrity of the community, and at its core it's fraud, where adults have far more resources than kids. Unfortunately, the more effective solutions tend to compromise more privacy. The best approaches evaluate how much of a tradeoff is worthwhile given the risks.

When the risks of a technology don’t match the benefits of the problem it solves, public concern is justified. Applying fraud prevention techniques to what should be a parental controls problem is overreach. And a half-baked solution to adult impersonation is possibly worse: it’s security theatre where privacy is sacrificed but minimal assurance is gained.

The more I work on this and the more I hear from all of you, the more I believe that if some privacy must be lost, some privacy should be gained elsewhere in return. The right framework is one that splits knowledge to prevent abuse.

No single organization should know both:
1/ who you are
2/ what you are doing

If Persona has to know who you are, we should make sure we don’t know what you’re doing or what app you’re using. And if a platform knows what you’re doing, they shouldn’t know who you are.

This is not where the world is at today, and this framework is by no means perfect. But I think it’s better, and I’d love your feedback as we build it.
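The split-knowledge idea can be sketched as a toy credential flow. Everything here is hypothetical (the function names, the HMAC construction, the `over18` attribute) and deliberately simplified: an identity verifier checks who you are, then mints a bearer token carrying only an attribute and a random id, never an identity; the platform accepts the token and learns only the attribute. A real deployment would use blind signatures or zero-knowledge proofs so that even the issuer cannot link a token back to a session, and public-key verification so the platform never contacts the issuer.

```python
import hashlib
import hmac
import secrets

# Toy sketch of split-knowledge age verification — NOT a real protocol.
# Real systems would use blind signatures or ZK proofs for unlinkability.

ISSUER_KEY = secrets.token_bytes(32)  # held only by the identity verifier

def issue_credential(identity_doc):
    """Issuer learns WHO you are, then mints a token that carries only
    the attribute 'over18' plus a random id — no identity inside."""
    assert identity_doc["age"] >= 18
    token_id = secrets.token_hex(16)
    payload = f"over18:{token_id}"
    mac = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    # Crucially, the issuer stores no mapping from token_id to identity.
    return payload, mac

def platform_accepts(payload, mac):
    """Platform learns WHAT you're doing ('bearer is over 18') and
    nothing else. (Toy: shares the issuer key; a real scheme would
    verify against a public key instead.)"""
    expected = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload.startswith("over18:") and hmac.compare_digest(mac, expected)

payload, mac = issue_credential({"name": "Alice", "age": 34})
print(platform_accepts(payload, mac))  # True — no name ever reaches the platform
```

The point of the sketch is the separation of columns: the issuer's records contain identities but no platform activity, and the platform's records contain activity but no identities.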
