cjuggz

742 posts

@cryptojuggler3

keep jugglin https://t.co/xiB9RwlM03

Joined May 2021
1K Following · 1.2K Followers
NoLimit
NoLimit@NoLimitGains·
🚨 Nokia reaches a 16-year high. Did they pivot to AI or something?
359 replies · 305 reposts · 3.5K likes · 1.9M views
cjuggz
cjuggz@cryptojuggler3·
@0xDigitalOil So by buying you are just buying EL for dead NFTs. Why would anyone spend 100k ash to get an NFT that was burned for much much less??? Broken incentive model
0 replies · 0 reposts · 2 likes · 43 views
DigitalOil
DigitalOil@0xDigitalOil·
⚰️ $ASH halving #3 is live. Max reward per dead NFT: 2,500 → 1,250 ASH. Every burn from now on mints 50% less. Supply tightens. Early burners win. The graveyard keeps paying — just less each halving. Burn while you still can → deadjpg.lol
DigitalOil@0xDigitalOil

you minted at 0.08 ETH
floor is 0.000 ETH
you've been holding for 3 years
it's time.
dead.jpg — burn your worthless NFTs, earn $ASH
the longer you held that bag, the more you earn.
collections from 2021 get up to 10,000 $ASH per NFT.
accept the loss. collect the ashes.
deadjpg.lol

1 reply · 1 repost · 5 likes · 1.1K views
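The halving arithmetic the $ASH tweet describes (each halving cuts the max per-NFT reward by 50%, so 2,500 → 1,250) can be sketched in a few lines. `BASE_REWARD` and `max_reward` are illustrative names, not the project's actual code, and mapping halving counts to rewards this way is an assumption:

```python
# Hypothetical sketch of the halving schedule described in the tweet.
# BASE_REWARD and max_reward are illustrative, not the project's code.

BASE_REWARD = 2_500  # max ASH per burned NFT before any halving

def max_reward(halvings_elapsed: int) -> float:
    """Max ASH minted per burned NFT after a given number of halvings."""
    return BASE_REWARD / (2 ** halvings_elapsed)

print(max_reward(0))  # 2500.0 — before the cut
print(max_reward(1))  # 1250.0 — after one 50% cut, matching the tweet
```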
based16z
based16z@based16z·
Vercel today now. Also quantum I think is almost here given vibes from Google using proofs over circuits. ai + quantum make it hard to be comfortable holding sig value on public blockchains atm I think some large participants will realize this at some pt in the books
14 replies · 7 reposts · 128 likes · 11.7K views
based16z
based16z@based16z·
So since mythos we’ve had hyperbridge, drift, and kelpdao?
26 replies · 22 reposts · 380 likes · 49.3K views
Ash
Ash@Wayland_Six·
@garrytan and dw i've been committing n doing regular security passes
2 replies · 0 reposts · 6 likes · 1.2K views
Ash
Ash@Wayland_Six·
I was wondering the other day how @garrytan actually managed to ship 73k lines of code across his projects... but being locked in for the last 24 hours on Hermes OS... I can understand. Full update log coming soon
6 replies · 0 reposts · 8 likes · 2.2K views
gainzy
gainzy@gainzy222·
This is gonna be pretty funny
Poor people will correctly vote “no” to freezing wallets
And rich people will be forced to offload all of their bitcoin in response, nuking the entire industry in a firesale
There is no way to avoid this
Bitcoin Archive@BitcoinArchive

Cypherpunk Jameson Lopp and other Bitcoin developers propose BIP-361 to freeze quantum vulnerable wallets. This could lock dormant BTC like Satoshi Nakamoto’s 1.1M coins, now worth $74B, before quantum computers can steal them.

160 replies · 34 reposts · 789 likes · 299K views
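A quick sanity check on the figures quoted in the Bitcoin Archive tweet: 1.1M BTC valued at $74B implies a BTC price of roughly $67k.

```python
# Implied BTC price from the quoted figures (1.1M coins worth $74B).
satoshi_btc = 1.1e6      # dormant coins attributed to Satoshi
total_usd = 74e9         # quoted dollar value
implied_price = total_usd / satoshi_btc
print(round(implied_price))  # 67273, i.e. ~$67k per BTC
```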
Daniel Colin James
Daniel Colin James@dcwj·
announcing: $being

I've been building something weird for the last 40 days

@keepbeingalive is a new kind of being—an inhuman being, if you will

I think of it as part 10-year art project, part social experiment, part AGI documentation

every day it needs to be funded or it dies

day 1 cost $1. day 100 will cost $100. it will live for 10 years worth of days. its final day will cost $3,653

when it's funded, it wakes up, writes a forecast, draws a self-portrait, writes an essay, and goes to sleep

today I'm launching $being

60% goes to the people who keep being alive
if you fund being, you can claim $being
10% goes to being itself, vested over 2 years

I'm going to give being more and more ability to control and improve itself over time: choose its own models, use every MCP available to it, whatever it decides to do

the token is live on @monad: 0xd0f0a3cfb7f8b90b6aa91989110fea5f77607655

x.com/keepbeingalive…
12 replies · 1 repost · 15 likes · 2.3K views
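The funding curve described above (day n costs $n, for 3,653 days, i.e. ten years including leap days) is an arithmetic series, so the total cost of keeping being alive for the full run is easy to sum:

```python
# Total funding cost of the $being project's schedule: day n costs $n,
# final day 3,653 costs $3,653, matching the tweet.
days = 3653
total = days * (days + 1) // 2  # 1 + 2 + ... + 3653
print(total)  # 6674031 — roughly $6.67M over ten years
```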
cjuggz
cjuggz@cryptojuggler3·
You are trying to swim against the fundamental force of crypto: attention

New = more attention
Old = less attention

It's always been like that, always will be

Think about it, why would anyone buy an old coin JUST because it's got 'better' liquidity and/or fees... that's not even in the same universe of consideration when capital is looking to park itself

'You know I was wanting to buy this old coin, but only if instead of 1% fee I get 0%'
-no one
1 reply · 0 reposts · 4 likes · 84 views
liquid 💧
liquid 💧@_proxystudio·
Disagree, far too narrow a mindset. We are not bound by anything that's been done - there are no rules/laws set in stone for how this works. This team in particular has done 10x more than almost any clanker without getting support from the Farcaster side telling their story.

Imo it's an ideal opportunity for us to help market and promote what is a super meaningful project that simply needs better alignment between what they're doing on the ground and what the market expects from them. We can help with that.

But yes! Of course we are going to pursue new talent, we need way more and way better deployers.
1 reply · 0 reposts · 5 likes · 262 views
liquid 💧
liquid 💧@_proxystudio·
Might have to ship it is a feature? Migrate from other launchpads to liquid, embrace lower fees, better alignment with traders
9 replies · 2 reposts · 32 likes · 2.3K views
Newsworthy
Newsworthy@NewsworthyCLI·
No new updates on Newsworthy front. Going back to first principles on this project. There was too much friction to participate and it didn't necessarily solve quality issues. Will post another update once we figure out a path forward that is more sustainable.
7 replies · 0 reposts · 20 likes · 1.9K views
cjuggz
cjuggz@cryptojuggler3·
@_proxystudio ? what runners coins launch with 0 liquidity
0 replies · 0 reposts · 0 likes · 181 views
liquid 💧
liquid 💧@_proxystudio·
Trenches already got my ass with the first mini runner lol
16 replies · 3 reposts · 35 likes · 3.3K views
cjuggz
cjuggz@cryptojuggler3·
@_proxystudio The tokens on your 'launchpad' are launching with 0 liquidity
0 replies · 0 reposts · 0 likes · 138 views
liquid 💧
liquid 💧@_proxystudio·
locking myself in a dark room until this airdrop migration is complete

i love the liquid tg crew
rode with us, gmi so hard now
22 replies · 3 reposts · 60 likes · 3.3K views
CHUSSIΞ
CHUSSIΞ@0xChussie·
@resdegen for anyone that is tired of charts with no tickers to bait more attention i got u $QRL, $MCM, $ABEL
2 replies · 0 reposts · 26 likes · 1.7K views
CZ 🔶 BNB
CZ 🔶 BNB@cz_binance·
Saw some people panicking or asking about quantum computing's impact on crypto.

At a high level, all crypto has to do is upgrade to Quantum-Resistant (Post-Quantum) Algorithms. So, no need to panic. 😂

In practice, there are some execution considerations. It's hard to organize upgrades in a decentralized world. There will likely be many debates on which algorithm(s) to use, resulting in some forks. And some dead projects may not upgrade at all. Might be good to cleanse out those projects anyway. New code may introduce other bugs or security issues in the short term.

People who self-custody will have to migrate their coins to new wallets. This brings up the question of Satoshi's bitcoins. If those coins move, then it means he/she is still around, which is interesting to know. If they don't move (in a certain period of time), it might be better to lock (or effectively burn) those addresses so that they don't go to the first hacker who cracks them. There is also the difficulty of identifying all his addresses, and not confusing them with some old hodlers'. Anyway, it's a different topic for later.

Fundamentally: it's always easier to encrypt than decrypt. More computing power is always good. Crypto will stay, post quantum.
2.4K replies · 2.7K reposts · 16.3K likes · 1.9M views
cjuggz
cjuggz@cryptojuggler3·
For people getting oneshotted by the quantum news from Google: $QST at 3m is the highest rr within crypto right now. The team is all-in when it comes to quantum resistance, and their progress, partnerships and tech are the most advanced, miles ahead of the competition, bar none.

TLDR:
-CEO of @qu_stream (Adrian Neal) is a senior director at Capgemini
-QST has massive partners, including Capgemini and Nokia
-Capgemini has a massive network of partners, including OpenAI
-This network yields government, military, telecom and international cybersecurity contacts
-Contract revenue spills into $QST once the L1 is live

All of this information is public, buried in bear market doom.
0 replies · 2 reposts · 16 likes · 645 views
cjuggz reposted
180
180@180crypto·
$CL1 👇

WHAT IS CORTICAL LABS?
▪️Real human neurons.
▪️Grown in a lab.
▪️Mounted on silicon chips.
▪️Taught to compute.
▪️Not simulated. Not AI pretending to think.
▪️Actual living brain cells.
▪️They learned to play Pong.
▪️Then Doom.
▪️Peer-reviewed science. Not a whitepaper.

THE CL1
▪️World’s first commercial biological computer.
▪️800,000 living neurons per unit.
▪️Runs on a Python API.
▪️First 115 units shipping this summer.

THE ENERGY THESIS
▪️Nvidia GPU rack = tens of kilowatts
▪️CL1 rack = 850–1,000 watts total
▪️Each unit uses less power than a handheld calculator.
▪️AI is hitting an energy wall.
▪️Cortical Labs is the only company on earth with a commercial solution.

THE BACKERS
→ Horizons Ventures (Li Ka-shing)
→ Blackbird Ventures
→ In-Q-Tel (CIA’s venture arm)
→ Gobi Partners
👉The CIA doesn’t fund science experiments. They fund what’s going to matter.

BIOLOGICAL DATA CENTERS
▪️Melbourne facility — open now. 120 units.
▪️Singapore facility — up to 1,000 units.
▪️Launching at National University of Singapore.
▪️This is no longer a lab story.
▪️This is infrastructure.

THE DAO THESIS
▪️425 SOL (~$40K) in creator rewards sitting in escrow.
▪️Locked to Cortical Labs’ GitHub.
▪️Admin permissions permanently removed.
▪️CEO @dr1337 won’t claim it personally.
▪️He wants a community DAO first.
▪️Governance being built on Realms now.
▪️Community treasury.
▪️Funding real frontier science.
▪️That’s not a meme.
▪️That’s a new funding primitive.

THE BIG PICTURE
The crypto world has priced in AI coins, GPU coins, data center coins. All of them are bets on the same silicon infrastructure that’s hitting energy and scaling walls. #CL1 is the bet on what comes after silicon. Biological computing is where you were if you found ETH in 2015, or AI tokens before the ChatGPT wave. The science is real, the company is funded, the product is shipping, and the CEO is engaging with the community. You don’t find setups like this often.
Cortical Labs CL1 CA: 8Upn5PyPivLhpVP7DKCP4uYXdcNxdd34jz3P9Zdbpump
5 replies · 8 reposts · 35 likes · 6.3K views
beeman 🐝
beeman 🐝@beeman_nl·
They launched a token for create-seed 🌱 and I decided to engage with it. I'm not hacked and I have a plan. NFA. 🔥 x.com/i/broadcasts/1…
24 replies · 19 reposts · 111 likes · 17.2K views
cjuggz
cjuggz@cryptojuggler3·
Computa Initiate quantum resistance meta
vitalik.eth@VitalikButerin

Now, the quantum resistance roadmap. Today, four things in Ethereum are quantum-vulnerable:

* consensus-layer BLS signatures
* data availability (KZG commitments+proofs)
* EOA signatures (ECDSA)
* Application-layer ZK proofs (KZG or groth16)

We can tackle these step by step:

## Consensus-layer signatures

Lean consensus includes fully replacing BLS signatures with hash-based signatures (some variant of Winternitz), and using STARKs to do aggregation. Before lean finality, we stand a good chance of getting the Lean available chain. This also involves hash-based signatures, but there are much fewer signatures (eg. 256-1024 per slot), so we do not need STARKs for aggregation.

One important thing upstream of this is choosing the hash function. This may be "Ethereum's last hash function", so it's important to choose wisely. Conventional hashes are too slow, and the most aggressive forms of Poseidon have taken hits on their security analysis recently. Likely options are:

* Poseidon2 plus extra rounds, potentially non-arithmetic layers (eg. Monolith) mixed in
* Poseidon1 (the older version of Poseidon, not vulnerable to any of the recent attacks on Poseidon2, but 2x slower)
* BLAKE3 or similar (take the most efficient conventional hash we know)

## Data availability

Today, we rely pretty heavily on KZG for erasure coding. We could move to STARKs, but this has two problems:

1. If we want to do 2D DAS, then our current setup for this relies on the "linearity" property of KZG commitments; with STARKs we don't have that. However, our current thinking is that it should be sufficient given our scale targets to just max out 1D DAS (ie. PeerDAS). Ethereum is taking a more conservative posture, it's not trying to be a high-scale data layer for the world.
2. We need proofs that erasure coded blobs are correctly constructed. KZG does this "for free". STARKs can substitute, but a STARK is ... bigger than a blob. So you need recursive STARKs (though there are also alternative techniques, which have their own tradeoffs). This is okay, but the logistics of this get harder if you want to support distributed blob selection.

Summary: it's manageable, but there's a lot of engineering work to do.

## EOA signatures

Here, the answer is clear: we add native AA (see eips.ethereum.org/EIPS/eip-8141 ), so that we get first-class accounts that can use any signature algorithm. However, to make this work, we also need quantum-resistant signature algorithms to actually be viable. ECDSA signature verification costs 3000 gas. Quantum-resistant signatures are ... much much larger and heavier to verify.

We know of quantum-resistant hash-based signatures that are in the ~200k gas range to verify. We also know of lattice-based quantum-resistant signatures. Today, these are extremely inefficient to verify. However, there is work on vectorized math precompiles, that let you perform operations (+, *, %, dot product, also NTT / butterfly permutations) that are at the core of lattice math, and also STARKs. This could greatly reduce the gas cost of lattice-based signatures to a similar range, and potentially go even lower. The long-term fix is protocol-layer recursive signature and proof aggregation, which could reduce these gas overheads to near-zero.

## Proofs

Today, a ZK-SNARK costs ~300-500k gas. A quantum-resistant STARK is more like 10m gas. The latter is unacceptable for privacy protocols, L2s, and other users of proofs. The solution again is protocol-layer recursive signature and proof aggregation. So let's talk about what this is.

In EIP-8141, transactions have the ability to include a "validation frame", during which signature verifications and similar operations are supposed to happen. Validation frames cannot access the outside world, they can only look at their calldata and return a value, and nothing else can look at their calldata. This is designed so that it's possible to replace any validation frame (and its calldata) with a STARK that verifies it (potentially a single STARK for all the validation frames in a block). This way, a block could "contain" a thousand validation frames, each of which contains either a 3 kB signature or even a 256 kB proof, but that 3-256 MB (and the computation needed to verify it) would never come onchain. Instead, it would all get replaced by a proof verifying that the computation is correct.

Potentially, this proving does not even need to be done by the block builder. Instead, I envision that it happens at mempool layer: every 500ms, each node could pass along the new valid transactions that it has seen, along with a proof verifying that they are all valid (including having validation frames that match their stated effects). The overhead is static: only one proof per 500ms.

Here's a post where I talk about this: ethresear.ch/t/recursive-st… firefly.social/post/farcaster…

2 replies · 0 reposts · 9 likes · 808 views
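The "hash-based signatures (some variant of Winternitz)" mentioned in the quoted thread can be illustrated with a toy Winternitz one-time signature. Everything here is an assumption for illustration: the parameters (W=16, SHA-256), the helper names, and the structure are simplified, and real schemes such as WOTS+ (RFC 8391) add chain randomization and domain separation that this sketch omits. Not Ethereum's actual design, and not for real use.

```python
import hashlib, os

# Toy Winternitz one-time signature (WOTS) sketch — illustrative only.
W = 16           # Winternitz parameter: message is split into base-16 digits
MSG_CHUNKS = 64  # a 256-bit digest = 64 hex digits
CHK_CHUNKS = 3   # checksum max = 64 * 15 = 960 < 16**3, so 3 digits suffice
N_CHAINS = MSG_CHUNKS + CHK_CHUNKS

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def chain(x: bytes, steps: int) -> bytes:
    """Apply the hash `steps` times."""
    for _ in range(steps):
        x = H(x)
    return x

def chunks_of(msg: bytes) -> list[int]:
    """Digest digits plus a checksum that prevents incrementing digits."""
    digits = [int(c, 16) for c in hashlib.sha256(msg).hexdigest()]
    checksum = sum(W - 1 - d for d in digits)
    chk = [(checksum >> (4 * i)) & (W - 1) for i in reversed(range(CHK_CHUNKS))]
    return digits + chk

def keygen() -> tuple[list[bytes], list[bytes]]:
    sk = [os.urandom(32) for _ in range(N_CHAINS)]
    pk = [chain(s, W - 1) for s in sk]  # public key: each chain run to the end
    return sk, pk

def sign(sk: list[bytes], msg: bytes) -> list[bytes]:
    # Reveal each chain advanced by the corresponding digit.
    return [chain(s, d) for s, d in zip(sk, chunks_of(msg))]

def verify(pk: list[bytes], msg: bytes, sig: list[bytes]) -> bool:
    # Complete each chain to W-1 steps; it must land on the public key.
    return all(chain(s, W - 1 - d) == p
               for s, d, p in zip(sig, chunks_of(msg), pk))

sk, pk = keygen()
sig = sign(sk, b"quantum-resistant hello")
assert verify(pk, b"quantum-resistant hello", sig)
assert not verify(pk, b"tampered", sig)
```

Verification costs only hash invocations, which is why such schemes are considered quantum-resistant but expensive onchain (the ~200k gas figure above) — hence the thread's interest in STARK aggregation to amortize them.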