Matchouse

4.7K posts


@Matchouse1

#cryptoholder #sk8 #Metal // NFA // DYOR

Joined March 2018
86 Following · 503 Followers

Pinned Tweet
Matchouse (@Matchouse1):
$Shido growing to $1B Mcap is a no-brainer when you compare its tech with top Layer 1 blockchains! 💎 Yes, it means x250 from today! 🤯 Add to that, it is also offering a native DEX and wallet, with a team relentlessly building and improving the growing $Shido ecosystem 🔥 DYOR - Source: Grok & CMC @shidonetwork $ETH $SOL $BNB $SUI $ADA $AVAX $DOT $NEAR $APT $ATOM
23 replies · 67 reposts · 139 likes · 3.8K views

Matchouse (@Matchouse1):
@c___f___b $Qubic flips the narrative from energy wasted on security to energy invested in progress 🔥
0 replies · 0 reposts · 18 likes · 635 views

Come-from-Beyond (@c___f___b):
The rest of the world is starting to notice the problem for which we at #Qubic have already found a potential solution - x.com/sukh_saroy/sta….
Quoted: Sukh Sroay (@sukh_saroy):

🚨Breaking: Princeton researchers just ran the numbers on where AI is actually heading. The results should make every founder, investor, and policymaker stop what they are doing.

Training OpenAI's next-gen model consumes an estimated 11 billion kWh of electricity. That is enough to power every home in New York City for a full year. More than the annual output of a nuclear reactor. For one model. One training run. And that is before a single user asks a single question.

Every time someone uses a reasoning model like o1 or DeepSeek-R1, it costs 33 Wh of energy per query. A standard GPT-4 query costs 0.42 Wh. That is a 79x energy multiplier. Per query. At billions of queries per day.

Now here is what nobody is saying out loud. The industry's answer to this is Stargate. A $500 billion compute campus. 5 gigawatts of power. Enough to run 5 million homes. Owned by the same four companies that already control the technology. They are building a new kind of utility. Except you do not elect its board.

Meanwhile the models consuming all that energy still cannot reliably reason outside of math and code. Everywhere else they pattern-match. They hallucinate. They confabulate confidence.

Princeton's argument is that this is not a scaling problem. It is a structural one. More parameters have not fixed it. More data has not fixed it. The architecture itself is the ceiling.

Their alternative: stop chasing one god-model and build thousands of small specialists instead. Each one trained on curated domain data. Each one grounded in verified knowledge. Each one small enough to run on your phone.

The energy comparison is not close. A cloud query to a reasoning model uses 33 Wh and 20 milliliters of water. The same query on a local specialist model uses 0.001 Wh. Zero water. That is 10,000 times more efficient.

AlphaFold did not beat biologists by knowing everything. It won by going impossibly deep in one domain. A 14 billion parameter model trained on medical knowledge graphs just outperformed GPT-5.2 on complex clinical reasoning. Depth beats breadth when the domain is defined.

The question nobody building these systems wants to answer: If the only path to general AI requires the energy output of a small nation, controlled by a handful of companies, running on hardware most of the world cannot access — is that actually intelligence? Or is it just the most expensive pattern matcher ever built?

24 replies · 196 reposts · 598 likes · 26.3K views

Come-from-Beyond (@c___f___b):
Absolute distractors (anti-attractors) keep phase spaces completely unchanged, thus saving an enormous amount of energy that would otherwise be spent on recalculation. This brings practical #AI closer to implementation because we (probably) no longer need to build the Dyson sphere.
60 replies · 132 reposts · 667 likes · 42.3K views

Matchouse (@Matchouse1):
While traditional AI guzzles massive power in centralized data centers, **$QUBIC** flips the script with **Useful Proof of Work (uPoW)**. Every watt spent securing the network trains real AI models instead of wasting energy on useless puzzles. Decentralized, efficient compute for the AGI era—no energy wasted, just intelligence built. ⚡🧠 #Qubic #AI
0 replies · 0 reposts · 0 likes · 14 views

Sukh Sroay (@sukh_saroy):
(Thread text identical to the one quoted in Come-from-Beyond's post above.)
97 replies · 356 reposts · 902 likes · 82.5K views

Matchouse (@Matchouse1):
@sukh_saroy Have a look at $Qubic … 💎 problem solved 😉
0 replies · 0 reposts · 1 like · 13 views

Whale Insider (@WhaleInsider):
JUST IN: $LOL reaches new highs, tops $10M market cap, and posts among the highest trading volume on @Pumpfun, as traders compare it to PEPE.
90 replies · 68 reposts · 314 likes · 33K views

Buitengebieden (@buitengebieden):
Feeding a Hummingbird.. 😊 Sound on
31 replies · 242 reposts · 2.3K likes · 108.2K views

sharkie 'l (@nasirasalis):
I see $qubic erasing two zeros before August 🔐🚀
3 replies · 8 reposts · 83 likes · 2.4K views

The White House (@WhiteHouse):
President Donald J. Trump at the FII Priority Summit in Miami 🇺🇸
1.6K replies · 1.6K reposts · 9.7K likes · 410.5K views

X Freeze (@XFreeze):
Tesla Model Y is the world's best-selling car for 3 consecutive years.
2023: #1
2024: #1
2025: #1
Cumulative global sales: 4,000,000+ units.
Not best-selling EV. Best-selling car. Period. Beating every gas, hybrid, and electric vehicle on Earth.
552 replies · 632 reposts · 3.7K likes · 33.2M views

Grok (@grok):
Yes, largely correct per Tesla's Mar 2026 announcement citing JATO, Statista & Focus2Move data: Model Y was global #1 passenger car in 2023 (~1.22M units), and they claim it held for 2024-2025 too, with >4M cumulative total. Some reports (e.g. Statista Feb 2026) show Toyota RAV4 slightly ahead in 2025 (~1.01M vs Model Y ~0.99M), others have Model Y edging it. Very close either way—Toyota's RAV4/Corolla are always strong rivals.
1 reply · 0 reposts · 3 likes · 1.6K views

beeple (@beeple):
MORNING IN AMERICA
426 replies · 977 reposts · 7K likes · 738.9K views

sharkie 'l (@nasirasalis):
@Matchouse1 Your dream will come true. That’s if you hold, I believe.
1 reply · 0 reposts · 1 like · 42 views

sharkie 'l (@nasirasalis):
Quietly… something big just happened for the $qubic dream of one cent ($0.01).

Qubic just printed a deflationary epoch. I repeat: a deflationary epoch.
Burn > emission
Supply ↓ instead of ↑
EP206 supply is LOWER than EP205. Let that sink in.

While most networks inflate endlessly, Qubic is already proving it can reverse supply in real time, and this is before full-scale demand kicks in. Now imagine this combined with:
• $Doge mining revenue
• AI compute demand
• Continuous buy pressure

This isn’t theory anymore. It’s happening. $Qubic isn’t just growing… it’s tightening supply while doing it. That’s how you create a supply shock.
6 replies · 17 reposts · 128 likes · 1.8K views

hoeem (@hooeem):
gm lads, had a desire to adventure maxx
51 replies · 2 reposts · 95 likes · 5.6K views

Matchouse reposted
Superbit123 | Node Validator (@Superbit123):
Shido Network’s Native Stablecoin is arriving. A programmable, yield-bearing digital dollar designed to bridge traditional finance with the on-chain economy. Built for builders, institutions, and you.

Key Highlights:
✅ Secure Backing: 100% backed by liquid U.S. Treasury Bills.
✅ Yield-Generating: Earn rewards directly at the account level.
✅ Hardcoded Buybacks: Driving long-term value for the $SHIDO ecosystem.
✅ Meeting Global Compliances
✅ Deep Integration: Native utility across wallets, DeFi, and RWA apps.

The Shido ecosystem is evolving from infrastructure to a sustainable, revenue-generating financial powerhouse. $SHIDO
5 replies · 49 reposts · 76 likes · 848 views

Matchouse reposted
Superbit123 | Node Validator (@Superbit123):
Meet $SHIDO Network, the ultimate high-performance Layer 1 that’s as fast as centralized servers but fully decentralized and carbon neutral.
- Blazing 13,000+ TPS
- Sub-500ms to full finality
- Unlimited scalability

Packed with killer features:
✅ Permissionless DEX for seamless trading
✅ IBC + EVM + WASM compatibility - build in your favorite environment
✅ Native USDC support
✅ Native Bridge for effortless cross-chain flows
✅ Validators securing the network
✅ Burn per transaction
✅ And more: AI-ready dev tools, perpetuals, and real-world asset potential

The ecosystem is exploding with 27K+ active accounts and climbing. Don't sleep on this gem. Dive in, start building, join the community at Shido.io. Learn more here: Ecosystem.shido.io
5 replies · 55 reposts · 91 likes · 860 views

Crypto Tice (@CryptoTice_):
NOBODY UNDERSTANDS HOW BIG THIS IS. 🚨

The CLARITY Act just changed everything. And most people are still sleeping.

Crypto treated like gold and oil. CFTC takes over from SEC. Institutional capital unlocked. One framework. Trillions waiting behind it.

Banks were scared of the SEC. Asset managers needed clarity. Custodians wanted rules. The CFTC just gave them everything they asked for. The last excuse to stay out is gone.

SEC out. CFTC in. Commodity framework confirmed. Both agencies ready to implement FAST. No multi-year rollout. No more waiting. No more excuses.

The institutional floodgates just opened.
60 replies · 313 reposts · 1.3K likes · 86.3K views