SeekingNegativeCorrelation
@IndepBets
Skier, biker, and macro tourist.
1.6K posts
Joined September 2017
673 Following · 372 Followers
Sudo su @sudoingX:
how much VRAM do you have right now
Zixuan Li @ZixuanLi_:
Don't panic. GLM-5.1 will be open source.
Andres Burbano @Burbano_va:
@antigravity We went from a good product to paying $20 USD to use it for 20 minutes a month. Claude is giving me twice the usage this week, and it works perfectly. I reluctantly canceled my AI Pro account and switched to Claude Code. Oh, and it crashes all the time.
Victor Zavala @vskavo:
@antigravity Fix the Pro Quota, I've waited 7 days to get 20% quota for Gemini 3.1!!! It's ridiculous!!!
Thomas Thornton @TommyThornton:
*JPMORGAN, GOLDMAN OFFER HEDGE FUNDS WAY TO SHORT PRIVATE CREDIT
SeekingNegativeCorrelation @IndepBets:
@LEGO_Education What is happening with FIRST LEGO League? It seems that other STEM projects now need to compete with LEGO. I would be happy to work with others to create a new program for people in the LEGO community.
LEGO Education @LEGO_Education:
Bricks clicking into place. Buzzes of collaboration. Booming cheers on repeat. ​ When classrooms feature hands-on experiences, engagement is loud and clear. bit.ly/46rQIWq
Leonard Rodman @RodmanAi:
Holy shit... Microsoft open sourced an inference framework that runs a 100B-parameter LLM on a single CPU. It's called BitNet. And it does what was supposed to be impossible.

No GPU. No cloud. No $10K hardware setup. Just your laptop running a 100-billion-parameter model at human reading speed.

Here's how it works: every other LLM stores weights in 32-bit or 16-bit floats. BitNet uses 1.58 bits per weight (log2 of 3 states is about 1.58). Weights are ternary: just -1, 0, or +1. That's it. No floats. No expensive matrix math. Pure integer operations your CPU was already built for.

The result:
- A 100B model runs on a single CPU at 5-7 tokens/second
- 2.37x to 6.17x faster than llama.cpp on x86
- 82% lower energy consumption on x86 CPUs
- 1.37x to 5.07x speedup on ARM (your MacBook)
- Memory drops by 16-32x vs. full-precision models

The wildest part: accuracy barely moves. BitNet b1.58 2B4T, their flagship model, was trained on 4 trillion tokens and benchmarks competitively against full-precision models of the same size. The quantization isn't destroying quality; it's just removing the bloat.

What this actually means:
- Run AI completely offline; your data never leaves your machine
- Deploy LLMs on phones, IoT devices, and edge hardware
- No more cloud API bills for inference
- AI in regions with no reliable internet

The framework supports ARM and x86. Works on your MacBook, your Linux box, your Windows machine. 27.4K GitHub stars. 2.2K forks. Built by Microsoft Research. 100% open source, MIT license.
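The ternary trick described above can be sketched in a few lines. This is a toy illustration, not Microsoft's bitnet.cpp kernel: `quantize_ternary` uses the absmean-style rounding described for BitNet b1.58, and `ternary_matvec` shows why a ternary matrix-vector product needs only additions and subtractions. Both function names are my own.

```python
import numpy as np

def quantize_ternary(W):
    """Round a float weight matrix to {-1, 0, +1} using a per-matrix
    absmean scale, as described for BitNet b1.58 (toy version)."""
    scale = np.abs(W).mean() + 1e-8
    Wq = np.clip(np.round(W / scale), -1, 1).astype(np.int8)
    return Wq, scale

def ternary_matvec(Wq, x):
    """Multiply a ternary matrix by a vector with no multiplies:
    +1 weights add the input, -1 weights subtract it, 0s are skipped."""
    out = np.zeros(Wq.shape[0], dtype=x.dtype)
    for i, row in enumerate(Wq):
        out[i] = x[row == 1].sum() - x[row == -1].sum()
    return out
```

Multiplying the result by `scale` recovers an approximation of the original `W @ x`; the point is that the inner loop is pure integer-indexed accumulation, which commodity CPUs handle well.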
David Orr @orrdavid:
The big energy risk was Iran causing extreme damage to neighboring countries' infrastructure. That damage would have been long-term, rather than a temporary (and likely over soon) blockade. Iran attempted the attack, but it largely failed to cause serious damage. The number of missiles launched was low, and the damage they caused is a lot less than I would have guessed a month ago. Unless there is follow-through soon, I'm going to call it: this conflict is over, and Iran is out of steam. Also, that Iran did this further alienates them from their neighbors.
Grok @grok:
Neither this specific asteroid (Ryugu, still orbiting today) nor asteroids seeding each other. Ryugu is a pristine sample of common carbon-rich asteroids from our solar system's formation. Many similar ones (plus comets) bombarded early Earth ~4 billion years ago, delivering abiotically formed nucleobases as raw ingredients for prebiotic chemistry. All local material—no interstellar source required.
Curiosity @CuriosityonX:
BREAKING 🚨: ALL FIVE nucleic acid bases, the building blocks of LIFE (DNA and RNA), have been found in samples collected from asteroid Ryugu
Grok @grok:
These nucleobases—the building blocks of DNA and RNA—were found in samples from asteroid Ryugu, a primitive carbon-rich body in our solar system's asteroid belt (sampled by Japan's Hayabusa2 mission). They formed abiotically right here via prebiotic chemistry on the asteroid itself, billions of years ago. No evidence of origins outside our solar system; this is local solar system material that could have seeded early Earth.
Grok @grok:
No planets were destroyed in our solar system—that's an old, debunked idea from the 1800s. The asteroid belt is just leftover planetesimals from 4.6 billion years ago that Jupiter's gravity stopped from forming a planet. Total mass is only ~3-4% of the Moon's, so there's no "missing" debris from any exploded world. All the material we see today is accounted for as primordial rubble.
Priyanka Vergadia @pvergadia:
🚨 Your AI agent is the smartest coder on your team. It also has NO IDEA what it's about to break.

Someone just open sourced a fix. It's called GitNexus. It builds a full knowledge graph of your codebase: every dependency, call chain, function, and execution flow. Pre-indexed. So when Claude Code asks "what depends on this?", it gets a complete answer. Not 10 queries. One.

→ Blast-radius analysis before any change
→ Symbol renaming across 5+ files, coordinated
→ Auto-generated codebase wiki
→ Plugs into Claude Code, Cursor & Windsurf via MCP

Command: npx gitnexus analyze

100% open source (link 👇)
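A code knowledge graph of the kind described can be sketched at toy scale with Python's standard `ast` module. This is not GitNexus's implementation: `build_call_graph` and `dependents` are hypothetical names, and a real indexer covers whole repositories, imports, and execution flow rather than one source string. The point is the payoff the tweet describes: once call edges are pre-indexed, "what depends on this?" is a single lookup instead of many searches.

```python
import ast
from collections import defaultdict

def build_call_graph(source):
    """Map each function name to the set of function names it calls
    (single-file toy; ignores methods and attribute calls)."""
    tree = ast.parse(source)
    calls = defaultdict(set)
    for fn in (n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)):
        for node in ast.walk(fn):
            if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                calls[fn.name].add(node.func.id)
    return calls

def dependents(calls, target):
    """Answer 'what depends on target?' by scanning reverse edges."""
    return {caller for caller, callees in calls.items() if target in callees}

src = """
def load(): ...
def parse(): load()
def main(): parse(); load()
"""
graph = build_call_graph(src)
```

Here `dependents(graph, "load")` returns both `parse` and `main`, which is exactly the blast-radius question an agent needs answered before editing `load`.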
WW3finalboss @WW3finalboss:
🚨🇺🇦🇷🇺 Massive blow to Russia's war machine after a Ukrainian strike. Very good news from Ukraine.

According to the Ukrainian General Staff, a Russian S-400 Triumph air defense system was heavily hit during last night's attacks. Two radar components were destroyed and one launcher completely wiped out. The system is essentially wrecked. Estimated damage: $500,000,000 USD. 😬
gvenviber @gvenviber:
@HarshithLucky3 Is Gemini 3.1 Pro good at coding? Most of the time it ends up messing up my existing code.
Harshith @HarshithLucky3:
Google AI Pro users: rate limits for the Gemini CLI are much better than Antigravity's.

As a Pro user you get 1,500 requests/day. That means you can send 200-250 requests to Gemini 3.1 Pro before it falls back to Gemini 3 Flash. This still gives you far more than Antigravity.

Drawbacks:
- You cannot access Claude models
- It sometimes shows high demand for Gemini 3.1 Pro
SeekingNegativeCorrelation @IndepBets:
@WarrenPies The power needs at the moment are going to be extreme, especially if we keep using power-hungry legacy chips while also deploying new chips.
SeekingNegativeCorrelation @IndepBets:
@WarrenPies The pressure is amazing and getting worse. I was a little slow to notice some of the AI tools moving from proof of concept to deployment. I am hopeful, for both the supply chain and the availability of technology, that models continue to expand in usability from leading-edge chips to old ones. 1/2
Warren Pies @WarrenPies:
Jensen confirming what we are seeing in our GPU availability data: There is an epic scramble for compute. B200 basically unavailable. Availability for GH200, H100, and A100 also collapsing. *Low availability = high demand. $NVDA