HilesFiles

5.9K posts

@MichaelHiles

These are my own opinions, ✝️, @go10xts, @conduit_network, DePIN, RWA tokenization, Constitutionalist, anti-authoritarian, ΜΟΛΩΝ ΛΑΒΕ, coffee snob

Cincinnati USA · Joined February 2009
4.2K Following · 4.3K Followers
HilesFiles retweeted
Naval @naval ·
A lot of software is about to get a lot better, right before it becomes unnecessary.
873
1.1K
16.2K
696.1K
HilesFiles retweeted
SOFX @SOFXnetwork ·
A U.S. Marine Corps release on March 23, 2026 detailed how Lance Cpl. Eirick Schule developed a 3D-printed replacement for the MUOS antenna mast in April 2025, cutting costs from $5,644 per unit to about $10 and reducing delivery time from over 7 months to 10 hours, with 107 units produced and an estimated $600,000 saved.
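The quoted savings are easy to sanity-check from the figures in the release. A quick arithmetic sketch, using only the numbers stated above:

```python
# All figures come from the Marine Corps release quoted in the post.
LEGACY_COST = 5_644   # $ per MUOS antenna mast via traditional procurement
PRINTED_COST = 10     # approximate $ per 3D-printed replacement
UNITS = 107           # units produced

savings = UNITS * (LEGACY_COST - PRINTED_COST)
print(f"Savings for {UNITS} units: ${savings:,}")  # prints "Savings for 107 units: $602,838"
```

That works out to about $602,838, consistent with the "estimated $600,000 saved" in the release.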
SOFX tweet media
267
1K
8.6K
714K
HilesFiles retweeted
Jameson Lopp @lopp ·
Consider that the dumbest people you know are repeatedly being told "You're absolutely right!" by LLMs.
245
2.3K
23.9K
641K
HilesFiles retweeted
Fight With Memes @FightWithMemes ·
Whoever did this, I don't want to end up on your bad side. 😳
76
2.7K
22.8K
558.8K
HilesFiles retweeted
God of Prompt @godofprompt ·
🚨 Holy shit… Columbia University just dropped one of the most unsettling papers on AI inference I’ve read in a long time. They proved that the entire private AI inference industry built the wrong thing.

Prior methods encrypt the full transformer: 280GB per query, 60-second latency, enterprise-grade security theater. GPT, Gemini, Qwen, and Mistral independently converged to nearly identical internal representations. One linear equation connects them. Sub-second inference. 1MB of communication. Same security guarantees.

The private AI inference problem is real. Hospitals can't send patient data to OpenAI. Banks can't send transaction records to Google. Legal firms can't send case files to Anthropic. The solution the industry built: encrypt everything, every layer, every attention head, every weight, using homomorphic encryption and secure multi-party computation. The result: 280GB of encrypted communication per query, 60-second latency, and infrastructure costs that make production deployment practically impossible.

Columbia University found the shortcut everyone missed. The Platonic Representation Hypothesis (the observation that large models trained on enough data tend to converge toward a shared statistical understanding of the world) turns out to be exploitable. GPT, Gemini, Qwen, Mistral, and Cohere, trained independently on different data with different architectures for different objectives, developed internal representations with CKA similarity scores between 0.595 and 0.881. That's not close. That's essentially the same space.

If the spaces are the same, you don't need to encrypt the model. You learn a single affine transformation, one matrix that maps your model's internal representations into the provider's space. Encrypt that matrix. Send it. The provider runs one linear classification operation on encrypted data and returns the encrypted prediction. You decrypt locally. The transformer never gets encrypted. The weights never get exposed. The query never leaves your control in readable form.

HELIX is the system they built on this insight. During training, the client encrypts their embeddings from public data and sends them to the provider, who computes the alignment map under encryption and returns it. During inference, the client applies the alignment locally, encrypts the transformed representation, and sends it. The provider applies a linear classifier homomorphically and returns the encrypted prediction. Multiplicative depth of one. No bootstrapping required. 128-bit security by CKKS standard.

→ Prior methods' communication cost: 280.99GB per query (Iron), 25.74GB (BOLT), 68.6GB (MPCFormer)
→ HELIX communication cost: less than 1MB per query
→ Prior methods' latency: 20-60+ seconds per query
→ HELIX latency: sub-second
→ Cross-model CKA similarity: 0.595 to 0.881 across GPT, Gemini, Qwen, Mistral, Cohere
→ Text generation quality: 60-70% of single-model baseline for high-compatibility pairs
→ Tokenizer compatibility predicts generation quality with r=0.898

The finding that should end careers: models above 4B parameters with tokenizer compatibility above a 0.7 exact-match rate can generate coherent text across model families using only a linear transformation. Qwen encoding. Llama decoding. No fine-tuning. No weight sharing. No data transfer. Just matrix multiplication applied to the boundary between two independently trained systems that accidentally became the same thing.
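Stripped of the encryption, the core trick described above is just a least-squares affine fit between two embedding spaces. A minimal NumPy sketch with synthetic arrays standing in for the two models' embeddings of the same public texts (HELIX performs this fit under CKKS homomorphic encryption; nothing here is the paper's actual code, and the dimensions are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: embeddings of the SAME public texts from two
# independently trained models. HELIX's premise is that one space is
# (approximately) an affine image of the other.
d_src, d_tgt, n = 64, 48, 500
W_true = rng.normal(size=(d_src, d_tgt))
X_src = rng.normal(size=(n, d_src))                           # "client" model embeddings
X_tgt = X_src @ W_true + 0.01 * rng.normal(size=(n, d_tgt))   # "provider" model embeddings

# Fit the affine alignment map by ordinary least squares
# (append a bias column so the map is affine, not just linear).
X_aug = np.hstack([X_src, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(X_aug, X_tgt, rcond=None)

# Inference-time step: map a new client embedding into the provider's space.
# In HELIX this transformed vector is what gets encrypted and sent.
x_new = rng.normal(size=(1, d_src))
x_aligned = np.hstack([x_new, np.ones((1, 1))]) @ W

resid = np.linalg.norm(X_aug @ W - X_tgt) / np.linalg.norm(X_tgt)
print(f"relative alignment residual: {resid:.4f}")
```

The provider-side classifier then only needs one matrix-vector product on the aligned representation, which is what keeps the homomorphic multiplicative depth at one.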
God of Prompt tweet media
42
125
639
61K
HilesFiles retweeted
Rimsha Bhardwaj @heyrimsha ·
🚨BREAKING: A developer on GitHub just turned your WiFi router into a full-body surveillance system. It's called RuView. It uses the WiFi signals already in your room to detect human poses, track breathing, measure heart rate, and see through walls. Not a concept. Not a research paper. Working code you can run right now.

Here's what this thing actually does:
→ Tracks full 17-point body pose using only WiFi signals
→ Detects breathing rate (6-30 BPM) without touching anyone
→ Measures heart rate (40-120 BPM) from across the room
→ Sees through walls, furniture, and debris up to 5 meters deep
→ Tracks multiple people simultaneously with zero identity swaps
→ Self-learns from raw WiFi data. No labeled datasets needed

Here's how it works: WiFi signals pass through your room and hit the human body. The body scatters those signals differently based on position, breathing, even heartbeat. RuView reads that scattering pattern and reconstructs everything. A mesh of 4 ESP32 nodes ($48 total) gives you 360-degree coverage with 12 measurement links, 20 Hz updates, and sub-30mm precision.

Here's the wildest part: it has a disaster response mode called WiFi-Mat. It detects survivors trapped under rubble through concrete walls, classifies injury severity using the START triage protocol, and estimates 3D position. The kind of tool that saves lives after earthquakes.

The Rust implementation processes 54,000 frames per second. That's 810x faster than the Python version. The entire Docker image is 132 MB. The AI model fits in 55 KB of memory. Runs on an $8 ESP32 chip. Train once, deploy in any room. No retraining. No recalibration. 1,100+ tests. 15 Rust crates on crates.io. SHA-256 verified capability audit. 100% open source.
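The breathing-rate claim rests on standard spectral estimation: chest motion modulates the received signal amplitude periodically, and a Fourier peak in the 6-30 BPM band recovers the rate. A toy sketch on a synthetic signal at the 20 Hz update rate quoted above (this is not RuView's implementation and not real WiFi channel data):

```python
import numpy as np

fs = 20.0                        # 20 Hz update rate, as quoted in the post
t = np.arange(0, 60, 1 / fs)     # one minute of samples
true_bpm = 15                    # simulated breathing rate

# Synthetic amplitude signal: a 0.25 Hz breathing oscillation plus noise,
# standing in for the periodic modulation of a WiFi channel measurement.
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * (true_bpm / 60) * t) + 0.3 * rng.normal(size=t.size)

# Remove the DC component, take the magnitude spectrum, and search only
# the physiologically plausible 0.1-0.5 Hz band (6-30 BPM).
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
band = (freqs >= 0.1) & (freqs <= 0.5)
est_bpm = 60 * freqs[band][np.argmax(spectrum[band])]
print(f"estimated breathing rate: {est_bpm:.1f} BPM")  # prints "estimated breathing rate: 15.0 BPM"
```

One minute of data at 20 Hz gives a frequency resolution of 1/60 Hz, i.e. 1 BPM, which is why longer observation windows sharpen this kind of estimate.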
Rimsha Bhardwaj tweet media
85
576
2.8K
198.3K
HilesFiles @MichaelHiles ·
LGA Pre-check @ 7AM
HilesFiles tweet media
2
0
2
138
HilesFiles @MichaelHiles ·
3 out of 3 trips on @AmericanAir in March delayed. This was the last one booked on them in advance. Cya AA, my last flight on you.
2
0
2
1K
HilesFiles @MichaelHiles ·
2015 Dev Community: You should build on @Firebase, it's awesome!
Me: No way man, @Google has a really terrible habit of killing products. Too much risk.
Google in 2026: RIP Firebase
0
0
0
31
Julie Barrett @juliecbarrett ·
🚨America's newest Digital ID proposal just dropped in the US Senate. Republican Senator Marsha Blackburn has introduced a new AI bill that bundles 17 different policies into one massive, 291-page bill. This is just like all the other tech bills we've been seeing: a massive Digital ID framework, with universal age verification as the key to accessing the tools. The bill includes mandatory age verification for every existing account, with accounts frozen until users verify their age.
Julie Barrett tweet media
402
1.6K
2.7K
296.5K
HilesFiles retweeted
Sentient @sentient_agency ·
The CIA doesn't want you to find this GitHub repo 👀 It's called Shadowbroker and it aggregates every open-source military signal on Earth into one dashboard.
→ US Navy carrier strike groups tracked live
→ Spy satellites color-coded by mission type
→ GPS jamming zones with severity overlays
→ 25,000+ ships. 2,000+ CCTV feeds globally
Right-click any point on Earth. Get a full intelligence dossier. 100% open source. Link in comments.
84
816
4K
282.3K
🇺🇸🇻🇦🏴‍☠️🥖✒️
@CR1337 Machines require electric power. What are the 50 hand tools that can rebuild everything from scratch? If "from scratch" were called for, the machines would likely be useless unless someone rigged solar generators. There would likely be no centralized or fossil energy sources.
3
0
4
1.7K
HilesFiles retweeted
CR1337 @CR1337 ·
"I calculated that civilization needs just 50 machines to build everything from scratch. And what people can't believe, is that I posted the full plans, designs, instructions and how anyone can build these machines for themselves."
CR1337 tweet media
287
4.1K
28.6K
1.1M
Peter Van Valkenburgh @valkenburgh ·
If anything's going to boil the oceans, it's the delta between giving chat a long PDF to analyze versus plain text. What the hell. How is the PDF file format so bad that even the sand gods can't figure it out?
Peter Van Valkenburgh tweet media
6
2
12
2.2K
HilesFiles @MichaelHiles ·
@berniemoreno Why is our Republican legislature and so-called Republican Governor enabling it?
1
0
27
380
Bernie Moreno @berniemoreno ·
Why is Ohio handing out millions in tax breaks to a private equity giant for a data center project that will only create 10 jobs? It’s ridiculous and Ohio taxpayers know it. Carlyle should do the right thing and give up the subsidy.
Bernie Moreno tweet media
153
596
1.9K
39.4K
HilesFiles retweeted
Warren Davidson 🇺🇸 @WarrenDavidson ·
Dear @SenTedCruz, Jesus Christ is in fact king. King of kings. Lord of lords. The Messiah. Blessings, @WarrenDavidson cc: Jerusalem, Judea, Samaria, and the ends of the earth (Acts 1:8)
25
58
474
11.3K
HilesFiles retweeted
Mark Gadala-Maria @markgadala ·
This is wild. 143 million people thought they were catching Pokémon. They were actually building one of the largest real-world visual datasets in AI history.

Niantic just disclosed that photos and AR scans collected through Pokémon Go have produced a dataset of over 30 billion real-world images. The company is now using that data to power visual navigation AI for delivery robots.

Players didn't just walk around with their phones. They scanned landmarks, storefronts, parks, and sidewalks from every angle, at every time of day, in lighting and weather conditions that staged photography would never capture. They documented the physical world at a scale no mapping company with a fleet of vehicles could have replicated on the same timeline or budget.

Niantic collected this systematically, data point by data point, across eight years, while users thought the only thing at stake was catching a rare Charizard. The most valuable AI training datasets in the world aren't being assembled in data centers. They're being built by people who have no idea they're building them.
NewsForce@Newsforce

POKÉMON GO PLAYERS TRAINED 30 BILLION IMAGE AI MAP Niantic says photos and scans collected through Pokémon Go and its AR apps have produced a massive dataset of more than 30 billion real-world images. The company is now using that data to power visual navigation for delivery robots, letting them identify exact locations on city streets without relying on GPS. Source: NewsForce

2.2K
24.3K
107.2K
14M