
BlackMaps 🗺️ @maps_black
Donations I've received from each country :p (PayPal)
$240 Guatemala 🇬🇹
$38.52 Mexico 🇲🇽
$34.90 Argentina 🇦🇷
$32 Dominican Republic 🇩🇴
$30 Cuba 🇨🇺
$25 Panama 🇵🇦
$18.91 Brazil 🇧🇷
$16.00 Chile 🇨🇱
$15.00 Venezuela 🇻🇪
$14.66 Colombia 🇨🇴
$11.00 Uruguay 🇺🇾
$10.67 Peru 🇵🇪
$10.65 Honduras 🇭🇳
$5.00 El Salvador 🇸🇻
$2.00 United States 🇺🇸
$1.67 Spain 🇪🇸
$0.19 Ecuador 🇪🇨
$0.01 Paraguay 🇵🇾
Thank you all so much for supporting the account, I love you all 😭
Dastan @Dastan0912
@maps_black I have to outdo Paraguay. Share your account, I'm going to pull even with the Yankees
Artprice.com @artpricedotcom
Titian, Noli me tangere: the masterpiece that tells the story of Easter morning
Juan y ya @JuanRodrig17091
@maps_black Mexico was the one that donated the most, and you still pick on Mexico a lot
Athena @athenago
First relief. Then leverage. Then compounding ROI with a dedicated Executive Assistant.
Rafael @rafaelplata_c
@maps_black I'm going to represent Ecuador better. Share the account, I'll send what Paraguay sent
Doc @botdesmedido23
@maps_black You son of a bitch, how could you take 30 dollars off a Cuban lmaooo
elite_guerra @tecamacswinger
@maps_black Before the Guate-worse crowd starts saying their currency is worth more than other countries', let me tell you: whenever I see them they're wearing knockoff stuff, clone sneakers of the lowest quality, and their houses look like Afghanistan's. Reminds me of the Peruvians up on their hillside
Larissa León @Larii_1989
@maps_black From El Salvador you only received donations from politicians, because Salvadorans don't earn more than a thousand dollars. Not even doctors earn that hahaha
Andre @Ross289278
@maps_black I didn't know Spaniards were so poor
LAW @trafalgarDdlaw
@maps_black 🇵🇾: You're welcome.
noddd @bcabezas343
@maps_black Adding to the total for Ecuayork 💪💪
Alexander Uslar @Uslariono
@maps_black Venezuela, the country with the worst economy on the continent, and it's above Colombia. Impressive.
Emilio. @Emile_acb
@maps_black No way, even this Pobrentina page survives thanks to Mexico
Michael @mabscorp
@maps_black 19 cents, that Ecuadorian, triple son of a bitch!
Logan Matthew Napolitano @Propriocetive
Independent Convergence: When Two Groups Discover the Same Geometry in Neural Networks

We are proud, and genuinely humbled, to announce that Professor Yann LeCun's team at NYU/FAIR and Proprioceptive AI have independently converged on the same fundamental discovery about the geometric structure of transformer hidden states.

In February 2026, Huang, LeCun, and Balestriero published Semantic Tube Prediction (arXiv:2602.22617), showing that hidden-state trajectories trace geodesics on a smooth manifold, decomposing into parallel (signal) and perpendicular (noise) components. Their result: a 16× data-efficiency improvement from enforcing geodesic straightness.

One month earlier, beginning January 27, 2026, we filed four U.S. provisional patents and published our UBM paper (February 3, huggingface.co/loganresearch) disclosing the identical geometric decomposition, what we call the Two-Channel Theorem. Channel 1: a rank-1 residual-stream highway carrying next-token prediction. Channel 2: a 4-dimensional behavioral arrangement carrying self-knowledge. Perpendicular at 85.5°. Same math. Different interpretation.

Where LeCun's team sees the perpendicular component as noise to suppress, we see it as self-knowledge to read. Both are correct. STP improves the model's ability to predict tokens. Our system, CYGNUS, improves the model's ability to know whether its predictions are good. The geometry supports both readings simultaneously.

We commend Professor LeCun and his collaborators. Their work is remarkable. The fact that two independent groups arrived at the same mathematical structure through entirely different methods, one from training efficiency, one from behavioral self-awareness, is the strongest possible evidence that this geometry is real. It is a property of neural computation itself, not an artifact of any single approach.

What we present today:

📄 CYGNUS: Combined Definitive Paper (61 pages): our complete technical report merging 8 months of research, including the STP convergence analysis, cross-model validation on Qwen-0.5B through 32B (a 66× parameter gap), the proprioceptive head relay architecture (3,327× above random), and the 74-claim honest audit of what we got right and wrong.

📄 "Mathematics Is All You Need" (458 pages, Zenodo DOI: 10.5281/zenodo.14707164): the full monograph documenting every discovery, every experiment, every correction.

📄 UBM Paper (February 3, 2026): cross-architecture validation on LLaMA-8B and Qwen-3B with separation ratios up to 1,376×.

📄 100+ USPTO patent filings, including the Koopman operator framework covering dynamical behavioral analysis, prediction, and control.

The core thesis: LLMs already know when they're wrong. This self-knowledge lives in the dark Casimir modes of the hidden-state geometry: perpendicular to next-token prediction, invisible to the output head, systematically erased by LayerNorm. We read it. LeCun suppresses it. Both approaches work because the geometry is real.

The path to safer AI runs through self-awareness. A model that can sense when it's hallucinating is a model that can stop.

All work performed on a single NVIDIA RTX 3090.

Logan Matthew Napolitano, Proprioceptive AI, Inc.

zenodo.org/records/192238… - Cygnus, a Proprioceptive AI Adapter.
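The basic geometric operation both groups describe, splitting each hidden-state step into a component parallel to a local trajectory direction and a perpendicular remainder, is ordinary vector projection. A minimal illustrative sketch, not the authors' code; the function name `decompose_step` and all values are hypothetical:

```python
import numpy as np

def decompose_step(step: np.ndarray, direction: np.ndarray):
    """Split `step` into components parallel and perpendicular to
    `direction` via plain vector projection (illustrative only)."""
    d = direction / np.linalg.norm(direction)  # unit trajectory direction
    parallel = np.dot(step, d) * d             # projection onto d
    perpendicular = step - parallel            # orthogonal remainder
    return parallel, perpendicular

par, perp = decompose_step(np.array([3.0, 4.0]), np.array([1.0, 0.0]))
# par = [3., 0.], perp = [0., 4.]; the two components are orthogonal
```

By construction the two components sum back to the original step and have zero dot product, which is all the "parallel (signal) vs. perpendicular" split amounts to mathematically; the posts differ only in how each group interprets the perpendicular part.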