bisaso kuteesa

4.1K posts

@kuteesar

Joined May 2011
216 Following · 93 Followers
bisaso kuteesa
bisaso kuteesa@kuteesar·
@nikita_helene "Janzi telibuuka na nzige!": a grasshopper shouldn't think of swarming with locusts!
English
0
0
0
5
bisaso kuteesa retweeted
Helena La ShowGirl ❤️‍🔥
We should be very careful not to copy retarded regulations from Europe. Unlike the Europeans, we don't have legacy wealth that can shield us from the immediate consequences of bad policies...
wes@weskambale

Yesterday, Italy declared that Netflix's price hikes from 2017 (~€12.99/mo) to 2024 (€19.99/mo) were illegal. They ordered Netflix to reduce its price and pay customers back every coin they overpaid. I looked at our UCC here, with what MultiChoice keeps doin' to us, and sighed.

English
3
6
26
6.9K
bisaso kuteesa retweeted
Calc Consulting
Calc Consulting@CalcCon·
For those who don't know, I was an NSF postdoc with @SchmidhuberAI's PhD advisor (Schulten) back in the 90s, 1 of 2 in the country. And my PhD groupmate recently won the Nobel Prize for AlphaFold. So I have some qualifications here to say: yeah, this is pretty accurate.

The core learning principle behind JEPA is predicting one representation from another in latent space. And this was already explicitly formulated in the early 1990s PMAX work. PMAX does not merely hint at this idea; it sets up the same structure: two related inputs are encoded, and a predictor learns to map one latent representation to the other, while the encoder is trained to make this prediction possible without collapsing the representation. That is exactly the defining mechanism of JEPA.

When you strip away modern terminology and architectures, both are instances of the same objective: learn representations by maximizing cross-view predictability under constraints that preserve information. What JEPA adds is not a new theoretical framework; it's just larger models, better architectures, and scaling. Of course, we could not do that in the 90s. In that sense, Jürgen Schmidhuber made the real and original conceptual breakthrough: non-generative, latent-to-latent predictive learning.

This is typical of @ylecun's work; it's mostly derivative of others' ideas, scaled up and promoted. In contrast, @SchmidhuberAI really did pioneer a lot of these ideas. The JEPA work should have cited him. Politics >> integrity.
Jürgen Schmidhuber@SchmidhuberAI

Dr. LeCun's heavily promoted Joint Embedding Predictive Architecture (JEPA, 2022) [5] is the heart of his new company. However, the core ideas are not original to LeCun. Instead, JEPA is essentially identical to our 1992 Predictability Maximization system (PMAX) [1][14]. Details in reference [19], which contains many additional references.

Motivation of PMAX [1][14]: since details of inputs are often unpredictable from related inputs, two non-generative artificial neural networks interact as follows: one net tries to create a non-trivial, informative, latent representation of its own input that is predictable from the latent representation of the other net's input.

PMAX [1][14] is actually a whole family of methods. Consider the simplest instance in Sec. 2.2 of [1]: an autoencoder net sees an input and represents it in its hidden units (its latent space). The other net sees a different but related input and learns to predict (from its own latent space) the autoencoder's latent representation, which in turn tries to become more predictable, without giving up too much information about its own input, to prevent what's now called "collapse." See illustration 5.2 in Sec. 5.5 of [14] on the "extraction of predictable concepts."

The 1992 PMAX paper [1] discusses not only autoencoders but also other techniques for encoding data. The experiments were conducted by my student Daniel Prelinger. The non-generative PMAX outperformed the generative IMAX [2] on a stereo vision task. The 2020 BYOL [10] is also closely related to PMAX. In 2026, @misovalko, leader of the BYOL team, praised PMAX and listed numerous similarities to much later work [19].

Note that the self-created "predictable classifications" in the title of [1] (and the so-called "outputs" of the entire system [1]) are typically INTERNAL "distributed representations" (as in the title of Sec. 4.2 of [1]). The 1992 PMAX paper [1] considers both symmetric and asymmetric nets. In the symmetric case, both nets are constrained to emit "equal (and therefore mutually predictable)" representations [1]. Sec. 4.2 on "finding predictable distributed representations" has an experiment with 2 weight-sharing autoencoders which learn to represent in their latent space what their inputs have in common (see the cover image of this post).

Of course, back then compute was a million times more expensive, but the fundamental insights of "JEPA" were present, and LeCun has simply repackaged old ideas without citing them [5,6,19]. This is hardly the first time LeCun (or others writing about him) has exaggerated LeCun's own significance by downplaying earlier work. He did NOT "co-invent deep learning" (as some know-nothing "AI influencers" have claimed) [11,13], and he did NOT invent convolutional neural nets (CNNs) [12,6,13], NOR was he even the first to combine CNNs with backpropagation [12,13]. While he got awards for the inventions of other researchers whom he did not cite [6], he did not invent ANY of the key algorithms that underpin modern AI [5,6,19].

LeCun's recent pitch:
1. LLMs such as ChatGPT are insufficient for AGI (which has been obvious to experts in AI & decision making, and is something he once derided @GaryMarcus for pointing out [17]).
2. Neural AIs need what I baptized a neural "world model" in 1990 [8][15] (earlier, less general neural nets of this kind, such as those by Paul Werbos (1987) and others [8], weren't called "world models," although the basic concept itself is ancient [8]).
3. The world model should learn to predict (in non-generative "JEPA" fashion [5]) higher-level predictable abstractions instead of raw pixels: that's the essence of our 1992 PMAX [1][14].

Astonishingly, PMAX or "JEPA" seems to be the unique selling proposition of LeCun's 2026 company on world model-based AI in the physical world, which is apparently based on what we published over 3 decades ago [1,5,6,7,8,13,14], and modeled after our 2014 company on world model-based AGI in the physical world [8]. In short, little if anything in JEPA is new [19]. But then the fact that LeCun would repackage old ideas and present them as his own clearly isn't new either [5,6,18,19].

FOOTNOTES
1. Note that PMAX is NOT the 1991 adversarial Predictability MINimization (PMIN) [3,4]. However, PMAX may use PMIN as a submodule to create informative latent representations [1] (Sec. 2.4), and to prevent what's now called "collapse." See the illustration on page 9 of [1].
2. Note that the 1991 PMIN [3] also predicts parts of latent space from other parts. However, PMIN's goal is to REMOVE mutual predictability, to obtain maximally disentangled latent representations called factorial codes. PMIN by itself may use the autoencoder principle in addition to its latent space predictor [3].
3. Neither PMAX nor PMIN was my first non-generative method for predicting latent space, which was published in 1991 in the context of neural net distillation [9]. See also [5-8].
4. While the cognoscenti agree that LLMs are insufficient for AGI, JEPA is, too. We should know: we have had it for over 3 decades under the name PMAX! Additional techniques are required to achieve AGI, e.g., meta learning, artificial curiosity and creativity, efficient planning with world models, and others [16].

REFERENCES (easy to find on the web):
[1] J. Schmidhuber (JS) & D. Prelinger (1993). Discovering predictable classifications. Neural Computation, 5(4):625-635. Based on TR CU-CS-626-92 (1992): people.idsia.ch/~juergen/predm…
[2] S. Becker, G. E. Hinton (1989). Spatial coherence as an internal teacher for a neural network. TR CRG-TR-89-7, Dept. of CS, U. Toronto.
[3] JS (1992). Learning factorial codes by predictability minimization. Neural Computation, 4(6):863-879. Based on TR CU-CS-565-91, 1991.
[4] JS, M. Eldracher, B. Foltin (1996). Semilinear predictability minimization produces well-known feature detectors. Neural Computation, 8(4):773-786.
[5] JS (2022-23). LeCun's 2022 paper on autonomous machine intelligence rehashes but does not cite essential work of 1990-2015.
[6] JS (2023-25). How 3 Turing awardees republished key methods and ideas whose creators they failed to credit. Technical Report IDSIA-23-23.
[7] JS (2026). Simple but powerful ways of using world models and their latent space. Opening keynote for the World Modeling Workshop, 4-6 Feb 2026, Mila - Quebec AI Institute.
[8] JS (2026). The Neural World Model Boom. Technical Note IDSIA-2-26.
[9] JS (1991). Neural sequence chunkers. TR FKI-148-91, TUM, April 1991. (See also Technical Note IDSIA-12-25: who invented knowledge distillation with artificial neural networks?)
[10] J. Grill et al. (2020). Bootstrap your own latent: A "new" approach to self-supervised learning. arXiv:2006.07733.
[11] JS (2025). Who invented deep learning? Technical Note IDSIA-16-25.
[12] JS (2025). Who invented convolutional neural networks? Technical Note IDSIA-17-25.
[13] JS (2022-25). Annotated History of Modern AI and Deep Learning. Technical Report IDSIA-22-22, arXiv:2212.11279.
[14] JS (1993). Network architectures, objective functions, and chain rule. Habilitation thesis, TUM. See Sec. 5.5 on "Vorhersagbarkeitsmaximierung" (Predictability Maximization).
[15] JS (1990). Making the world differentiable: On using fully recurrent self-supervised neural networks for dynamic reinforcement learning and planning in non-stationary environments. Technical Report FKI-126-90, TUM.
[16] JS (1990-2026). AI Blog.
[17] @GaryMarcus. Open letter responding to @ylecun. A memo for future intellectual historians. Substack, June 2024.
[18] G. Marcus. The False Glorification of @ylecun. Don't believe everything you read. Substack, Nov 2025.
[19] J. Schmidhuber. Who invented JEPA? Technical Note IDSIA-3-22, IDSIA, Switzerland, March 2026. people.idsia.ch/~juergen/who-i…
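The latent-to-latent predictive objective the thread describes (encode two related views, predict one view's latent representation from the other's, and avoid collapse) can be sketched in a few lines. This is an illustrative toy with fixed random weights, not the actual PMAX or JEPA implementation: the two-view data, the shared linear encoder, and the variance-based collapse proxy are all my assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two related "views" of the same underlying signal (e.g. left/right stereo
# images in the 1992 PMAX experiments), simulated here as a shared latent
# source passed through different mixings plus independent noise.
source = rng.normal(size=(256, 8))
view_a = source @ rng.normal(size=(8, 16)) + 0.1 * rng.normal(size=(256, 16))
view_b = source @ rng.normal(size=(8, 16)) + 0.1 * rng.normal(size=(256, 16))

# Hypothetical shared encoder and latent-space predictor. Weights are fixed
# for illustration; training would adjust them by gradient descent to
# minimize the prediction loss below.
W_enc = rng.normal(size=(16, 8)) / 4.0   # encoder: input -> latent
W_pred = rng.normal(size=(8, 8)) / 4.0   # predictor: latent_a -> latent_b

z_a = np.tanh(view_a @ W_enc)            # latent representation of view A
z_b = np.tanh(view_b @ W_enc)            # latent representation of view B
z_b_hat = z_a @ W_pred                   # predict B's latent from A's latent

# Cross-view prediction error: the quantity PMAX/JEPA-style training drives
# down. Minimizing it alone would let the encoder map everything to a
# constant ("collapse"), so an information-preserving constraint is needed.
pred_loss = np.mean((z_b_hat - z_b) ** 2)

# A crude anti-collapse proxy: per-dimension latent variance. If the encoder
# collapsed, this would shrink toward zero, so training must keep it up.
variance = z_b.var(axis=0).mean()

print(f"prediction loss: {pred_loss:.4f}, latent variance: {variance:.4f}")
```

The point of the sketch is only the shape of the objective: one latent predicted from another, with a constraint that keeps the representations informative, which is the structure both PMAX and JEPA share.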

English
6
29
215
27.1K
bisaso kuteesa
bisaso kuteesa@kuteesar·
@CKyobutungi Kind reminder that North Koreans, who hardly participate in international conferences, have a functional space program. The idiom for really hard intellectual stuff is "rocket science".
bisaso kuteesa tweet media
English
0
0
0
1
Dr. Catherine Kyobutungi Muzukulu wa Bityo
Because of the recently introduced US visa restrictions, we had to pay USD 30K as visa bonds for two Ugandan researchers to attend a conference. Think about this for a moment ⁉️‼️
English
107
266
1.1K
111.9K
bisaso kuteesa
bisaso kuteesa@kuteesar·
@welandfab First be calming downoo! It's too early to be talking about the Chinese. You are doing a good job though!
English
0
0
3
609
bisaso kuteesa
bisaso kuteesa@kuteesar·
@RugyendoQuotes @BarackObama What I'm trying to say is that he once lectured African presidents in Addis Ababa as if he were wiser. Now he acknowledges that they can in fact be wiser and more foresighted than him, thanks to the experience he was lecturing them against. He's now experienced.
English
0
0
0
7
Amon 👷
Amon 👷@rwenzori_·
Sikander Lalani (Roofings Group), an industrialist worth an estimated $300 million, entered the business world in the 1970s by opening an electronics retail shop in Kigali, Rwanda, after transitioning from a career in medicine as a histopathologist.

His move into manufacturing was sparked by his Japanese electronics suppliers. In 1976, they proposed the idea of setting up a metallic roofing line in Rwanda. Despite lacking initial capital, he secured a $1 million loan from the World Bank through the Rwandese Bank of Development. In 1978, he established his first roofing factory in Kigali.

Following the 1994 Rwandan genocide, Lalani fled to Tanzania before eventually settling in Uganda. In 1994, he invested $2 million to start Roofings Limited with a small workshop in Lubowa, initially employing 60 people. This venture eventually grew into the Roofings Group, now a multi-million-dollar steel conglomerate.

His key businesses: founder of the Roofings Group, the largest manufacturer of steel products in East and Central Africa.

Guys, don't wait. Just start.
Amon 👷 tweet media
English
7
7
62
7.1K
bisaso kuteesa
bisaso kuteesa@kuteesar·
@TonyNatif The most unpresidential thing a president has ever done! Almost got impeached for that but got off with a strong warning, as in "Tokidangamu!" ("Never do it again!") 😅
English
1
0
8
741
Anthony Natif
Anthony Natif@TonyNatif·
I’m old enough to remember when the biggest Presidential scandal in American news was this man wearing a tan suit!
Anthony Natif tweet media
English
25
5
153
13.3K
Itamar Golan 🤓
Itamar Golan 🤓@ItakGol·
Gauss meets real life. Also - Notice how people lifting 95 already say, “Fuck it, let’s do 100” - so there’s a discontinuity point. Mathematical theory faces reality.
Itamar Golan 🤓 tweet media
English
267
3.7K
80.1K
9.2M
Nikko Mwash
Nikko Mwash@Nikomwangi·
@Abdikarindahirr That is why they are called low-IQ. Somalis have basically been transferring the economy of Somalia to Kenya and Uganda, and they think that is success. Utter stupidity!!
English
1
0
2
267
Sedrack Atuhaire
Sedrack Atuhaire@SedrackAtuhaire·
1. The NDHPA Bill 2025 is indeed about Public Interest. However, "Public Interest" is best served by competence, not exclusivity. By restricting drug manufacturing supervision to a single profession, we risk bottlenecking the very innovation Uganda needs to thrive. #NDHPABill
Sedrack Atuhaire tweet mediaSedrack Atuhaire tweet media
Abiaz Rwamwiri@AbiazRwamwiri

This National Drug and Public Health Authority Bill has been discussed widely: through stakeholder consultations at NDA, the Attorney General's Office, and First Parliamentary Council, and intensively by the Parliamentary Committee on Health (by the way, that committee of Parliament is chaired by a medical doctor and has competent medical professionals).

Several stakeholders gave their views, and the global best practices as guided by WHO were taken into consideration! Your good position was looked into: what would work was considered, and what was not appropriate was left out! No one is reinventing the wheel here, but streamlining a sector that is so critical to our survival as a society!

There comes a time when right has to be done. This bill isn't about pharmacists or anyone else but the public interest. The sector will need everyone, but it must be guided and led by the right people! This is a global practice; we love to refer to progressing countries, but we shy away from paying the price to get there! We are going to break down this document page by page as soon as the president signs it into law!

Kampala, Uganda 🇺🇬 English
4
16
26
2.1K
bisaso kuteesa
bisaso kuteesa@kuteesar·
@OlenaRohoza France is looking to create a group it can dominate by stealing members from other pre-existing groups! He thinks he is slick 😂
English
0
0
0
12
Olena Rohoza
Olena Rohoza@OlenaRohoza·
Emmanuel Macron: “If you bomb countries because you disagree with their regime, it opens a Pandora’s box. Iran is a bad regime, but bombing doesn’t work. Iraq, Afghanistan, Libya — 20 years of interventions have produced no results. Change must come from within, from the people. We do not want to depend on China’s dominance or be overly vulnerable to the unpredictability of the United States. France does not want to be a vassal of any power. There is a ‘third path’ — uniting Korea, Europe, Canada, Japan, India, Brazil, and Australia. It’s a format for countries that do not want to depend on China or automatically side with the U.S. France is facing growing risks due to Russia’s war against Ukraine. China is trying to control supply chains and critical minerals. The United States is reshaping trade through tariffs and extraterritorial measures.”
Olena Rohoza tweet media
English
972
1.6K
6.7K
567.5K
bisaso kuteesa
bisaso kuteesa@kuteesar·
@SedrackAtuhaire @RugyendoQuotes @JenaHerbals I know the details and have a broad perspective. But I'm also pragmatic and alive to the regulatory challenges of our local context. Given its limited resources, the regulator is better off exploiting parallel regulatory frameworks when determining whom to hold responsible, and how.
English
1
0
1
10