
Tobias Boehm
1.7K posts

Tobias Boehm
@ToBoehm
#Android developer 🤖 since the G1; IT #freelancer, 🚀 founder of @mndxt_app 🔮

I found out my girlfriend cheated on me. Instead of breaking up right away, I made a fake account, sent her the proof anonymously, and told her that if she didn't send me money, I'd tell her boyfriend everything. I shared this whole plan with my best friend for advice, but this mf went behind my back and shared everything with my girlfriend. When confronted, he said, "Why does it matter? I thought she deserved to know."

He wasn't just betraying me. He was behaving like a random variable after you marginalize out all the hidden information.

In probability, to understand what you actually know, you marginalize over hidden variables: you sum over all the possibilities you can't observe to compute the probability of what you can observe. Marginal probability is a statistical measure that represents the probability of a single event, obtained by aggregating over all possible values of the other variables.

Formula:
P(A) = Σᵢ P(A, Bᵢ)

Where:
- P(A) = marginal probability of event A
- P(A, Bᵢ) = joint probability of A and B = Bᵢ
- Σᵢ = summation over all possible values Bᵢ

Let's take an example and solve it step by step.

A dating app wants to find the probability of users sending messages, regardless of whether they get a response. The data shows messages sent vs. responses received.

Short forms:
- M = Message
- R = Response

Joint probability table:
- P(M = Yes, R = Yes) = 0.30
- P(M = Yes, R = No) = 0.25
- P(M = No, R = Yes) = 0.10
- P(M = No, R = No) = 0.35

Step 1: What we want to marginalize
- We want P(M = Yes)

Step 2: Joint probabilities for M = Yes
- P(M = Yes, R = Yes) = 0.30
- P(M = Yes, R = No) = 0.25

Step 3: Apply the marginal probability formula
- P(M = Yes) = P(M = Yes, R = Yes) + P(M = Yes, R = No)
- P(M = Yes) = 0.30 + 0.25 = 0.55

Final answer: The marginal probability of a user sending a message is 0.55, or 55%, regardless of whether they receive a response.

Congratulations, you've just learned marginal probability.

Bonus: Applications in AI/ML
1. Bayesian networks: computing marginal probabilities by summing out irrelevant variables to make predictions and inferences in graphical models.
2. Latent variable models: in topic modeling (LDA) and hidden Markov models, marginalizing over hidden states to find the probability of the observed data.
3. Feature selection: identifying which features independently correlate with target variables by computing marginal distributions, helping reduce dimensionality.
4. Probabilistic classification: Naive Bayes classifiers use marginal probabilities of features to classify data, assuming independence between features.
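The dating-app example above can be sketched in a few lines of Python. This is a minimal illustration, not production code: the joint table and the variable names M and R come straight from the post, while the dictionary layout and the helper name `marginal_m` are my own choices.

```python
# Joint distribution P(M, R) from the dating-app example:
# keys are (M, R) outcomes, values are joint probabilities.
joint = {
    ("Yes", "Yes"): 0.30,  # P(M=Yes, R=Yes)
    ("Yes", "No"):  0.25,  # P(M=Yes, R=No)
    ("No",  "Yes"): 0.10,  # P(M=No,  R=Yes)
    ("No",  "No"):  0.35,  # P(M=No,  R=No)
}

def marginal_m(joint, m_value):
    """P(M = m_value) = sum over all r of P(M = m_value, R = r)."""
    return sum(p for (m, r), p in joint.items() if m == m_value)

p_message = marginal_m(joint, "Yes")
print(round(p_message, 2))  # 0.55
```

Summing out R this way is exactly the Σᵢ in the formula; the same helper gives P(M = No) = 0.45, and the two marginals add up to 1, which is a quick sanity check on any joint table.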

#Venezuela: @MAStrackZi to @tonline: "Europe doesn't have a minute to lose in finally getting a robust security policy underway. Either we Europeans sit at the table of world politics, or we end up on the menu." Full Story: t-online.de/nachrichten/au…
