

Ed Kroc

@ed_kroc
Statistician, ecologist, gull lover at UBC. Officially, I’m an Associate Professor of Measurement, Evaluation, and Research Methodology.





@martinmbauer Disrespect for elementary mathematics across various levels of education, college included, is a uniquely US problem, among developed countries...


Today in 2017: @Buck: "This to send the Packers into the NFC Championship game. It iiiiiisssssss GOOD! The Packers are moving on! Aaron Rodgers has done it again! And Mason Crosby...a hero for Green Bay!"

And your quarterback ...



This is one of the weirdest things I’ve ever learnt. 🤨😦🤯 We’ve been so conditioned to think in 1, 2, and 3 dimensions that our intuition basically lives there, but once you step into high dimensions, even something as basic as distance starts behaving in ways that feel wrong. In a high-dimensional ball, almost all the volume lives in a thin shell near the boundary: shrink the radius just a tiny bit and you’ve thrown away almost everything. Random points don’t sit in the middle at all; they’re crushed into a microscopic halo at the edge. That’s one of the reasons distance, nearest neighbours, and geometric intuition start acting so weird in high-dimensional machine learning spaces. High-dimensional geometry is quietly telling you that your low-dimensional brain is lying to you. 🤯 #HighDimensionalSpace #MachineLearning
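The shell effect above is easy to check with a back-of-the-envelope calculation: the volume of a d-ball scales like r^d, so the fraction of volume in the outer shell of relative thickness eps is 1 − (1 − eps)^d. A minimal sketch (function name `shell_fraction` is just illustrative):

```python
# Fraction of a d-dimensional ball's volume that lies in the thin outer
# shell of relative thickness eps. Volume scales as r^d, so the inner
# ball of radius (1 - eps) holds (1 - eps)^d of the total volume.
def shell_fraction(d, eps=0.01):
    return 1 - (1 - eps) ** d

for d in (2, 3, 10, 100, 1000):
    print(f"d={d:5d}  outer 1% shell holds {shell_fraction(d):.4f} of the volume")
```

In 3 dimensions the outer 1% shell holds about 3% of the volume, but by d = 1000 it holds essentially all of it, which is why a uniformly random point in a high-dimensional ball almost surely sits near the boundary.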

37 year old Jerry Seinfeld in his prime quarterbacks a masterpiece with some of the funniest people on the planet. David Spade, Adam Sandler, Chris Farley, Chris Rock & Siobhan Fallon deliver #comedyroyalty #snl 1992


Today in 1994: Sterling Sharpe (@Thro284) ties a franchise record with four receiving TDs, but big days by Dallas' Jason Garrett (311 pass yds, 2 TDs) and Emmitt Smith (228 total yds, 2 TDs) result in 42-31 Packer loss in Dallas.

Professors having 50 papers per year is exactly the problem. That’s almost a paper a week. The system is broken because we end up with a few professors clogging all the resources and essentially overseeing paper mills.

Regularization techniques prevent machine learning models from overfitting—memorizing noise instead of learning patterns. Methods like L1 (Lasso) and L2 (Ridge) add a penalty to the loss function, shrinking weights to simplify the model. This helps the model generalize to new, unseen data. In real life, this robustness is critical for autonomous driving systems that must ignore sensor noise and for medical AI that must make reliable diagnoses across different hospitals.
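The L2 (Ridge) case can be sketched in a few lines of NumPy: adding the penalty λ‖w‖² to the squared error gives the closed-form solution w = (XᵀX + λI)⁻¹Xᵀy, and increasing λ visibly shrinks the weights. A minimal sketch with made-up data (the function name `ridge_fit` and the synthetic dataset are illustrative, not from the post):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Ridge regression: minimize ||Xw - y||^2 + lam * ||w||^2.
    Closed form: w = (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
true_w = np.array([2.0, -1.0, 0.5, 0.0, 0.0])
y = X @ true_w + 0.1 * rng.normal(size=50)

# As lam grows, the penalty dominates and the fitted weights shrink
# toward zero -- the "simpler model" the post describes.
for lam in (0.0, 1.0, 100.0):
    w = ridge_fit(X, y, lam)
    print(f"lam={lam:6.1f}  ||w|| = {np.linalg.norm(w):.4f}")
```

L1 (Lasso) behaves similarly but has no closed form; its penalty λ‖w‖₁ tends to drive some weights exactly to zero, which is why it is also used for feature selection.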





