Yansoki

240 posts

@JustXTech

Coalitional Game Theory, AI

Joined September 2022
40 Following · 22 Followers
Yansoki
Yansoki@JustXTech·
I just watched "I Swear", about John Davidson... a movie that had me crying, laughing, and ultimately feeling admiration for someone who has done so much when life gave him so little... the power to move things, question yourself, and inspire... Great movie
0
0
0
16
Ashutosh Maheshwari
Ashutosh Maheshwari@asmah2107·
It is truly impressive that we have developed H100 GPUs capable of trillions of operations per second, just so we can use them to run a Python wrapper that waits 10 seconds for a JSON response from an LLM. We have achieved the pinnacle of hardware efficiency and software laziness simultaneously.
30
25
629
51.6K
Akash Ranjan
Akash Ranjan@akashbitm787·
technically, the h100 isn't even "working" that hard during that 10 seconds. inference is memory-bound, not compute-bound. the gpu is spending most of its time waiting on HBM to fetch weights, not crunching numbers. we built a ferrari and we're driving it in first gear because of the memory wall.
3
0
28
3.2K
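The memory-wall point above can be sanity-checked with a back-of-the-envelope roofline comparison. The bandwidth and FLOP figures below are rough public H100-class specs used here as assumptions, as is the one-read-per-weight model of a single decode step:

```python
# Rough assumed H100-class specs (not exact vendor numbers):
HBM_BANDWIDTH = 3.35e12   # bytes/s of HBM3 bandwidth
PEAK_FLOPS = 1.0e15       # FLOP/s, dense FP16, no sparsity

def decode_step_bound(n_params, batch=1, bytes_per_param=2):
    """Classify one autoregressive decode step as memory- or compute-bound.

    Model: every weight is streamed from HBM once (~2 bytes in FP16) and
    shared across the batch; each weight costs ~2 FLOPs (multiply + add)
    per sequence in the batch.
    """
    bytes_moved = n_params * bytes_per_param      # weights read once, total
    flops = 2 * n_params * batch                  # work grows with batch size
    t_mem = bytes_moved / HBM_BANDWIDTH           # time to stream the weights
    t_compute = flops / PEAK_FLOPS                # time to do the math
    return "memory-bound" if t_mem > t_compute else "compute-bound"

# A 70B-parameter model: at batch 1 the GPU waits on HBM, exactly as the
# post says; only a large batch amortizes the weight traffic enough to
# make the arithmetic units the bottleneck.
single = decode_step_bound(70e9)
batched = decode_step_bound(70e9, batch=512)
```

Under these assumptions `single` comes out memory-bound and `batched` compute-bound, which is why serving stacks batch aggressively.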
Yansoki
Yansoki@JustXTech·
>50,000 parameters on MNIST and barely 90% accuracy is a sign of a very poor model... Higher performance is achievable with <10k parameters; this is nothing impressive, nor any significant breakthrough
Brian Roemmele@BrianRoemmele

Breakthrough: Game-Theoretic Pruning Slashes Neural Network Size by Up to 90% with Near-Zero Accuracy Loss: Unlocking Edge AI Revolution!

I am testing this now on local AI and it is astonishing! The paper, "Pruning as a Game: Equilibrium-Driven Sparsification of Neural Networks," introduces a novel approach that treats parameter pruning as a strategic competition among weights. This method dynamically identifies and removes redundant connections through game-theoretic equilibrium, achieving massive compression while preserving – and sometimes even improving – model performance.

Published on arXiv just days ago (December 2025), the paper demonstrates staggering results: sparsity levels exceeding 90% in large-scale models with accuracy drops of less than 1% on benchmarks like ImageNet and CIFAR-10. For billion-parameter behemoths, this translates to drastic reductions in memory footprint (up to 10x smaller), inference speed (2-5x faster on standard hardware), and energy consumption – all without the retraining headaches of traditional methods.

Why This Changes Everything

Traditional pruning techniques – like magnitude-based or gradient-based removal – often struggle with "pruning regret," where aggressive compression tanks performance, forcing costly fine-tuning cycles. But this new equilibrium-driven framework flips the script: parameters "compete" in a cooperative or non-cooperative game, where the Nash-like equilibrium reveals truly unimportant weights. The result? Cleaner, more stable sparsification that outperforms state-of-the-art baselines across vision transformers, convolutional nets, and even emerging multimodal architectures.

Key highlights from the experiments:
• 90-95% sparsity on ResNet-50 with top-1 accuracy loss <0.5% (vs. 2-5% in prior SOTA).
• Up to 4x faster inference on mobile GPUs, making billion-parameter models viable for smartphones and IoT devices.
• Superior robustness: Sparse models maintain performance under distribution shifts and adversarial attacks better than dense counterparts.

This isn't just incremental – it's a paradigm shift. Imagine running GPT-scale reasoning on your phone, real-time video analysis on drones, or edge-based healthcare diagnostics without cloud dependency. By reducing the environmental footprint of massive training and inference, it also tackles AI's growing energy crisis head-on.

The implications ripple across industries:
• Mobile & Edge AI: Affordable on-device intelligence explodes.
• Green Computing: Lower power draw for data centers and devices.
• Democratized AI: Smaller models mean broader access for startups and developing regions.

As AI scales toward trillion-parameter frontiers, techniques like this are essential to keep progress practical and inclusive.

Pruning as a Game: Equilibrium-Driven Sparsification of Neural Networks
(PDF: arxiv.org/pdf/2512.22106)

I will continue my testing but thus far results are robust!

0
0
1
113
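For context on the skepticism above: the quoted paper's equilibrium method is not reproduced here, but the magnitude-based baseline such papers compare against fits in a few lines. This is a minimal pure-Python sketch (the function name and toy weights are illustrative, not from the paper), zeroing the smallest-magnitude 90% of a weight vector:

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    k = int(len(weights) * sparsity)  # how many weights to remove
    # indices of the k smallest-|w| entries
    drop = sorted(range(len(weights)), key=lambda i: abs(weights[i]))[:k]
    pruned = list(weights)
    for i in drop:
        pruned[i] = 0.0
    return pruned

# Toy example: 10 weights at 90% sparsity keeps only the single
# largest-magnitude weight (0.9) and zeroes the other nine.
w = [0.9, -0.05, 0.3, 0.01, -0.7, 0.02, 0.5, -0.001, 0.08, -0.4]
pw = magnitude_prune(w, 0.9)
```

The "pruning regret" the quoted post mentions is exactly the failure mode of this baseline: small-magnitude weights can still matter jointly, which is the gap game-theoretic importance scoring aims to close.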
Yansoki
Yansoki@JustXTech·
@aj_agr @copyconstruct Yupp, we have a tool to help you search your logs while they are still compressed
0
0
0
6
Ajaya
Ajaya@aj_agr·
@copyconstruct Wonder if there are people/companies helping optimise DD spend!
1
0
3
1.8K
Cindy Sridharan
Cindy Sridharan@copyconstruct·
Sometimes it’s hard to believe that Datadog is a $55 billion public company. For reference, Cloudflare’s valuation is $67 billion, and Cloudflare is the backbone of the entire internet. Datadog needs to be a special case study, because no other monitoring company even comes close.
41
25
984
80.2K
Yansoki
Yansoki@JustXTech·
@lucastech @LorenCharnley @DanielLockyer We have a tool that helps you store them in compressed form and still search/manipulate them... it also supports streaming merging, so no in-memory decompression... I think you need something like that
0
0
0
17
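The tool Yansoki keeps pitching is never named in the thread, but the streaming idea is easy to illustrate. The sketch below (file name and demo log lines are invented) searches a gzip-compressed log line by line: Python's gzip module decompresses as a stream, so the full file never sits in memory. Note this only approximates the claim; a purpose-built tool would operate on the compressed representation directly rather than stream-decompressing it:

```python
import gzip
import os
import tempfile

def grep_gz(path, needle):
    """Stream-search a gzip-compressed log without decompressing it to disk
    or holding the whole file in memory: gzip.open in text mode yields
    decompressed lines incrementally from the compressed stream."""
    hits = []
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            if needle in line:
                hits.append(line.rstrip("\n"))
    return hits

# Tiny demo: write three log lines compressed, then search them in place.
tmp = os.path.join(tempfile.mkdtemp(), "app.log.gz")
with gzip.open(tmp, "wt", encoding="utf-8") as f:
    f.write("INFO boot\nERROR disk full\nINFO shutdown\n")
matches = grep_gz(tmp, "ERROR")
```

Streaming like this bounds memory at one line's worth of decompressed data, which is what makes it workable on the cheap VPS setups the thread is about.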
Lucas Tech
Lucas Tech@lucastech·
@LorenCharnley @DanielLockyer I'm not storing any of it long term at the moment, just reviewing it when I need to get to the root of one of the servers keeling over from abuse. But it's crossed my mind to store them, because there have been some very interesting trends over time that I wish I had history for
1
0
0
23
Daniel Lockyer
Daniel Lockyer@DanielLockyer·
It kinda sucks how expensive app instrumentation can be. You can run your app on a $5/mo VPS, but then the logging/tracing solutions end up being hundreds per month. So your only options are to instrument less of your app, or to heavily sample. Interesting economics
50
5
286
73.8K
Yansoki
Yansoki@JustXTech·
@davidgu @abdulkarim_me We've developed a general-purpose tool that enables you to manipulate your logs while they are still compressed... could help you save some of that
0
0
0
10
David Gu
David Gu@davidgu·
@abdulkarim_me we use Datadog ($$$) for a subset of things; the really high-volume or high-cardinality stuff goes into Prometheus or directly into S3
4
0
24
24.7K
David Gu
David Gu@davidgu·
we run 18 million EC2 instances per month. At our scale, we see very rare bugs very frequently. Last week, we received *half* an HTTP request. Not an HTTP 206, literally half a request. Content-Length was 2350 bytes. The body was actually 1200 bytes, and was truncated mid-JSON doc.
168
100
3.9K
1.4M
Yansoki reposted
Ifenimi
Ifenimi@Ifenimiii·
Yesterday, I listened to someone speak about themselves in such a deprecating and casually cruel manner, and it scraped against my senses with a grating sound. Right there and then, I made a vow never to speak of myself in such a diminished tone; not even as a joke. Because truly, the greatest dishonour a person can commit against themselves is to enrol their own soul into a concentration camp for the small. There is no prize for shrinking, nor is there a reward for repeatedly declaring yourself unworthy or incapable. What begins as self-effacing humour insidiously mutates into a worldview that trains both you and everyone around you to expect little from your existence. The implacable consequence of persistently calling yourself small is that you prime the world to treat you that way. You are judged by your own tongue long before you are offered the chance to demonstrate your mettle. If that isn’t a self-orchestrated tragedy, I don’t know what is.
23
345
1.8K
57.6K
Yansoki
Yansoki@JustXTech·
@Ifenimiii 😭😭nooo...this is your internal chaos
1
0
0
180
Ifenimi
Ifenimi@Ifenimiii·
My inbuilt disc jockey wakes up before I do and immediately starts wilding. It curates a soundtrack that makes no logical sense yet insists on announcing itself like a chaotic little maestro. Because how else do you explain opening my eyes in the morning only to find that the first song tugging at my spirit is Kuchi Kuchi Hota Hai, swiftly followed—without shame or sequence—by Ça fait mal, and then Michael Jackson’s Smooth Criminal?
8
9
79
5.3K
Yansoki reposted
Satya Nadella
Satya Nadella@satyanadella·
I’ve been thinking a lot about what the net benefit of the AI platform wave is. The real question is how to empower every company out there to get more out of this platform shift and build their own AI-native capabilities and enterprise value (vs inadvertently just transfer their unique value to the tech sector!!).

Bill famously said a platform is when the economic value of everybody that uses it exceeds the value of the company that creates it. That’s the essence of the positive-sum future. Even in our somewhat zero-sum mindset industry, we can create partnerships that create value for all parties involved.

Our partnership with OpenAI is a great example. Our investment helped them scale; their research accelerated our own innovation. That’s what healthy platforms and partners do—they catalyze and compound progress.

There’s no better proof than what we announced just this week. The world’s first AI superfactory was co-designed with OpenAI and informed by three generations of AI supercomputers we built for frontier model training and inference. It was also the result of working closely with Nvidia and getting better at full-stack optimization, from model architecture to the micro-architecture of the chip and everything in between, across three companies! We also did the work to bring AMD into the fleet doing inference of GPT models, which enabled them to get up to speed on their own software stack for AI. And now all this infrastructure will scale to support everyone from startup to enterprise doing their own training and inference.

You can see the same dynamic in coding. Thanks to AI, the category itself has expanded and may ultimately become one of the largest software categories. I don’t recall any analyst ever asking me about how much revenue Visual Studio makes! But now everyone is excited about AI coding tools. This is another aspect of positive sum, when the category itself is redefined and the pie becomes 10x what it was!

With GitHub Copilot we compete for our share, and with GitHub and Agent HQ we also provide a platform for others.

Of course, the real test of this era won’t be when another tech company breaks a valuation record. It will be when the overall economy and society themselves reach new heights. When a pharma company uses AI in silico to bring a new therapy to market in one year instead of twelve. When a manufacturer uses AI to redesign a supply chain overnight. When a teacher personalizes lessons for every student. When a farmer predicts and prevents crop failure. That’s when we’ll know the system is working.

Let us move beyond zero-sum thinking and the winner-take-all hype and focus instead on building broad capabilities that harness the power of this technology to achieve local success in each firm, which then leads to broad economic growth and societal benefits. And every firm needs to make sure they have control of their own destiny and sovereignty, vs just a press release with a Tech/AI company or, worse, leaking all their value through what may seem like a partnership, except it’s extractive in terms of value exchange in the long run.

We know that the Internet wave had tremendous positive-sum impact in the world, and yet we also had some sectors that got hollowed out, like local media. This time around we have the opportunity to ensure broad diffusion of this tech, with choice and control that is distributed, to ensure positive-sum outcomes across the board.

At the end of the day, this new technological wave gives us the opportunity to dream bigger and set higher ambitions for what we can collectively achieve. Each of us will need to play our part!
844
823
6K
1.4M
Yansoki
Yansoki@JustXTech·
@Ifenimiii Loving anyone is also being willing to let them go... Love in itself is not selfish, imo... if MJ has decided to leave, then I'll cherish what we had, and let go of what I hoped we would have... it's not easy, but loving someone selflessly, and loving yourself, makes it easier, for me🥲
0
0
1
252
Ifenimi
Ifenimi@Ifenimiii·
“God abeg,” she muttered under her breath as the curtains swayed alongside her emotions. The evening sun strode into the room through the pocket of space the half-open window had created, spilling a deep sepia warmth over everything it touched. Dust motes floated lazily in the slanted light, rising and falling like tiny, indifferent planets. She stared into that haze without truly seeing, her eyes catching briefly on the half-eaten plate of eba and egusi soup now sitting cold on the dining table. She imperceptibly placed her left hand on her belly and a trembling breath fought its way out of her chest. Driven by muscle memory, she dialled the same number she had called over thirty times in the past hour. The phone began to ring again. “MJ pick your call… please… please… please.” She muttered to herself like a wild dog five breaths away from going fully feral. MJ was supposed to be different. He had felt different. And why wouldn’t he? Those conversations. They had possessed an intoxicating quality and she soon began to get high on them. They spoke about everything and nothing long into the night, until the sound of his voice began to feel like a place she could rest in. Two weeks of MJ in her ears was all it took for her to let him into her body. Her throat tightened. How do you go from grieving the sudden disappearance of a man you laid your heart before like an offering on the communion table, only to learn a few weeks later that you’re carrying his child? Dazed and defeated, she folded into her cuddle sofa facing the living room door. The fabric smelled faintly of her vanilla body spray. She sank deeper into the soft cushion, willing the seat to absorb her confusion. Her brows drew together in a distant, unfocused knot, and she bit her lower lip until the sharp taste of blood snapped her back into her body. It was beginning to feel like the universe had reserved its murkiest waters exclusively for her to wade through. A loud, bitter laugh tore out of her. 
This wasn’t the first time she was bleeding from wounds of this nature. She exhaled all the way down to her toes, and it suddenly struck her that she was angry. Her body vibrated with the red heat of it, and all ten fingers of her fury were aimed squarely at herself. How could she have been so gullible? How had she let her heart open again? And within such a short time? How? Only fools keep trusting the same gun that has killed them in many lifetimes. Restless and fevered, she rose and moved toward the wall until only a few millimetres separated her from it. Up close, she could see the tiny cracks in the paint. She wanted to break something. Anything. But the only thing that felt sufficient was her own skull. “I am a fool,” she muttered forcefully, her head repeatedly knocking against the wall in a grim, deliberate rhythm that matched the words. “I. Am. A. Fool.”
𝕿𝖔𝖕𝕭𝖔𝖞@theokorie

“God abeg,” she muttered under her breath as the curtains swayed alongside her emotions. The evening sun strode into the room through the pocket of space the half-open window had created… You can take it from there. I trust you🙂‍↔️

17
34
185
15.1K
Yansoki
Yansoki@JustXTech·
@Ifenimiii Judge people by the way they treat someone they can't benefit from...never fails
1
3
11
1K
Ifenimi
Ifenimi@Ifenimiii·
When it comes to letting people close, there is no foolproof way to avoid walking straight into your own undoing. No checklist can save you from the mysteries of another soul. But I do know where you must never linger, and it is beside someone who delights in tarnishing everyone else in your presence. People like that are ticking bombs with names etched on the casing. If their tongue is sharpened enough to slice through everyone around them, understand this; your turn will come. You will not be an exception; you are simply the newest offering for their bitter appetites, and fresh fodder for their practiced malice.
7
260
940
47.5K
Yansoki
Yansoki@JustXTech·
@Ifenimiii 😂😂😂😂... would you rather people love your writing and only it, judging you by its quality... or love everything from the mind that gives them precious life lessons from time to time... God abeg
0
0
0
79
Ifenimi
Ifenimi@Ifenimiii·
I have a complicated, almost combustible relationship with writing. I’ve been sitting at my desk for nearly three hours now, wrestling with sentences that refuse to be born, and the only thing I’ve managed to put down is “God abeg.”
27
64
488
14.4K
Yansoki
Yansoki@JustXTech·
they see... imbecile
0
0
0
44
Yansoki
Yansoki@JustXTech·
#PourquoiJeProteste Because 9 years ago, I saw men rape female students in Buea with impunity, dehumanize a population labeled Anglophone, and plunge families into unimaginable suffering... I no longer want to be afraid and settle for things as they are
0
3
4
174