Divine

1.1K posts

@cdivine304

i work with computers

Bengaluru, India · Joined June 2017
591 Following · 95 Followers

Divine @cdivine304 ·
Some things are impossible for some people to understand and follow, even though it can be as simple as avoiding certain types of food for their own well-being
0 replies · 0 reposts · 0 likes · 3 views

Divine @cdivine304 ·
@pcshipp macbook pro hardware config please?
0 replies · 0 reposts · 0 likes · 64 views

pc @pcshipp ·
Cancelled Claude Code subscription for the month of April, just moved to this setup - MacBook Pro - Gemma 4. No internet, no API costs, no limits
236 replies · 79 reposts · 3.1K likes · 322.4K views

Divine @cdivine304 ·
@DeadlyLaw "Sadly, Gulati did not live to see the end of this case." --> I'm not sure I actually read the end of this case in this tweet.
0 replies · 0 reposts · 1 like · 81 views

Deadly Law @DeadlyLaw ·
In 2000, Sushil Gulati tried to stop a sexual assault. He ended up being framed in a gang rape case. Years later, the Delhi High Court, through Justice Chandrasekharan Sudha, has upheld the conviction of a lawyer and a police officer who fabricated that case. But by then, Sushil Gulati had already died during trial.

Justice Sudha found that this was a case of deliberate conspiracy. A false complaint was created, witnesses were paid, and a narrative was built to implicate Gulati. What followed was worse. He was subjected to custodial violence, forced to appear in court repeatedly, and yet not examined. The Court noted that even the trial court failed to step in and prevent this harassment, allowing adjournments casually and letting the process itself become punishment.

The judgment also makes a larger point about misuse of power. A lawyer, expected to assist the court, and a police officer, duty-bound to prevent crime, instead colluded to frame an innocent man. Justice Sudha observed that this was a fit case for a harsher sentence, but the State's failure to appeal meant the punishment remained unchanged.

Cases like this are uncomfortable because they expose a reality we don't like to discuss. False cases, especially in serious offences like rape, don't just destroy individual lives, they also weaken trust in the system, and are rising in numbers. This does not take away from the reality of genuine cases, but it shows how dangerous it is when the law is used as a tool for vendetta.

Sadly, Gulati did not live to see the end of this case.
26 replies · 513 reposts · 1.3K likes · 81.5K views

Divine @cdivine304 ·
@gharkekalesh he was conscious enough to not disturb the buffaloes.
0 replies · 0 reposts · 0 likes · 2.7K views

Ghar Ke Kalesh @gharkekalesh ·
A single dog attacked and bit multiple youths in under 30 seconds.😨
1K replies · 1.9K reposts · 17.3K likes · 2.3M views

Divine @cdivine304 ·
@WorldlyHQ hard to believe there won't be leaks
0 replies · 0 reposts · 0 likes · 10 views

Worldly @WorldlyHQ ·
Hello 👋
1 reply · 8 reposts · 32 likes · 7.2K views

Dave @GamewithDave ·
For those who used a computer between 1995 and 2001, what's the computer game from that time that sticks with you the most, and why?
12.2K replies · 147 reposts · 3.9K likes · 2.1M views

Divine @cdivine304 ·
@hyderabaddoctor now that the root cause is identified, what was the corrective action taken?
1 reply · 0 reposts · 1 like · 20 views

Dr Sudhir Kumar MD DM @hyderabaddoctor ·
9/
▶️ The real lesson here is that tests don't make diagnoses. Diagnosis needs critical thinking and analysis of symptoms.
▶️ If you scan the wrong area, you will get the wrong answer, even with the best machines.
3 replies · 7 reposts · 71 likes · 14.8K views

Dr Sudhir Kumar MD DM @hyderabaddoctor ·
The Diagnosis That Was Missed, Until History Spoke
1/ A 55-year-old man had back pain for 4 months. He did everything right. He consulted local doctors. Blood tests, nerve conduction studies, and MRI (neck & lower back) were normal. Yet, his pain did not go away.
38 replies · 54 reposts · 441 likes · 132.9K views

Divine @cdivine304 ·
@aibytekat Did she gently tell you not to post the conversation in one single tweet even if you can?
0 replies · 0 reposts · 0 likes · 267 views

Katyayani Shukla @aibytekat ·
I told my therapist: “I feel like I’m running out of time to build the life I want.” She didn’t even ask why. She just looked at me gently and said:
226 replies · 2.9K reposts · 28.3K likes · 6.4M views

Yi Ma @YiMaTweets ·
Anyone who claims to be interested in Intelligence should know optimal control theory, developed in the 1960s, or even cybernetics, as early as the 1940s... My new book tries to set straight the related history and concepts...
Yann LeCun@ylecun

The basic idea of world models is very old. Optimal control folks were using model-based planning in the 1960s (using the "adjoint state" methods, which deep learning people would now call "backprop through time"). But the real question is what you do with this idea and how you reduce it to practice.

13 replies · 63 reposts · 741 likes · 78K views

Divine @cdivine304 ·
@tsotchke python, distributed systems - yes. the remaining topics are new to me. i am interested
0 replies · 0 reposts · 1 like · 57 views

tsotchke @tsotchke ·
if you're a software engineer who's talented at C/C++, Python, distributed systems, GPGPU, and has considerable interests in physics-based and high performance computing looking to do some bleeding edge AI development please contact me
69 replies · 54 reposts · 958 likes · 54.5K views

Divine @cdivine304 ·
is this why a Mac is chosen instead of a laptop with a Linux OS?
0 replies · 0 reposts · 0 likes · 17 views

Divine @cdivine304 ·
i'm looking at a laptop to purchase. The AMD Ryzen™ AI 7 PRO 350 processor looks good to me. It comes with MediaTek Wi-Fi 7 MT7925 2x2 BE & Bluetooth® 5.4. Based on reviews, the MediaTek card seems to have some stability issues with Ubuntu, and the suggestion is to replace it with an Intel one.
1 reply · 0 reposts · 0 likes · 50 views

Divine @cdivine304 ·
@bluemontauk he mentioned that "cheap grade stainless steel" is harmful, which everyone is aware of.
0 replies · 0 reposts · 0 likes · 172 views

bluemontauk @bluemontauk ·
Stainless Steel Bottles 😳😳😳
429 replies · 2.9K reposts · 10.8K likes · 1.7M views

Divine @cdivine304 ·
anything which makes you better can replace you
0 replies · 0 reposts · 0 likes · 33 views

Divine @cdivine304 ·
@poezhao0605 why are you posting the "internal memo" of the company to public?
0 replies · 0 reposts · 1 like · 74 views

Poe Zhao @poezhao0605 ·
Moonshot AI CEO revealed in an internal memo that the company closed $500M Series C at $4.3B valuation, led by IDG with Alibaba and Tencent oversubscribing. Per the memo: $1.4B cash on hand. Kimi’s global paid users up 170% month-over-month. Overseas API revenue jumped 4x after launching K2 Thinking model.
9 replies · 31 reposts · 394 likes · 62.7K views

Divine @cdivine304 ·
@tankots what do you mean by "cracking the algorithm" ?
0 replies · 0 reposts · 0 likes · 24 views

Tanay Kothari @tankots ·
we just hired an intern to doomscroll on twitter 8+ hours a day.

in today's world, distribution is king. but every platform (X, instagram, tiktok, etc) has a different algorithm to go viral. for example:
> x - look for REPLIES (27x higher boost than likes)
> instagram - promotes high quality/edited videos; retention is the driver
> tiktok - rewards less formal/more raw content (why ugc does so well)

that's why we hired a 19 year old to study what makes each piece of content go viral. everything from reading twitter's algorithms to staying updated on the latest memes.

tech twitter is a tiny bubble, so to get millions of impressions, you need to adapt to what grabs attention. cracking the algorithm is the key to distribution.
179 replies · 69 reposts · 1.9K likes · 266.5K views

Divine @cdivine304 ·
@mehaksarmaa Two fell, but one was instantly received into heaven.
0 replies · 0 reposts · 3 likes · 15K views

Divine @cdivine304 ·
@elliotarledge this post would probably be within 140 chars if there were a limit
0 replies · 0 reposts · 0 likes · 97 views

Elliot Arledge @elliotarledge ·
"should I learn CUDA?" is a question everyone and their mother is faced with today (yes, even me). here's my most down-to-earth answer, which considers my experience and what has vs has NOT brought me success. i'll also talk about where the ecosystem is going and how to play strategically around that.

i was just through the pytorch and nanoGPT phase of my journey and got pumped up when karpathy released llm.c. it looked cool (and fast), with such absurd complexity to a complete beginner (me). i just wanted to understand a little bit more and quickly realized i would have to further rewire my brain (again). i decided to document my learnings on what a kernel was after prompting GPT-4 about how the whole repo was structured. i ended up thinking the same thing i thought about the previous course I built about llms (published before my cuda course): this stuff wasn't easy for me and won't be easy for anyone else.

so i kept going, prompting my way through every detail and piece of text in the kernels i saw, watching all the videos explaining kernels i could find. eventually i figured out the way gpu programming is done (init data on the cpu, move it to the gpu, define kernel params like grid/blocks/threads, launch the kernel on the gpu, get the results, move them back to the cpu, print and visualize stuff). i want to quickly remind you this was all curiosity, eventually seeing stuff isn't taught well enough, and wanting to do it myself. this was never about getting a job (even though i expected some offers to emerge as a result of making a free course on it). there are many rabbitholes to go down, and if you have the time to spare as well as curiosity and fire in you, i fully encourage you to go all in.

if you made it this far, ask yourself the following:
- am I currently in uni or college? how much do I care about grades?
- am I comfortable with one of pytorch or JAX?
- am I just in this for the money?
- am I looking to get a job somehow as quick as possible in the field?
- do I care simply about having an impact on the world, potentially at a frontier lab?
- am I (be honest with yourself) just utterly lost and in need of something to learn?
- am I just seeing CUDA is a cool buzzword people are posting about and I want a part of it?
- am I simply curious and CANNOT help myself since this shit is so cool? (the answer is easy for you, but with some nuance)

these are designed to give you some clarity if you are able to truly reflect on each of them deeply.

getting back to it: in dec 2025 (or 2026 if you're reading this later), the ecosystem is evolving so rapidly that it feels like you can't keep up, even when learning at full speed. i'll tell you that some concepts are important, but not ALL of them. understanding how a server/PC is built is an important skill that i think is very fun (but potentially expensive) to learn. if you stick to the software realm only, knowing basic terms like ram, vram, and cpu vs gpu is essential. going a level deeper, knowing what the computations look like for a neural net (CNN or transformer) is going to serve you very well and is one of the most magical parts of your learning journey.

when you get to how those computations are optimized on specific hardware like a hopper or blackwell gpu, it can get a bit scary. there's a lot of material to cover, and you may not know if it will remain relevant. the most concrete example i can give: if and when you decide to pick up cuda or gpu programming, you'll likely write a kernel in a .cu file with __global__ at the start. this is not how modern kernel writing is done anymore (for the most part). all the deep learning kernels are very optimized today, and techniques like RL-training LLMs to speed them up even further are an area of research that's doing well.

we also have abstractions like triton, but you'll still need to know cuda moderately well to get the best use of it, since it's tiled gpu programming that simplifies the workflow for someone who may have come from cuda. nvidia has cutlass, cute, cuda-tile, cute-dsl and many other open source repos coming out which simplify the kernel writing process further (cute-dsl being the source code of flash attention 4 and the fastest MoE implementation, sonic MoE).

to answer the question "should I learn CUDA?": many of our abstractions today rely on first principles which originally emerged from cuda. there's simply no shortcutting it. if you are committing to kernels, you go all the way. it's fine to dabble and explore around the corners a bit to find out how deep you actually want to get, but making the fastest deep learning kernels faster is ambitious and unrealistic given that this process will likely be fully automated in a year or so. knowing how to use tools to generate the fastest kernels is a great skill, but optimizing them yourself may not be the best use of your time unless this is your true destiny in some way (idk who decides that lol).

i know you didn't ask, but i should mention here that i'm writing a cuda textbook for deep learning specifically. i chose to write it to give people a bigger piece that combines the essentials of the low-level stack. i don't want to spoil it, but it doesn't go into triton or any fancy stuff at all. it's all the essentials that aren't going anywhere for a while, and that are arguably needed even if you aren't working specifically on kernels all the time. there is still a point in learning some of these skills, but just enough so you can make the existing tools work for you. experts built these tools to solve their own pain points, and they knew they would help other engineers who stumbled into them.

when you have a minute, spin up a new llm conversation and get it to help you reach a personalized conclusion on whether you should learn CUDA or not. sources for the stuff i talked about are in the replies.
22 replies · 24 reposts · 432 likes · 48K views
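The gpu workflow described in the thread above (init data on the cpu, move it to the gpu, define grid/blocks/threads, launch the kernel, move results back) can be sketched with the classic vector-add example. This is a generic illustration, not code from the course or textbook mentioned in the tweet; the names, array size, and block size are arbitrary choices. It needs nvcc and a CUDA-capable gpu to build and run.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// one thread per element: global index = block offset + thread offset
__global__ void vec_add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];   // guard: the last block may be partial
}

int main() {
    const int n = 1 << 20;           // 1M floats (arbitrary size)
    size_t bytes = n * sizeof(float);

    // 1. init data on the cpu
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // 2. move it to the gpu
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // 3. define kernel params (grid/blocks/threads) and launch
    int threads = 256;
    int blocks = (n + threads - 1) / threads;   // round up to cover every element
    vec_add<<<blocks, threads>>>(da, db, dc, n);

    // 4. get results, move them back to the cpu, print
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);               // 1.0 + 2.0 = 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

The `(n + threads - 1) / threads` rounding and the `if (i < n)` guard go together: the grid is sized up to cover all elements, so the trailing threads of the last block must check bounds before writing.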