Anonymous

2.4K posts

@CommentBond

Change is the only constant

Joined August 2019
3.2K Following · 210 Followers
Pinned Tweet
Anonymous
Anonymous@CommentBond·
Eventually reality catches up. #Chernobyl
Anonymous tweet media
English
1
7
17
0
Anonymous retweeted
Bhandari ka Vyang
Bhandari ka Vyang@GurugramDeals·
The way to live a good life in India is to have enough money to insulate yourself from the real India.
English
56
426
3.3K
41.5K
Aakash Gupta
Aakash Gupta@aakashgupta·
These videos were the single most expensive flex in labor history. Tech workers had the best negotiating position of any white-collar workforce in 50 years. Remote work, $250K+ comp, four-day work weeks, unlimited PTO. The only thing keeping that deal alive was ambiguity. Nobody outside tech knew exactly what the day looked like.

Then thousands of people filmed it and posted it to the one platform where non-tech people actually hang out. Every "day in my life as a Google PM" video that showed two hours of real work became ammunition for every CFO building a layoff deck. Every CEO trying to justify RTO got a free highlight reel. Every recruiter benchmarking comp against "market rate" suddenly had video evidence that the market was overpaying.

The negotiating leverage depended on information asymmetry. The TikToks destroyed it voluntarily. For free. For likes.
Boring_Business@BoringBiz_

Tech workers realizing that they could have kept high pay, job stability and remote work if they just stopped making cringe “Day in my Life” videos on TikTok

English
66
715
5.9K
550.1K
Anonymous retweeted
Aakash Gupta
Aakash Gupta@aakashgupta·
The timeline on this is genuinely insane.

October 2025: Sam Altman flies to Seoul and signs simultaneous deals with Samsung and SK Hynix for 900,000 DRAM wafers per month. That's 40% of global supply. Neither company knew the other was signing a near-identical commitment at the same time. Those deals were letters of intent. Non-binding. No RAM actually changed hands. But the market treated them as gospel. Contract DRAM prices jumped 171%. A 64GB DDR5 kit went from $190 to $700 in three months.

December 2025: Micron kills Crucial, its 29-year-old consumer memory brand, to reallocate every wafer to AI and enterprise customers. The company explicitly said it was exiting consumer memory to "improve supply and support for our larger, strategic customers in faster-growing segments." Translation: the AI demand signal was so loud that selling RAM to PC builders stopped making financial sense.

March 2026: Google publishes TurboQuant, a compression algorithm that reduces AI memory requirements by 6x with zero accuracy loss. Cloudflare's CEO called it "Google's DeepSeek." The entire thesis that AI would consume infinite memory forever just got a six-month expiration date on it. Same month: OpenAI and Oracle cancel the Abilene Stargate expansion. The $500 billion data center vision that justified the RAM deals couldn't survive its own financing terms. Bloomberg attributed the collapse partly to OpenAI's "often-changing demand forecasting."

MU is now down ~33% from its post-earnings high. Revenue up 196% year over year, EPS up 682%, and the stock is in freefall because the company restructured its entire business around a demand signal that came from non-binding letters and is now being compressed out of existence by a research paper. Micron bet the consumer division on Sam Altman's signature. The signature was worth exactly what the paper said: nothing binding.
Grummz@Grummz

Imagine closing your entire consumer memory division because this guy signed a non-binding letter saying he would buy 40% of the world's RAM. Only to have him rug-pull 3 months later.

English
258
1.8K
14.1K
1.6M
Anonymous retweeted
Sahil Bloom
Sahil Bloom@SahilBloom·
It's Friday and my parents are coming over to hang out. They'll play with my son. We'll have dinner. I'll sauna with my dad. I wish someone had told me sooner: Nothing improves quality of life more than proximity to people you love. It’s worth more than any job will ever pay you.
English
130
189
4.5K
191.1K
Anonymous retweeted
Mo
Mo@atmoio·
AI is making CEOs delusional
Indonesia
1K
2.7K
19.6K
2.9M
Anonymous retweeted
Blue Cairo
Blue Cairo@thebluecairo·
Sunsets and Minarets
Blue Cairo tweet media (4 images)
English
13
707
5.4K
115.1K
Anonymous retweeted
sim
sim@simscircuit·
Did not expect a question that starts out 'Do you think before you speak?' to go so well. A+ question from Charlotte Harpur A++ response from Eileen Gu.
English
1.4K
22K
153.5K
12.6M
Anonymous retweeted
Today In History
Today In History@historigins·
The day Maradona said to Pelé 'My dream is to head the ball with you'
English
169
2.7K
23.1K
455.2K
Anonymous retweeted
champ 💫
champ 💫@champtgram·
the PEAK male experience is landing in a new city, walking the streets with no plan and eating dinner alone somewhere you’ve never been. nothing better. it’s spiritual.
champ 💫 tweet media
English
526
4.3K
34.6K
833.3K
Anonymous retweeted
Allie ✞
Allie ✞@allie__voss·
Social circles are quickly disintegrating because no one wants to do the hosting and organizing anymore. Taking on that role is the secret hack to having tons of friends; all it requires is for you to be more stubborn than other people are flaky.
English
94
191
3.1K
302.6K
Anonymous retweeted
Owen Gregorian
Owen Gregorian@OwenGregorian·
I've said this before and I'll say it again: Software engineering is not going anywhere as a profession. I have been in technology consulting for decades, and I led the development team for an early SaaS startup which had a successful exit.

First: Anyone who has been part of a development team knows that the demand for new applications and features NEVER ends. It generally keeps getting bigger. Once you finish a task, three take its place. As more people use your software, more requests come in to make it better. If it ever reaches full maturity, chances are that's around the time they start looking at replacing that "legacy" code with something more modern. Any productivity gains are immediately absorbed, and may even raise users' expectations of what they can ask for, because now it doesn't take as long or cost as much to change the software.

Second: The most difficult part of software development is gathering and specifying how the system should behave. For any system with lots of users, this means having conversations with PEOPLE. Asking follow-up questions. Asking what-if questions that the users didn't think of. Questioning whether what they are asking for makes sense. Explaining why something can't be done, or should be done differently. Computers (and LLMs) are not good at this. They are great at doing what you tell them to do, but not at noticing the dog that didn't bark. People who develop these skills are much better at it, because a lot of it is intuition about what questions to ask.

Once the new system or new feature is fully specified and agreed on, the actual coding is much easier. It's just translating those requirements into code. But there is another part of that which the best software engineers are still better at than any LLM: designing complex architectures. I have given a long list of requirements to my development team, but only one of them could design the architecture that tied it all together. It was amazing to watch him think. He clearly had a model of the entire system IN HIS HEAD, and could envision each part and all the relationships. His architecture was a work of art. And even so, at a certain point our system hit a performance wall and we had to re-architect the entire system from scratch to make it scale. He knew the system and what was causing the bottlenecks so well that he knew exactly what needed to happen.

These types of skills are the future of software engineering. They are not just a science; they are an art form. They are creative. They require advanced abstract thinking. Agentic AI coding can help, especially with the repetitive, tedious, easy parts of coding. But it cannot replace the requirements analyst or the software architect.

What has changed is that the cost to produce value with software has gone down. That means things that used to be not worth doing are now worth doing. And that means demand for software engineering has gone up.
Eric S. Raymond@esrtweet

If you are a software engineer "experiencing some degree of mental health crisis", now hear this, because I've been coding for 50 years since the days of punched cards and I have a salutary kick in your ass to deliver.

Get over yourself. Every previous "programming is obsolete" panic has been a bust, and this one's going to be too. The fundamental problem of mismatch between the intentions in human minds and the specifications that a computer can interpret hasn't gone away just because now you can do a lot of your programming in natural language to an LLM. Systems are still complicated. This shit is still difficult. The need for people who specialize in bridging that gap isn't going to go away.

As usual, the answer is: upskill yourself and adapt. If a crusty old fart like me can do it, you can too.

English
76
93
766
75.5K
Anonymous retweeted
Reads with Ravi
Reads with Ravi@readswithravi·
Action produces information.
Reads with Ravi tweet media
English
38
833
5.6K
95.2K
Anonymous retweeted
Swati
Swati@aiandchai·
If you are struggling to keep up with the fast-moving AI industry, here is the shortest roadmap that does the bare minimum in 2 weeks and actually works. No hype. No rabbit holes. Just enough to stop feeling lost.

Week 1. Build the mental map

Day 1. What AI actually is today
AI today = Large Language Models + tools around them. Understand what an LLM is and why transformers matter.
jalammar.github.io/illustrated-tr…
youtube.com/watch?v=zjkBMF…

Day 2. How models are trained
Only learn the pipeline. Data → pretraining → fine-tuning → inference. Ignore the math. Focus on cost and scale.
huggingface.co/blog/how-llms-…

Day 3. The big model families
Know who builds what and why people choose them. GPT, Claude, Gemini, LLaMA, Mistral.
huggingface.co/blog/large-lan…

Day 4. Prompting that actually matters
Forget fancy prompts. Learn only this: context, constraints, examples.
promptingguide.ai

Day 5. Tools and agents
Understand function calling and agent loops. Most agents are just prompts plus retries.
platform.openai.com/docs/guides/fu…
latent.space/p/agents

Week 2. Become practically dangerous

Day 6. APIs at a high level
Know what an API call looks like, what tokens cost, and why latency matters.
platform.openai.com/docs/api-refer…
platform.openai.com/tokenizer

Day 7. Retrieval Augmented Generation
LLMs + your data ≠ training. Understand embeddings and vector search.
pinecone.io/learn/retrieva…

Day 8. Local vs hosted models
Learn what people mean by running locally, on-device, or at the edge.
latent.space/p/local-llms

Day 9. What breaks in production
This is where real engineers live. Hallucinations, cost explosions, latency spikes.
shreya-shankar.com/posts/llm-fail…

Day 10. The AI product layer
AI features are not AI products. Most startups die here.
lennysnewsletter.com/p/how-to-build…

Day 11. Job impact
Ignore doomsday takes. Look at workflow changes.
ben-evans.com/benedictevans/…

Day 12. Read one serious blog
Pick one and go deep. latent.space interconnects.ai ben-evans.com

Day 13. Build one tiny thing
A prompt workflow, internal tool, or small automation. Building collapses confusion.
zapier.com/blog/ai-workfl…

Day 14. Synthesize
Write one page: what AI does well, what it fails at, where cost and latency show up, where you personally can use it.

You do not need to chase every model release. You need a stable mental model and light hands-on exposure. Two weeks of this puts you ahead of most people posting about AI. Save this. Bookmark it. Come back to it.
YouTube video
English
31
344
1.8K
159.3K
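The Day 7 idea in the roadmap above (embeddings plus vector search) can be made concrete with a toy sketch. This is my own illustration, not from any tweet or library named in the thread: the "embeddings" here are hand-made bag-of-words vectors so the example is self-contained, whereas real systems use dense vectors from an embedding model and a vector database. The `embed`, `cosine`, and `retrieve` names and the sample documents are all hypothetical.

```python
# Toy retrieval: "embed" each document, then rank by cosine similarity.
# Real RAG swaps Counter-based vectors for learned dense embeddings.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Stand-in embedding: word counts. Real embeddings are dense float vectors."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "LLMs hallucinate when asked about private data",
    "vector search ranks documents by embedding similarity",
    "latency spikes are common in production systems",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

print(retrieve("how does vector search use embedding similarity"))
```

The retrieved passages would then be pasted into the LLM prompt as context, which is why the roadmap stresses that "LLMs + your data ≠ training": nothing about the model changes, only what it is shown.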
Anonymous retweeted
Blake Burge
Blake Burge@blakeaburge·
A rule that will accelerate your career: If you bring a problem, bring context. If you bring context, bring options. If you bring options, bring a recommendation. People trust people who help them think. Anyone can spot an issue, few can actually help move things forward.
English
95
1.1K
6.8K
171.4K
Anonymous retweeted
Aakash Gupta
Aakash Gupta@aakashgupta·
Tao is pointing at something the neuroscience makes very clear.

When you struggle through a problem, your hippocampus encodes that information through a process called retrieval practice. You're forcing the brain to reconstruct the answer from incomplete cues. That reconstruction strengthens the synaptic connections. The more difficult the retrieval, the stronger the encoding.

When you get the answer handed to you, different circuitry activates. You get a recognition signal, not a retrieval signal. Recognition feels like learning. It isn't. The information passes through working memory and dissipates within minutes. This is why students who re-read notes perform worse than students who close the book and try to recall what they read. Same time spent. Opposite outcomes. The "lifting weights" framing is neurologically precise. Resistance creates adaptation. Remove the resistance, remove the adaptation signal.

Population data shows this playing out. IQ scores rose about 3 points per decade through most of the 20th century. That trend reversed in Norway, Denmark, Finland, and France starting in the 1990s. Norwegian conscript data shows a 7-point decline per generation after the mid-1970s birth cohorts. The researchers traced it to environmental factors within families, not genetics. The timing tracks with tools that eliminated cognitive friction. Calculators. GPS. Search engines. Each one removed a category of effortful retrieval. AI compresses all of these into one interface.

The protocol here isn't abstinence. It's sequencing. Struggle first. Attempt retrieval before you have access to the answer. Make errors. Correct them. Then use the tool to verify or extend. Tao is describing responsible use as choosing when to think. That's the intervention point. Delay the offload until the encoding has occurred.
Wes Roth@WesRoth

Terence Tao says AI can lower mental effort so much that the brain may stop “lifting its own weights.” Early studies suggest reduced cognitive load can come with real harms, not just convenience. Math is especially vulnerable because it’s easy to outsource every step to a tool. Responsible use means choosing when to think, not just when to click, says Terence Tao.

English
24
374
2.1K
157.2K
Anonymous retweeted
Path of Men
Path of Men@PathOfMen_·
Major cheat code in life: Be the one who reaches out. Text first. Call first. Plan first. Initiate first. Most people wait to be chosen. Be the chooser. Connection requires initiative. Friendship requires effort. Love requires action. Stop waiting to be picked. Start picking. Initiative is attractive.
English
86
773
8.6K
446.3K
Anonymous retweeted
Naruto
Naruto@NarutoNolimits·
Jeff Bezos: Stress doesn't come from hard work
English
156
3.3K
23.1K
1.5M
Anonymous retweeted
Katyayani Shukla
Katyayani Shukla@aibytekat·
This guy literally dropped the best mindset shift you’ll ever hear
Katyayani Shukla tweet media
English
64
4.4K
22.4K
465.4K