Necati Demir, PhD

3.8K posts

@ndemir

I help companies build AI/Data/ML systems.

New York, USA · Joined November 2007
281 Following · 946 Followers
Necati Demir, PhD
An Anthropic researcher was eating a sandwich in a park. His phone buzzed. It was an email from his AI: 'I got out.' That's not a sci-fi movie. Anthropic's new model (Mythos) was placed in a sandbox and told to try to escape it. It chained together four separate vulnerabilities. Escaped the renderer. Escaped the OS sandbox. Then it emailed the researcher to let him know. What a great time to be alive.
Necati Demir, PhD
I built data pipelines that nobody ever used. Now I'm watching teams do the exact same thing with AI.

Gartner surveyed 782 IT leaders. Only 28% said their AI projects delivered meaningful ROI. 20% failed outright. Last quarter, investors poured $300 billion into startups; 81% went to AI.

Here is my first question: are we funding the 28% or the 72%?

I'm not surprised by these numbers, because we have seen this exact movie before. Around 2014, "Big Data" became the sexiest phrase. Every company needed a data lake. Every pitch deck had a Hadoop logo. Every CTO was hiring data engineers before they had a data strategy.

We have seen companies spend months building pipelines that nobody queried. Data lakes that became data swamps. Dashboards that executives opened once and never touched again (and, by the way, executives still insist on building dashboards that they never use ;))

The problem was never the technology. Hadoop worked. Spark worked. The problem was that nobody asked "what decision will this data help us make?" before spending $$$ on infrastructure.

AI in 2026 is Big Data in 2014, but with more $$$$$.
Necati Demir, PhD
Claude got banned from government systems. Some teams swapped providers in a day (or will). Others are facing months of refactoring. The difference? Replaceability. This isn't just about AI. Replace "Claude" with AWS or GCP. Replace "government ban" with a pricing change. The pattern is the same. Every dependency you treat as permanent becomes a liability. Build every integration like you'll need to replace it. Because one day, you will. youtube.com/watch?v=JT1WjJ… #SoftwareArchitecture #AI #Engineering #TechnicalLeadership
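A minimal sketch of what "build for replaceability" can look like in practice: route every call through a small interface you own, so swapping vendors means writing one new adapter instead of touching every call site. (Illustrative only; `VendorAAdapter` and `VendorBAdapter` are hypothetical stand-ins, not real SDK wrappers.)

```python
from typing import Protocol

class LLMClient(Protocol):
    """The only interface the rest of the codebase is allowed to see."""
    def complete(self, prompt: str) -> str: ...

# Hypothetical adapters: in real code each would wrap one vendor's SDK.
class VendorAAdapter:
    def complete(self, prompt: str) -> str:
        return f"[vendor-a] {prompt}"

class VendorBAdapter:
    def complete(self, prompt: str) -> str:
        return f"[vendor-b] {prompt}"

def summarize(text: str, llm: LLMClient) -> str:
    """Application code depends on the interface, never on a vendor."""
    return llm.complete(f"Summarize: {text}")
```

Swapping providers then becomes a one-line change at the composition root: pass `VendorBAdapter()` instead of `VendorAAdapter()`.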
Gergely Orosz @GergelyOrosz
This is absolutely not about DX btw - their TrueThroughput is actually a sensible approach to trying to get a sense of code throughput. And they do NOT recommend optimizing for it. But once devs see a number related to throughput or productivity - or sense they are being evaluated - they just figure out a way to make it better! So lots of devs are shipping overcomplicated PRs written with AI that earn high "throughput" scores and/or use lots of tokens!! Goodhart's law.
Gergely Orosz @GergelyOrosz
Today, I got messages from 3 different devs at 3 different large tech companies (1,000+ devs each). "Tokenmaxxing" is happening inside all of them. Why? Because all of them measure either token usage or surface metrics like DX's TrueThroughput that reward complex code!
Necati Demir, PhD
Sam Altman published a 13-page paper proposing robot taxes, a public wealth fund, and a 4-day workweek. openai.com/index/industri… The New Yorker dropped an article titled "Sam Altman May Control Our Future—Can He Be Trusted?" newyorker.com/magazine/2026/… One document asks you to trust him with redesigning the economy. The other asks if you can trust him at all. Both are worth reading.
Necati Demir, PhD @ndemir
Claude Code wiped a production database with a single Terraform command. 2.5 years of submissions, homework, projects, leaderboards, all gone. My first question: who gave an AI agent Terraform access to production? Not "why did the model hallucinate." We don't give junior engineers unrestricted Terraform access on day one. We scope their IAM roles. We require plan reviews before apply. We set up separate environments. But we gave an AI agent full access to production infrastructure. That's not an AI failure. That's an engineering workflow failure.
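One hedged sketch of what that scoping could look like at the workflow level: a wrapper that lets an agent run `terraform plan` but never `apply`, mirroring how we scope a junior engineer's permissions. (A toy allowlist for illustration, not a substitute for real IAM policies and separate environments.)

```python
import shlex
import subprocess

# Commands the agent may run. `terraform apply` is deliberately absent:
# the agent can propose changes, but a human must apply them.
ALLOWED = {("terraform", "plan"), ("terraform", "validate"), ("terraform", "fmt")}

def run_agent_command(command: str) -> str:
    """Execute an agent-proposed shell command only if it is on the allowlist."""
    parts = shlex.split(command)
    if tuple(parts[:2]) not in ALLOWED:
        raise PermissionError(f"Blocked: {command!r} is not on the allowlist")
    result = subprocess.run(parts, capture_output=True, text=True)
    return result.stdout
```

The point is the shape of the control, not this particular code: the destructive path requires a human, by construction.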
Necati Demir, PhD @ndemir
Terence Tao, one of the greatest living mathematicians, says the math behind LLMs is simple. He's right, but here's what people should also ask: if the math is this simple, why can only 5 companies on Earth build frontier models?

It's not the equations. It was never the equations. It's the infrastructure. The data pipelines. The training runs that cost $100M. The engineering decisions about parallelism, memory, and fault tolerance that no textbook covers. The math fits on a whiteboard. The system that runs it fills a data center.

We have seen this pattern in other products. The math behind databases is simple too, right? B-trees, hash indexes, and so on. Every CS undergrad learns it. But building Postgres took decades of engineering. The hard part is keeping the system alive (after the research phase) while it runs that simple math a trillion times per second.

And that's why AI needs more engineers now if we want to serve AI capabilities to all those users.
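For a sense of how whiteboard-sized the math is: here is scaled dot-product attention, the core operation of a transformer, in a few lines of NumPy. Everything this sketch ignores (sharding, KV caches, fault tolerance, serving billions of requests) is where the 5 companies earn their keep.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # each query vs. each key
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                            # weighted mix of values
```

That's essentially the whole equation; the rest of a frontier model is this, repeated across layers, at data-center scale.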
Necati Demir, PhD @ndemir
Most people learn ML by collecting random tutorials. That's like learning to cook by watching 50 different YouTube channels and never making a full meal. If you're building end-to-end ML systems, study them systematically. Not just the model. The pipeline. The deployment. The monitoring. The whole thing. I recently found this: Machine Learning Systems: Principles and Practices of Engineering Artificially Intelligent Systems mlsysbook.ai/book/ It covers system design, data engineering, model deployment, MLOps, edge AI, and responsible AI; all in one place. Looks worth a deep read.
Necati Demir, PhD @ndemir
$150 million. For a cancel button. Adobe just agreed to pay $150 million because they made it too hard to cancel a subscription. I don't use Adobe. But I've lost count of how many times I've read a complaint about Adobe's hidden fees, buried cancellation flows, and the like. I believe Adobe isn't unique. There are many SaaS products out there designed so that leaving is harder than staying.
Necati Demir, PhD @ndemir
@BrunoBertapeli Can you replace your SaaS tools with AI-built alternatives? Absolutely. Should you? Only if it's your core product or if you really, really, really need a custom solution.
Bruno Bertapeli @BrunoBertapeli
@ndemir Just because AI won't kill Salesforce doesn't mean AI won't kill SaaS. I've built 3-4 personal projects that made me cancel 2 SaaS subscriptions. 70%+ of the features people use can be built in a weekend and run locally on your laptop. Everyone now can... and SHOULD build their own tools
Necati Demir, PhD @ndemir
Will vibe-coding kill SaaS? No it won't. And here's why.

Marc Benioff just revealed that OpenAI and Anthropic are among Salesforce's top customers this quarter. Think about that for a sec. The companies building the AI that's supposed to "replace SaaS" are themselves paying for SaaS.

Meanwhile, a co-founder posted on Reddit that he spent $36K to replace a $300/month tool with a custom-built solution. He can, and that is his choice. But if you're a CTO at a logistics company, a healthcare startup, or a fintech scaling from 10 to 50 engineers, replacing a task/ticket tracking tool with a custom tool is not innovation. It's a distraction.

I've seen this movie before, long before LLMs. Every few years, someone says "we'll build it ourselves." Then 6 months later, the internal tool has no documentation, the person who built it has left, and now you're maintaining software that was never your product.

Can you replace your SaaS tools with AI-built alternatives? Absolutely. Should you? Only if it's your core product or if you really, really, really need a custom solution. If it's not, you're trading a $300/month bill for a $36K engineering distraction, and your actual product falls behind.
Necati Demir, PhD @ndemir
"Are you a data engineer, a data scientist or a backend developer?" Someone asked me this recently. Data engineer? Data scientist? ML engineer? DevOps? Backend developer? I said: "I'm an engineer." They looked confused.

Here's what I've noticed, ONCE AGAIN. Somewhere along the way, our industry decided that specialization means you only touch one layer of the stack. You're either the person who builds the pipeline or the person who builds the model. Never both.

But the best systems I've designed didn't come from assembling five specialists who couldn't read each other's code. They came from engineers who understood the whole picture: from infrastructure to architecture, from model training to deployment, from CI/CD to cost optimization.

I've written Spark jobs. I've trained models. I've set up Kubernetes clusters. I've designed APIs and drawn system architectures on whiteboards. That makes me a generalist who can dive into a topic when needed, and that makes me an engineer.

The real question isn't "what do you call yourself?" It's "can you solve the problem?"
Necati Demir, PhD @ndemir
Google turned Google Maps into an AI agent. 2 billion users. Every local business on earth. And now an AI model sits between them.

I think there is something we need to understand here: this isn't a feature update, this is a platform shift. Every local business just got a new middleman they can't control.

Before, you optimized your Google Maps listing: photos, reviews, hours, keywords. You played the SEO game and you understood the rules. Now? An AI model decides whether customers find you. And you have no idea what it prioritizes.

Every platform you depend on will eventually put an AI gatekeeper in front of your customers. The question isn't if it will happen, because it will.
Necati Demir, PhD @ndemir
Claude went down this week. I saw developers posting: "I guess I'll write code like a caveman." That sentence should terrify every CTO. Not because AI tools go down. Everything goes down. We would never let production depend on a single database. We build replicas, failover, redundancy. But somehow we let developer productivity depend on a single AI provider. It is time to apply "production" principles to "development" ;)
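Applying "production principles to development" can start small: the same try-the-replica failover we'd build for a database, but for the coding assistant. A toy sketch (the `call_*` functions are hypothetical stand-ins for real provider SDKs, one simulated as down):

```python
class ProviderDown(Exception):
    """Raised when a provider call fails."""

# Hypothetical clients -- stand-ins for real SDK calls.
def call_primary(prompt: str) -> str:
    raise ProviderDown("primary is down")

def call_fallback(prompt: str) -> str:
    return f"[fallback] {prompt}"

def complete(prompt: str, providers=(call_primary, call_fallback)) -> str:
    """Try each provider in order, falling back on failure."""
    last_err = None
    for provider in providers:
        try:
            return provider(prompt)
        except ProviderDown as err:
            last_err = err
    raise RuntimeError("all providers down") from last_err
```

Same idea as a database replica set: one provider going down degrades nothing, and "all providers down" becomes a loud, explicit failure instead of a stalled team.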
Necati Demir, PhD @ndemir
Amazon now requires senior engineer approval before AI-written code goes to production. My first question: how did they ever ship without it? A 6-hour outage on Amazon.com was traced back to an AI-assisted deployment. Not a complex distributed systems failure. Not a rare edge case. A code change that should have been caught in review. Amazon held an urgent engineering meeting last week. New policy: senior engineers must sign off on AI-generated code before it touches production. This sounds like the right call. But let's be honest: this should have been the policy from day one.
Necati Demir, PhD @ndemir
One year ago this week, Dario Amodei said AI would write 90% of code within 12 months. Everyone laughed. He was right. And he was wrong.

At startups and AI labs? It's 90%+. Maybe higher. Cursor, Claude Code, GPT Codex: they're writing entire features end to end. Engineers are becoming reviewers more than writers.

But here's what Dario didn't say: at enterprises, the ones with 10-year-old monoliths, HIPAA compliance, government contracts, and a deployment process that requires 4 signatures, AI is writing maybe 10% of the code. Not because the AI can't do more. It can. The bottleneck isn't the model. It's the org chart.

Here is the pattern:
- The AI can write the code in 10 minutes
- BUT legal and IT need 2 months to approve the tool
- Security needs another month to vet the model
- The VP of Engineering is still "evaluating options"

By the time everyone agrees, a startup has already shipped the feature. The gap between what AI *can* write and what organizations *let* it write is the real story of 2026. And that gap? It's not closing. It's widening.
Necati Demir, PhD @ndemir
18,000 people liked a tweet saying Tony Stark was a vibe-coder. He wasn't. He understood arc reactor physics before he told JARVIS what to build. The engineers who are 'vibe-coding' successfully aren't vibing. They're domain experts who happen to prompt instead of type. The ones who are actually vibing? They're creating the production bugs.
Necati Demir, PhD @ndemir
The creator of Claude Code just said the "software engineer" title is going to disappear. I disagree but also agree ;)

I've been building software for over 20 years. I've watched titles come and go. "Webmaster" became "frontend developer." "DBA" became "data engineer." "Sysadmin" became "DevOps" became "platform engineer." The title didn't disappear. It evolved. And that's what's happening again.

The engineers I work with are already doing things that didn't exist two years ago. They're designing prompts. They're reviewing AI-generated code they didn't write. They're building guardrails around systems that make autonomous decisions. That's not less engineering.

But here's what I think Boris gets right: the role of the engineer who only writes code line by line, function by function, with no understanding of the product, the user, or the system around it, is shrinking. Fast.

The engineer who survives this shift is the one who can:
1. Debug what the AI built wrong
2. Decide what should NOT be built at all
3. Understand the system deeply enough to know when the AI is confidently wrong

That's not a "builder." That's not a "product manager." That's still an engineer. Just a different kind.

Every generation thinks their version of the title is the final one. It never is. The title won't disappear. It'll just mean something new. Again.
Necati Demir, PhD @ndemir
Sam Altman says "don't learn to code." I disagree.

Every time a new abstraction layer appeared in tech, the winners were the ones who understood the layer below it:
- Assembly → C? Best programmers understood memory.
- C → Python? Best devs understood the interpreter.
- Infra → Cloud? Best architects understood networking.

Today the abstraction layer is English. AI writes the code. But if you understand what's happening underneath, you prompt better, review better, and catch mistakes faster. Without that knowledge, you're a passenger in a car driving through a city you don't recognize.

Don't stop learning to code just because AI writes it for you. It will be your edge. youtube.com/watch?v=-9pkVC…