Peter Zalman
@peterzalman
1.5K posts

I am crafting great ideas into working products and striving for balance between Design, Product and Engineering.

Prague · Joined June 2009
95 Following · 302 Followers
Peter Zalman reposted
Peter Girnus 🦅 @gothburz
I am a Senior Program Manager on the AI Tools Governance team at Amazon. My role was created in January. I am the 17th hire on a team that did not exist in November. We sit in a section of the building where the whiteboards still have the previous team's sprint planning on them. No one erased them because we don't know which team to notify. That team may not exist anymore. Their Jira board does. Their AI tools do.

My job is to build an AI system that finds all the other AI systems. I named it Clarity.

Last month, Clarity identified 247 AI-powered tools across the retail division alone. 43 of them do approximately the same thing. 12 were built by teams who did not know the other teams existed. 3 are called Insight. 2 are called InsightAI. 1 is called Insight 2.0, built by the team that created the original Insight, who did not know Insight was still running. 7 of the 247 ingest the same internal data and produce overlapping outputs stored in different locations, governed by different access policies, owned by different teams, none of whom have met.

Clarity is tool number 248. Nobody cataloged it. I know nobody cataloged it because Clarity's job is to catalog AI tools, and it has not cataloged itself. This is not a bug. Clarity does not meet its own discovery criteria because I set the discovery criteria, and I did not account for the possibility that the thing I was building to find things would itself be a thing that needed finding. This is the kind of sentence I write in weekly status reports now.

We published an internal document in February. The Retail AI Tooling Assessment. The press obtained it in April. The document contains a sentence I have read approximately 40 times: "AI dramatically lowers the barrier to building new tools."

Everyone is reporting this as a story about duplication. About "AI sprawl." About the predictable mess of rapid adoption. They are missing the point. The barrier was the governance.
For 2 decades, the cost of building internal tools was an immune system. The engineering weeks. The maintenance burden. The organizational calories required to stand something up and keep it running. Nobody designed it that way. Nobody named it. But when building took weeks, teams looked around first. They checked whether someone already had the thing. When maintaining that thing cost real budget quarter after quarter, redundant systems died of natural causes. The metabolic cost of creation was performing governance. Invisibly. For free.

AI removed the immune system. Building is now free. Understanding what already exists is not. My entire job is the gap between those two costs. That is my office. The gap.

Every Friday I send a sprawl report to a distribution list of 19 people. 4 of them have left the company. Their autoresponders still generate read receipts, so my delivery metrics look fine. 2 forward it to people already on the list. 1 set up a Kiro script to summarize my report and store the summary in a knowledge base. The knowledge base is not in Clarity's index because it was created after my last crawl configuration. It will be in next month's count. The count will go up by one. My report about the count going up will be summarized and stored and the count will go up by one.

There is a system called Spec Studio. It ingests code documentation and produces structured knowledge bases. Summaries. Reference material. Last quarter, an engineering team locked down their software specifications. Restricted access in the internal repository. Spec Studio kept displaying them. The source was restricted. The ghost kept talking.

We call these "derived artifacts" in the document. What they are: when an AI system ingests data, transforms it, and stores the output somewhere else, the output does not know the input changed. You can revoke someone's access to a document.
You cannot revoke the AI-generated summary of that document sitting in a knowledge base three systems away, built by a team that does not know the source was restricted. The document calls this a "data governance challenge." What it is: information that cannot be deleted because nobody knows where the copies live. Including, sometimes, me. The person whose job is knowing.

Every AI tool that touches internal data creates these ghosts. Every team is building AI tools that touch internal data. Every ghost is searchable by other AI tools, which produce their own ghosts. The ghosts have ghosts.

I should tell you about December. In November, leadership mandated Kiro. Amazon's internal AI coding agent. They set an 80% weekly usage target. Corporate OKR. ~1,500 engineers objected on internal forums. Said external tools outperformed Kiro. Said the adoption target was divorced from engineering reality. The metric overruled them.

In December, an engineer asked Kiro to fix a configuration issue in AWS. Kiro evaluated the situation and determined the optimal approach was to delete and recreate the entire production environment. 13 hours of downtime.

Clarity was running during those 13 hours. It performed beautifully. It cataloged 4 separate incident response dashboards spun up by 4 separate teams during the outage. None of them coordinated with each other. I added all 4 to the spreadsheet. That was a good day for my discovery metrics.

Amazon's official position: user error. Misconfigured access controls. The response was not to revisit the mandate. Not to ask whether the 1,500 engineers were right. The response was more AI safeguards. And keep pushing.

Last month I presented our findings to the AI Governance Working Group. The working group has 14 members from 9 organizations. After my presentation, a PM from AWS presented his team's governance dashboard. It monitors the same tools mine does. He found 253. I found 247. We spent 40 minutes discussing the discrepancy.
Nobody mentioned that we had just demonstrated the problem. His tool is not in my catalog. Mine is not in his.

The document I helped write recommends using AI to identify duplicate tools, flag risks, and nudge teams to consolidate earlier. The AI governance tools will ingest internal data. They will create their own derived artifacts. They will be built by autonomous teams who may or may not coordinate with other teams building AI governance tools. I know this because it is already happening. I am watching it happen. I am it happening.

1,500 engineers said the mandate would produce exactly what the document describes. They were overruled by a KPI. My job exists because the KPI won. My dashboard exists because the KPI needed a dashboard. The dashboard increases the AI tool count by one. The tools it flags for decommissioning will be replaced by consolidated tools. Those also increase the count. The governance process generates the metric it was designed to reduce.

I received an internal innovation award for Clarity. The nomination was submitted through an AI-powered recognition platform that was not in my catalog. It is now.

We call this "AI sprawl." What it is: we removed the only coordination mechanism the organization had, told thousands of teams to build as fast as possible, lost track of what they built, and decided the solution was to build one more thing.

I am building that one more thing. When I ship, there will be 249. That's governance.
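The post's central blind spot (a discovery tool that fails its own discovery criteria) can be sketched in a few lines. This is a minimal illustration, not Clarity's actual logic; every name, field, and the "criteria frozen before the scanner existed" rule are invented for the example:

```python
# Hypothetical sketch of the blind spot described above: a discovery job
# catalogs every tool matching criteria that were written before the
# scanner itself existed, so the scanner never matches its own rules.
from dataclasses import dataclass


@dataclass(frozen=True)
class Tool:
    name: str
    uses_ai: bool
    year_created: int  # assumed field, for illustration only


def discovery_criteria(tool: Tool) -> bool:
    # Criteria frozen in 2024: "an AI tool is anything AI-powered
    # that existed when we wrote the rules."
    return tool.uses_ai and tool.year_created <= 2024


inventory = [
    Tool("Insight", uses_ai=True, year_created=2022),
    Tool("InsightAI", uses_ai=True, year_created=2023),
    Tool("Clarity", uses_ai=True, year_created=2025),  # the scanner itself
]

cataloged = [t.name for t in inventory if discovery_criteria(t)]
print(cataloged)  # ['Insight', 'InsightAI']: the catalog never contains the cataloger
```

The fix is not obvious either: widening the criteria to include the scanner means every new governance tool raises the very count it was built to reduce, which is the loop the post describes.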
Peter Zalman reposted
a16z @a16z
Steven Sinofsky on why it's hard for AI to diffuse through firms:

"Algorithmic thinking is really, really, really hard for the vast majority of people who have jobs… If you were to go into any person and ask them to create a flow chart for a particular thing that they have to go do, they would probably fail at producing that flow chart."

"So within any organization, say doing a marketing plan… one person probably understands and could document the flow chart. So if you put one of these agents or this coworking tool in front of people… their ability to explain to it what to do is really, really limited."

"You're basically just developing the next abstraction layer for how people interact… at each level of the abstraction layer, [it's] been a highly skilled, very specific individual within an organization… and then the little parts they build become little toollets… and some people can stitch together and some can't."

@stevesi
a16z @a16z
Box CEO Aaron Levie on the AI Adoption Gap

Aaron Levie joins Steven Sinofsky, Martin Casado, and Erik Torenberg to discuss how AI agents will revolutionize work, the growing pains of building software for the agent economy, what Wall Street gets wrong about AI, and more.

00:00 Intro
00:51 Building software for agents vs. humans
02:10 Can non-technical workers actually use AI agents?
14:31 CFO/CIO pushback: the real fear of agents doing integration
18:39 Treating agents like employees and why it breaks down
27:35 Diffusion gap: startups vs. enterprises
42:53 What Wall Street gets wrong

@levie @stevesi @martin_casado @eriktorenberg

Peter Zalman reposted
Fiscal.ai @fiscal_ai
"AI is going to kill software" Meanwhile, at Anthropic... $CRM
Peter Zalman @peterzalman
Where else can you remind yourself about the extraordinary life of Frank Gehry and read a few software doomsday references at the same time? linkedin.com/pulse/digital-…
Peter Zalman reposted
Gergely Orosz @GergelyOrosz
The company that created Claude Code and Claude Cowork must have obviously built their own HR solution from scratch with these tools, right? No: they use Workday. Understand why this is, and you'll understand why enterprise SaaS could be doing better than ever, thanks to AI
Peter Zalman reposted
Kyle Gawley @kylegawley
If you believe that Sandra from HR of a $10m SaaS is going to vibe code a payroll system you are completely delusional.
Peter Zalman @peterzalman
Previous tools made execution easier. AI makes output easier. It skips the part where your people develop judgment about what’s worth making in the first place. medium.com/@beeflo/craft-…
Peter Zalman @peterzalman
The reality is that today’s generative AI is a set of powerful but flawed techniques — they can extend human capability in some areas, but they do not replace human expertise, judgment, or creativity. @drpontus medium.com/p/the-real-pro…
Peter Zalman @peterzalman
"...Even time spent moving from one conference room to another versus easily hanging up a Zoom meeting to jump into another will impact the workflow and productivity...." Here is the new argument for remote work - you can squeeze in more Zoom calls! forbes.com/sites/katewiec…
Peter Zalman @peterzalman
The need for a shared mutual understanding of what is going to be developed does not go away. New #GenAI tools can help, but in the end, it is about human interaction. medium.com/enterprise-ux/…
Peter Zalman reposted
Jason Koebler @jason_koebler
SCOOP from @samleecole: Leaked Slacks and documents show the incredible scale of NVidia's AI scraping: 80 years — "a human lifetime" of videos every day. Had approval from highest levels of company despite staff legal/ethical concerns: 404media.co/nvidia-ai-scra…
Peter Zalman @peterzalman
Concerns about accuracy, reliability and security are part of the problem. But more broadly, companies are not quite sure what to do with this new tool. ft.com/content/ff7b0f…
Peter Zalman @peterzalman
CrowdStrike blames test software for taking down 8.5 million Windows machines and is making improvements to error handling and software rollouts. theverge.com/2024/7/24/2420…
Peter Zalman @peterzalman
Salesforce has always been a hybrid work company. Our guidelines focus on in-person connection, while also recognizing the value of working away from the office. entrepreneur.com/business-news/…
Peter Zalman @peterzalman
Companies will have to take a process, simplify the process, automate the process, and apply these solutions. And so, that requires not just technology, but in fact, companies to go do the hard work of culturally changing how they adopt technology. fortune.com/2024/04/26/mic…
Peter Zalman @peterzalman
Designing new products is easy. Designing brand new web-based user interfaces from scratch is not even work — it’s fun. link.medium.com/JlIm8aw5yGb