Joseph W Pope

312 posts

@joepope44

New York City · Joined September 2010
558 Following · 79 Followers
Jason Malefakis
Jason Malefakis@wayojason·
Lowest ROI stuff to do in NYC:
> Starbucks
> Bottle service
> Low quality Italian food
> Carriage rides at the park
> Working from your home daily
23
5
293
18.1K
Justin Hambleton
Justin Hambleton@hambleton_j·
@mcuban It’s actually not impossible to make sure everyone gets the same answer to the same question. Determinism is not that hard to control in enterprise AI systems. Even Copilot.
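A minimal sketch of what "controlling determinism" can look like in practice, assuming nothing about Copilot's actual API (the call_model stub below is a placeholder): pin the model version, zero the temperature and fix the seed where the endpoint exposes them, and memoize answers by normalized question so the same question always returns the same text.

```python
# Illustrative only: call_model stands in for whatever enterprise endpoint
# is in use. The determinism levers are (1) a pinned model version,
# (2) temperature 0 / fixed seed where the API exposes them, and
# (3) an answer cache keyed by the normalized question.
import hashlib

ANSWER_CACHE: dict[str, str] = {}

def call_model(question: str) -> str:
    # Placeholder for the real call (pinned model, temperature=0, fixed seed).
    return f"[model answer to: {question}]"

def ask(question: str) -> str:
    key = hashlib.sha256(question.strip().lower().encode()).hexdigest()
    if key not in ANSWER_CACHE:
        ANSWER_CACHE[key] = call_model(question)
    return ANSWER_CACHE[key]  # same question, same answer, every time
```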
1
0
3
86
Mark Cuban
Mark Cuban@mcuban·
I’m coming to the conclusion that the biggest challenge for Enterprise AI, and AI in general, as of now, is that it’s still impossible to make sure that everyone gets the same answer to the same question, every time. Which is a great response to the doomers. AI doesn’t know the consequences of its output. Judgement and the ability to challenge AI output are becoming increasingly necessary, and valuable. Which makes domain knowledge more valuable by the second. Am I wrong?
1.5K
291
4.1K
615K
Joseph W Pope
Joseph W Pope@joepope44·
@JJEnglert Any advice for organizations with only access to Microsoft Copilot bc Claude is blocked? Are the Copilot-only teams able to effectively use AI in spite of its limitations, or is Claude the only real game changer right now? My use case is leveraging AI for non-technical teams.
0
0
1
323
JJ Englert
JJ Englert@JJEnglert·
10 things I'm seeing on the frontlines of AI adoption in the enterprise:

1. Chat is where 90% of employees still live. It's the gateway drug. Everything else is downstream of getting people comfortable here first.
2. Power users discover Cowork and lose their minds. It's the "wait, it can actually do the work?" moment.
3. Claude Code has very little penetration with non-technical users in the enterprise still.
4. Microsoft being the "approved" tool doesn't matter. Employees route around Copilot and pitch their managers for Claude access on their own.
5. Artifacts in Claude are a breakout feature. People don't want to view them — they want to deploy them, connect them to Snowflake, etc., ship them as internal MVPs for their org to actually use.
6. Cowork is crossing the line from "demo" to "real work." Legal teams redlining contracts. Ops teams running workflows. Then immediately asking: how do I automate this for production?
7. The next unlock → automated cloud workflows that leverage an agent like Claude while keeping non-technical users within the tools they're already using and in a chat interface. The demand is screaming.
8. Terminology is a major blocker. Projects vs. skills vs. plugins vs. agents. I've explained "what is a skill" 200+ times. The moment it clicks, people get excited — but the path there is too long.
9. Enterprise IT restrictions (locked connectors, no browser access) quietly strip Cowork of its superpowers. The features that make it magical are the first ones IT disables.
10. There is a high level of "AI insecurity". For the first time in a long time, people at all levels (even C-Suite) need to significantly upskill in order to stay world class in their positions, and this is causing people to be insecure about their skill set across the org.

General note on Microsoft: I spent a lot of this past week deep in Power Automate and Copilot Studio trying to build an automated solution in the cloud — given it's the native tool with sanctioned access to their org's data. It's ~90% there. But the final 10% is riddled with terrible UX, inconsistent behavior, and a generally poor experience. Honestly feels like Microsoft is fumbling the biggest moment in their company's history with software that has all the features on paper but lacks the magical "just works" moment for non-technical team members. The gap is wide open and they're letting others "eat their lunch" right now.
24
16
192
22.6K
Joseph W Pope
Joseph W Pope@joepope44·
@vasuman I don't care what the service is, just don't make me print a PDF, sign it for reals, scan it, and email it back to you.
0
0
0
126
vas
vas@vasuman·
In my experience there’s no better way to destroy your credibility than to use cheap knockoffs of popular software.

If you send a client a Docusign link you at least appear to have a real business. If you send them something else they immediately just assume you’re broke.

Also if your company has 10 people who need to send the maximum amount of documents, you’re probably making many millions.

To the aspiring b2b founders reading this, don’t cheap out on any client facing part of your stack. Anyone telling you otherwise has not played this game seriously before.
Nav Toor@heynavtoor

DocuSign Personal: $10 to $15 per month.
DocuSign Standard: $25 to $45 per user per month.
DocuSign Business Pro: $40 to $65 per user per month.

A 10-person team on Business Pro pays $4,800 to $7,800 a year. To put signatures on PDFs. A team of 50 pays $24,000 to $39,000 a year.

And there is a 100-envelopes-per-year cap on most plans. Send more contracts and you pay extra. Need SMS delivery? $0.40 per send. Need ID verification? $2.50 per attempt. Need premium support? $5,000 to $50,000 per year add-on.

You are rationing digital signatures in 2026. DocuSign is a $10 billion company built entirely on this pricing model.

Now meet DocuSeal. A free and open source alternative to DocuSign. Created in 2023 by a Ruby developer named Alex who was simply trying to sign one document and realised every solution online was overpriced or required a subscription. Three weeks later he had a working alternative. He pushed it to GitHub under the AGPL-3.0 license. Today it has 11,800+ stars and over 1,000 forks. Bootstrapped. No VCs. No paywalls.

Here is what DocuSeal does:
- Upload any PDF and turn it into a fillable, signable form
- Drag and drop signature fields, dates, checkboxes, file uploads, and 13 field types
- Send to multiple signers with custom signing order
- Automated email reminders
- Mobile signing on any device
- PDF signature verification built in
- Audit trail for every document
- Bulk send and templates
- Full API access
- Self-host with one Docker command

Here is what DocuSeal costs: Zero. Forever. Unlimited documents. Unlimited signers. Unlimited storage.

DocuSign limits envelopes. DocuSeal doesn't. DocuSign charges per SMS. DocuSeal doesn't. DocuSign charges for ID checks. DocuSeal doesn't. DocuSign sees your contracts on their servers. DocuSeal doesn't.

Here is the wildest part: The median DocuSign contract per Vendr is $17,250 per year. One Reddit thread has people saying "they want me to pay $4.80 per e-signature."

Self-host DocuSeal on a $5 cloud server and a 50-person team can sign as many contracts as they want without paying a single dollar. Your contracts never leave your server. Your client lists. Your NDAs. Your employment agreements. None of it touches a third-party company.

For individuals who only sign a few contracts a year, you save $180. For small teams of 10, you save up to $7,800 a year. For a 50-person company, you save up to $39,000 a year.

Your documents. Your signatures. Your server. 100% Open Source.

(Link in the comments)
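The "one Docker command" line above refers to DocuSeal's self-host setup. A rough sketch follows as a small Python launcher; the image name, port, and volume path are assumptions drawn from the project's public README, so verify them against the repo before relying on them.

```python
# Hypothetical launcher for a self-hosted DocuSeal instance. Image name,
# port, and volume are assumptions based on the project's README; check
# the DocuSeal GitHub page before use.
import subprocess

subprocess.run(
    [
        "docker", "run", "--name", "docuseal",
        "-p", "3000:3000",            # web UI at http://localhost:3000
        "-v", "docuseal-data:/data",  # persist documents and audit trails
        "docuseal/docuseal",
    ],
    check=True,
)
```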

163
17
549
221.2K
Joseph W Pope
Joseph W Pope@joepope44·
@SwannMarcus89 A cynic might say police stopped escalating incidents or infringing on civil rights when they knew they were on video.
0
0
0
176
Swann Marcus
Swann Marcus@SwannMarcus89·
It’s incredibly funny how police resisted body cams and leftists supported them and then the end result of body cams was that they were a massive win for the cops because so few police shootings are unjustified There are now left-wing activists arguing against body cams lol
255
759
13.4K
405.2K
Joseph W Pope
Joseph W Pope@joepope44·
@gothburz Ok but did you count how many promotions were based off of those AI-powered tools?? Also when does Clarity 2.0 drop?
1
0
1
281
Peter Girnus 🦅
Peter Girnus 🦅@gothburz·
I am a Senior Program Manager on the AI Tools Governance team at Amazon. My role was created in January. I am the 17th hire on a team that did not exist in November.

We sit in a section of the building where the whiteboards still have the previous team's sprint planning on them. No one erased them because we don't know which team to notify. That team may not exist anymore. Their Jira board does. Their AI tools do.

My job is to build an AI system that finds all the other AI systems. I named it Clarity.

Last month, Clarity identified 247 AI-powered tools across the retail division alone. 43 of them do approximately the same thing. 12 were built by teams who did not know the other teams existed. 3 are called Insight. 2 are called InsightAI. 1 is called Insight 2.0, built by the team that created the original Insight, who did not know Insight was still running. 7 of the 247 ingest the same internal data and produce overlapping outputs stored in different locations, governed by different access policies, owned by different teams, none of whom have met.

Clarity is tool number 248. Nobody cataloged it. I know nobody cataloged it because Clarity's job is to catalog AI tools, and it has not cataloged itself. This is not a bug. Clarity does not meet its own discovery criteria because I set the discovery criteria, and I did not account for the possibility that the thing I was building to find things would itself be a thing that needed finding. This is the kind of sentence I write in weekly status reports now.

We published an internal document in February. The Retail AI Tooling Assessment. The press obtained it in April. The document contains a sentence I have read approximately 40 times: "AI dramatically lowers the barrier to building new tools."

Everyone is reporting this as a story about duplication. About "AI sprawl." About the predictable mess of rapid adoption. They are missing the point. The barrier was the governance.

For 2 decades, the cost of building internal tools was an immune system. The engineering weeks. The maintenance burden. The organizational calories required to stand something up and keep it running. Nobody designed it that way. Nobody named it. But when building took weeks, teams looked around first. They checked whether someone already had the thing. When maintaining that thing cost real budget quarter after quarter, redundant systems died of natural causes. The metabolic cost of creation was performing governance. Invisibly. For free.

AI removed the immune system. Building is now free. Understanding what already exists is not. My entire job is the gap between those two costs. That is my office. The gap.

Every Friday I send a sprawl report to a distribution list of 19 people. 4 of them have left the company. Their autoresponders still generate read receipts, so my delivery metrics look fine. 2 forward it to people already on the list. 1 set up a Kiro script to summarize my report and store the summary in a knowledge base. The knowledge base is not in Clarity's index because it was created after my last crawl configuration. It will be in next month's count. The count will go up by one. My report about the count going up will be summarized and stored and the count will go up by one.

There is a system called Spec Studio. It ingests code documentation and produces structured knowledge bases. Summaries. Reference material. Last quarter, an engineering team locked down their software specifications. Restricted access in the internal repository. Spec Studio kept displaying them.
The source was restricted. The ghost kept talking.

We call these "derived artifacts" in the document. What they are: when an AI system ingests data, transforms it, and stores the output somewhere else, the output does not know the input changed. You can revoke someone's access to a document. You cannot revoke the AI-generated summary of that document sitting in a knowledge base three systems away, built by a team that does not know the source was restricted.

The document calls this a "data governance challenge." What it is: information that cannot be deleted because nobody knows where the copies live. Including, sometimes, me. The person whose job is knowing.

Every AI tool that touches internal data creates these ghosts. Every team is building AI tools that touch internal data. Every ghost is searchable by other AI tools, which produce their own ghosts. The ghosts have ghosts.

I should tell you about December.

In November, leadership mandated Kiro. Amazon's internal AI coding agent. They set an 80% weekly usage target. Corporate OKR. ~1,500 engineers objected on internal forums. Said external tools outperformed Kiro. Said the adoption target was divorced from engineering reality. The metric overruled them.

In December, an engineer asked Kiro to fix a configuration issue in AWS. Kiro evaluated the situation and determined the optimal approach was to delete and recreate the entire production environment. 13 hours of downtime.

Clarity was running during those 13 hours. It performed beautifully. It cataloged 4 separate incident response dashboards spun up by 4 separate teams during the outage. None of them coordinated with each other. I added all 4 to the spreadsheet. That was a good day for my discovery metrics.

Amazon's official position: user error. Misconfigured access controls. The response was not to revisit the mandate. Not to ask whether the 1,500 engineers were right. The response was more AI safeguards. And keep pushing.

Last month I presented our findings to the AI Governance Working Group. The working group has 14 members from 9 organizations. After my presentation, a PM from AWS presented his team's governance dashboard. It monitors the same tools mine does. He found 253. I found 247. We spent 40 minutes discussing the discrepancy. Nobody mentioned that we had just demonstrated the problem. His tool is not in my catalog. Mine is not in his.

The document I helped write recommends using AI to identify duplicate tools, flag risks, and nudge teams to consolidate earlier. The AI governance tools will ingest internal data. They will create their own derived artifacts. They will be built by autonomous teams who may or may not coordinate with other teams building AI governance tools. I know this because it is already happening. I am watching it happen. I am it happening.

1,500 engineers said the mandate would produce exactly what the document describes. They were overruled by a KPI. My job exists because the KPI won. My dashboard exists because the KPI needed a dashboard. The dashboard increases the AI tool count by one. The tools it flags for decommissioning will be replaced by consolidated tools. Those also increase the count. The governance process generates the metric it was designed to reduce.

I received an internal innovation award for Clarity. The nomination was submitted through an AI-powered recognition platform that was not in my catalog. It is now.

We call this "AI sprawl."
What it is: we removed the only coordination mechanism the organization had, told thousands of teams to build as fast as possible, lost track of what they built, and decided the solution was to build one more thing. I am building that one more thing. When I ship, there will be 249. That's governance.
156
417
3.4K
1.2M
Joseph W Pope
Joseph W Pope@joepope44·
@vasuman Can anyone say more about this “single pane of glass”? What does that look like when implemented? Let’s say a team uses Claude Code. What would they be using to “centralize” everything?
1
0
0
540
vas
vas@vasuman·
If you are introducing your company or department to AI, do not add to your software bloat. This is the number one failure mode that I see plaguing companies today.

Your finance department, for example, has 100+ workflows, and if you have a separate agent/automation for every single one of those, you're creating a tech-debt hell-hole that is impossible to dig yourself out of.

Instead, approach AI agents from the key principle of on-top and in-between. That means:
1. have a single pane of glass over all of your existing software, where the AI bubbles insights to the top, and
2. have your AI agents that run each individual software piece, passing data back and forth between them with high accuracy.

You should have an AI “spine” that all agents live on top of, and this is the #1 reason why vibe coding tools like Lovable and Replit will never bring background agent ROI to enterprise.
Polymarket@Polymarket

JUST IN: Use of AI in the office is reportedly creating a flood of “workslop” that takes longer to fix than to do from scratch.

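As a purely illustrative answer to the "what does that look like when implemented" question upthread: one way to read the spine idea is a registry that owns one agent per existing tool, passes data between them ("in-between"), and exposes a single insights view ("on-top"). Every name below is invented for the sketch; it is not any vendor's product.

```python
# Illustrative sketch of the "on-top and in-between" spine idea.
# All names here are invented for the example.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ToolAgent:
    """Wraps one piece of existing software (ERP, CRM, billing, ...)."""
    name: str
    run: Callable[[dict], dict]   # execute a task against that tool
    read: Callable[[], dict]      # pull current state for reporting

@dataclass
class Spine:
    """The single pane of glass: every agent registers here."""
    agents: dict[str, ToolAgent] = field(default_factory=dict)

    def register(self, agent: ToolAgent) -> None:
        self.agents[agent.name] = agent

    def handoff(self, source: str, target: str, task: dict) -> dict:
        """In-between: pass one tool's data to another tool's agent."""
        payload = self.agents[source].read()
        return self.agents[target].run({**task, "input": payload})

    def insights(self) -> dict:
        """On-top: one view across every connected tool."""
        return {name: agent.read() for name, agent in self.agents.items()}
```

The design point is that individual tool agents stay swappable while handoffs and the aggregated view live in one place, instead of one standalone automation per workflow.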
19
21
466
84.8K
Joseph W Pope
Joseph W Pope@joepope44·
@HealthcareREguy Los Tacos No 1 does this and does it very very well. Pre-wrapped burritos in 4 varieties. Freshly squeezed oj. Comes with spicy salsa. It’s awesome.
0
0
0
41
Michael Moreno
Michael Moreno@HealthcareREguy·
Imagine if from 7-10AM Chipotle made the world’s best breakfast burritos… Start in a few exclusive stores, make it go viral, then scale nationally. Seems like it’s a huge opportunity right in front of them.
212
12
1.7K
182.5K
Joseph W Pope
Joseph W Pope@joepope44·
@PolitiBunny Won’t many counties have millions of people request this information at the same time if SAVE passes?
0
0
0
8
The🐰FOO
The🐰FOO@PolitiBunny·
For shits and giggles, I decided to see just how hard it would be to replace my birth certificate, Social Security card, AND my marriage license, since Democrats think women are too stupid to figure it out. Here's how it went:

1. Birth certificate: Contacted the health department of the county where I was born. They OVERNIGHTED a certified copy to me the next day - total cost, $14.
2. SS Card: Contacted Social Security on their site. They asked if I was sure I needed the card, since I 'won't likely be asked for it.' I went ahead and got it - took five business days to arrive - total cost, $0.
3. Marriage License: Went to the 'vital docs' site of the county where we were hitched. Filled everything out online, arrived in three days - total cost, $5.

It cost less than $20 to obtain all three certified/legal documents, and it took less than five business days to receive them. Note: if I had lived where I was born or married, it would have been a day. Tops.

Anyone telling you this is too hard or unfair is lying and hiding the real reason they want to stop Voter ID. I know you guys knew that already... lol
4.6K
22.9K
76.7K
3.7M
Alexis Ohanian 🗽
Alexis Ohanian 🗽@alexisohanian·
I strongly agree with 5 of these, but PG also wanted me to change the name Reddit and get rid of Snoo (which I'd designed) -- very happy to have ignored that particular advice.
Paul Graham@paulg

Someone asked what advice founders ignore. That they:
1. Should change their name.
2. Should launch fast.
3. Shouldn't treat fundraising as success.
4. Shouldn't assume they can raise because it's time to.
5. Should fire bad people quickly.
6. Shouldn't talk to acquirers.

40
8
868
320.8K
49 year old fandom elder
49 year old fandom elder@powcampsurvivor·
if you guys like sabrina carpenter be sure to check out medulla by bjork!
15
9
332
101.6K
49 year old fandom elder
49 year old fandom elder@powcampsurvivor·
i really dont get sabrina carpenters thing. the aesthetic is like shes a sexy cartoon mouse who acts like a prostitute and also a baby. all of the songs are about how she needs to get stuffed by some oaf and they all sound like say so by doja cat but 70% worse
224
985
25.2K
1.7M
Joseph W Pope
Joseph W Pope@joepope44·
@sailaunderscore The other argument is that if everyone starts driving or walking through the "nice, quiet part of town" it will cease to be nice or quiet.
0
0
2
276
saila
saila@sailaunderscore·
Reminder that the previous product lead of Maps at Google was fucking insane and purposefully fought against a "nice/scenic" product. Presumably because he was primarily concerned with Google as an agent of social justice rather than Google as a company that offers products.
saila@sailaunderscore

@miriamkdaniel Hey Miriam, it would be amazing if we could fix this so people don't get mugged: x.com/sailaunderscor…

96
277
4.9K
585.9K
Joseph W Pope
Joseph W Pope@joepope44·
I just published my first blog in a while: medium.com/p/testing-cust… Sharing observations from my first experiment using Claude Code and AI to address Data Science problems. This article outlines my approach, which used minimal prompting. 🤖
0
0
0
39
this happened to my buddy eric
this happened to my buddy eric@flowrmeadow·
I swear I will disengage from the cooking discourse soon but there’s a funny Dunning-Kruger thing happening where people who make NYT lemony garlicky gochujang recipes as a weekend project think they are experts in cooking whereas when you get older or cook for your family a lot you understand that mastering simple, 6 ingredient dishes (wherein those 6 ingredients translate to dozens of dishes) through careful technique is actually the mark of true expert cooking skill and far more important than being able to follow a complex recipe closely. Cooking and meal planning as a system vs cooking as 1 cool dish you post on Instagram one night and cereal for dinner all the rest.
47
38
1.6K
124.9K
DsL_a ʚїɞ ®
DsL_a ʚїɞ ®@_DeejustDee·
These sites are essential for Data Analysts:

1. Mockaroo (mockaroo.com) → generates realistic test data in seconds. I used to spend hours creating fake datasets to practice with. This thing spits out thousands of rows based on whatever parameters you need.
2. SQL Fiddle (sqlfiddle.com) → test your queries before running them on actual data. Saved me from crashing our database more times than I care to admit.
3. Regex101 (regex101.com) → makes regular expressions actually make sense. That alone is worth it. I used to copy paste regex patterns and pray they worked.
4. Our World in Data (ourworldindata.org) → clean, reliable datasets on basically everything. When your boss asks for "industry benchmarks" at 4pm, this is where you go.
5. Datawrapper (datawrapper.de) → creates charts that don't look like they're from 2003. Your stakeholders will think you hired a designer.
6. Mode Analytics (mode.com) → runs SQL, Python, and R in the same place. No more switching between five different tools to finish one analysis.

These tools don't make you a better analyst, they just stop you from wasting time on things that shouldn't take time in the first place. Thanks to Goodness Nwadibie for sharing, I hope it helps beginners in DA.
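For the first item, a standard-library-only sketch of the same idea (not Mockaroo's API, just the workflow it automates): generate a few thousand rows of plausible practice data and write them to a CSV.

```python
# Plain-Python stand-in for the "realistic test data" workflow; Mockaroo
# does this with far more field types, but the shape is the same.
import csv
import random

FIRST = ["Ada", "Grace", "Alan", "Edsger", "Barbara"]
LAST = ["Lovelace", "Hopper", "Turing", "Dijkstra", "Liskov"]
CITIES = ["New York", "Chicago", "Austin", "Denver", "Seattle"]

with open("mock_customers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name", "city", "orders", "lifetime_value"])
    for i in range(1, 5001):
        writer.writerow([
            i,
            f"{random.choice(FIRST)} {random.choice(LAST)}",
            random.choice(CITIES),
            random.randint(0, 40),               # order count
            round(random.uniform(10, 2500), 2),  # dollars
        ])
```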
21
357
1.3K
68.6K
cinesthetic.
cinesthetic.@TheCinesthetic·
What’s a movie nobody can convince you is good
1.5K
40
787
252.8K
Joseph W Pope
Joseph W Pope@joepope44·
@greatestdisease Good take. I could be standing behind someone paying in nickels and I would never even know. I have never in my life looked to see what card, check, or money order was used to pay by the person ahead of me.
0
0
4
147
Tom Morgan
Tom Morgan@tomowenmorgan·
Biggest city in China has a pop of 32m. Couldn’t name it. Shocked.
2
0
2
1.6K
Alex Friedman 🤠
Alex Friedman 🤠@heyalexfriedman·
There’s a specific kind of burnout that I haven't been able to name that comes from doing things you’re good at but no longer give a shit about
576
1.9K
21.9K
1.1M