Sean Bergman

6.8K posts

@sbergman

HW PM pivoting to AI builder, crafting #RAG & agentic LLM prototypes. Open-source @ https://t.co/fMgG5BQ6b5. Blog @ https://t.co/nnueDVNZap 🚴 ☕

Portland ↔ Depoe Bay, OR · Joined December 2007
1K Following · 736 Followers
Pinned Tweet
Sean Bergman@sbergman·
Shipped my first public AI project: LibTrails. 100 classic books, AI-extracted themes, graph-based discovery. Built it to learn. Sharing it to see what others find. libtrails.app
1 reply · 0 reposts · 0 likes · 60 views
Jerry Liu@jerryjliu0·
Last week we launched LiteParse - a free and fast document parser that provides more accurate AI-ready text than other free/fast parser libraries. It’s a great tool you can plug into assistant agents like Claude Code/OpenClaw and get good results, especially when paired with its screenshotting capabilities.

But I do want to note that it doesn’t use any models under the hood (no VLMs/LLMs/even OCR models natively), and it’s not a replacement for VLM-based OCR solutions. It is fast because it is heuristic based! I attached a comparison table below.

✅ It is really good at text extraction and even table extraction, *specifically for LLM understanding*. It will lay the text out in a manner that’s easy for humans/AI to understand.
✅ It is great for assistant coding agents because the agent harness can use its text parsing to do a “fast” step, and then its screenshot capabilities to “dive deep” into a specific page
🚫 It is not great over scanned pages/visuals/anything requiring OCR. We do have OOB integrations with EasyOCR and PaddleOCR
🚫 It doesn’t do layout detection and segmentation - it won’t draw bounding boxes over different elements on the page (though it does have word-level bounding boxes!)

Tl;dr it’s great for plugging into an AI assistant tool. If you’re trying to OCR a bunch of docs in batch, check out LlamaParse :)

LiteParse: github.com/run-llama/lite…
LlamaParse: cloud.llamaindex.ai/?utm_source=xj…
[image attached: comparison table]
Jerry Liu@jerryjliu0

Introducing LiteParse - the best model-free document parsing tool for AI agents 💫

✅ It’s completely open-source and free.
✅ No GPU required, will process ~500 pages in 2 seconds on commodity hardware
✅ More accurate than PyPDF, PyMuPDF, Markdown. Also way more readable - see below for how we parse tables!!
✅ Supports 50+ file formats, from PDFs to Office docs to images
✅ Is designed to plug and play with Claude Code, OpenClaw, and any other AI agent with a one-line skills install. Supports native screenshotting capabilities.

We spent years building up LlamaParse by orchestrating state-of-the-art VLMs over the most complex documents. Along the way we realized that you could get quite far on most docs through fast and cheap text parsing. Take a look at the video below.

For really complex tables within PDFs, we output them in a spatial grid that’s both AI and human-interpretable. Any other free/light parser like PyPDF will destroy the representation of this table and output a sequential list.

This is not a replacement for a VLM-based OCR tool (it requires 0 GPUs and doesn’t use models), but it is shocking how good it is at parsing most documents.

Huge shoutout to @LoganMarkewich and @itsclelia for all the work here.

Come check it out: llamaindex.ai/blog/liteparse…
Repo: github.com/run-llama/lite…

7 replies · 1 repost · 34 likes · 3.3K views
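To illustrate the spatial-grid idea from the thread (an editor's toy sketch, not LiteParse's actual code or API): given word-level bounding boxes, words can be bucketed into rows by y-coordinate and padded into columns by x-coordinate, so a table survives as an aligned grid instead of a flattened word list.

```python
# Sketch: lay out words with (x, y) bounding-box origins as a spatial
# text grid, so a table reads as rows/columns instead of a word soup.
# The sample words and the x -> column scaling are made up for illustration.

def words_to_grid(words, row_tol=5):
    """words: list of (text, x, y). Returns text lines grouped by y, ordered by x."""
    rows = {}
    for text, x, y in words:
        # Snap y to a bucket so words on the same visual line group together.
        key = round(y / row_tol)
        rows.setdefault(key, []).append((x, text))
    lines = []
    for key in sorted(rows):
        cells = sorted(rows[key])  # left-to-right by x
        line = ""
        for x, text in cells:
            # Crude x -> character-column mapping; keeps columns aligned across rows.
            pad = max(len(line) + (1 if line else 0), x // 4)
            line = line.ljust(pad) + text
        lines.append(line.rstrip())
    return lines

words = [
    ("Part", 0, 0), ("Min", 120, 0), ("Max", 200, 0),
    ("R1", 0, 20), ("1.0", 120, 21), ("2.0", 200, 19),
]
for line in words_to_grid(words):
    print(line)
```

Even with slightly jittered y values (19, 20, 21), the two data cells land under their headers, which is the "easy for humans/AI to understand" property the thread describes.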
Sean Bergman@sbergman·
Been fighting GPT-5.2 all evening to do PDF datasheet table extraction for a RAG doc ingestion pipeline at work. Needed to migrate off the GPT-4o model after it was killed. 5.2 is a complete fail. Luckily IBM's latest docling release nails it on the first try. Moving over to that!
1 reply · 0 reposts · 1 like · 74 views
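Whichever parser wins, the downstream RAG ingestion step usually serializes extracted tables to markdown before chunking and embedding. A minimal stdlib sketch of that serialization (toy datasheet rows for illustration; this is not docling's actual output API):

```python
# Sketch: serialize an extracted table (header + rows) as a GitHub-style
# markdown table, the usual shape for feeding tables into a RAG chunker.
# The datasheet rows below are invented; a real parser would supply them.

def table_to_markdown(header, rows):
    def fmt(cells):
        return "| " + " | ".join(str(c) for c in cells) + " |"
    lines = [fmt(header), fmt(["---"] * len(header))]
    lines.extend(fmt(r) for r in rows)
    return "\n".join(lines)

header = ["Parameter", "Min", "Typ", "Max", "Unit"]
rows = [
    ["Supply voltage", 2.7, 3.3, 3.6, "V"],
    ["Sleep current", "", 1.5, 3.0, "uA"],
]
print(table_to_markdown(header, rows))
```

Markdown keeps the column structure legible to both the embedding model and anyone inspecting retrieved chunks, which is why it tends to beat raw sequential text for datasheet tables.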
Sean Bergman@sbergman·
What happened to @claudeai Code Opus 4.6 today?!? Ouch. Painful. Surprised, since this is a rare experience.
1 reply · 0 reposts · 0 likes · 37 views
Sean Bergman@sbergman·
Wrote up two improvements to libtrails.app: how I built a two-tier Leiden clustering system to turn raw topic clusters into browsable communities, and then optimized semantic search and the embedding pipeline to make it all fast on a small server with 1 GB RAM. sbergman.net
0 replies · 0 reposts · 0 likes · 22 views
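At 100-book scale, the semantic-search side of this can stay brute force: a flat list of embeddings plus cosine scoring fits easily in 1 GB of RAM, no ANN index required. A pure-Python sketch of that search step (the 3-d vectors and doc ids are toy stand-ins for real embeddings, not LibTrails code):

```python
import math

# Sketch: brute-force cosine-similarity search over precomputed theme
# embeddings. At ~100 books a flat scan is fast and memory-cheap, which
# suits a small server. Vectors/ids below are toy illustrations.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def search(query_vec, index, top_k=3):
    """index: list of (doc_id, vector). Returns the best-matching doc ids."""
    scored = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

index = [
    ("moby-dick:obsession", [0.9, 0.1, 0.0]),
    ("frankenstein:hubris", [0.8, 0.2, 0.1]),
    ("austen:marriage", [0.0, 0.9, 0.4]),
]
print(search([1.0, 0.0, 0.0], index, top_k=2))
```

Clustering (e.g. the two-tier Leiden pass the tweet mentions) then only has to run offline over the same flat embedding store, so query-time work stays this simple.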
Sean Bergman@sbergman·
@TheZvi Have you read The Peripheral by William Gibson? He builds a near-future world and drops you in without explaining himself, trusting your intelligence to sync with it. The texture and density of the prose are great. Like Accelerando in a way, and also like Marlon James. One of my favs!
0 replies · 0 reposts · 2 likes · 365 views
Zvi Mowshowitz@TheZvi·
Pitch me on what I should read next. Fiction only. Not about AI.
166 replies · 1 repost · 79 likes · 32.7K views
Sean Bergman@sbergman·
Was stoked to get an email from GitHub today that an issue I logged for @claude Code, about background processes getting lost track of after compaction, was fixed by Opus 4.5 and committed to the project. Glad to have submitted this glitch. Great to see feedback addressed rapidly.
0 replies · 0 reposts · 0 likes · 38 views
Sean Bergman@sbergman·
@DCinvestor I’ve been thinking a lot about this as well and building strong backend CLI and API/MCP structure before frontend. Wondering if agentic connections could involve graph networks and Leiden clusters. Been experimenting with those lately.
0 replies · 0 reposts · 0 likes · 24 views
DCinvestor@DCinvestor·
vibe coders should understand something: i love how easy AI is making it for people to build their own apps, push them into production, and start businesses

but let's be clear: the future is not in humans building consumer-facing apps

the future is everything becomes an API which your personal AI agent can interact with in ways which suit your specific needs and lifestyle (down to the very specific needs of you as an individual)

the fact that you can use the machines to build your apps is just an intermediate step to the machines creating the apps for you, LIVE, as you need them

so the value of you learning how to build apps now really lies in you learning how to create a business model behind that app- not in creating the piece of software that is the app itself

sure, there will be templates for how you can interact with those apps/APIs, but your personal AI will pick one and tailor it even further for you. and a lot of the time, you won't even need to interact with a UI beyond speaking with your AI assistant

let me give you an example: would you rather use an app like Uber or Uber Eats, or would you rather just ask your AI assistant to get you a ride somewhere or to show you menus for the type of food you might be interested in and you pick one?

the value in apps like that is not in the app installed on your phone. it's in the backend business model which connects the customer with providers. and personal AI assistants actually open the door to you being able to seamlessly use multiple business APIs without worrying in the slightest about which app or intermediate provider they come from

there is a decent chance apps as you know them will be mostly dead in ~5-10 years

and yes, there are some apps which will still require deep optimization and that is where the hardcore coders may still be needed. but machines will get better at that, and if you take one look at the AAA gaming landscape, you should understand that hyper-optimized code isn't as valuable as it used to be

but what will be valuable is owning the APIs with the most use and liquidity. and yes, a lot of those will use public blockchains

things are going to accelerate and get very weird very quickly from here
235 replies · 178 reposts · 2K likes · 239K views
Sean Bergman@sbergman·
I was an engaged listener of this @CogRev_Podcast with @DanielMiessler about his personal AI infrastructure built on Claude Code with a security mindset. Recommend anyone interested in personal agents check this out! Looking forward to exploring CC in a new way with PAI.
Nathan Labenz@labenz

I just recorded this episode of the podcast this week and made a late New Year's resolution to create some of my own Personal AI Infrastructure I've used Claude Code a lot, but almost exclusively for ... code, and I've used many products including @TaskletAI (which I love) for agentic automation, but there's always more to learn! cognitiverevolution.ai/pioneering-pai…

1 reply · 3 reposts · 4 likes · 996 views
Sean Bergman@sbergman·
I can see this with people asking simple things or asking AI to do tasks on iPhones that won’t need a frontier model, but I think most enterprise IT depts will want to use APIs instead of running local GPU clusters and local models for the most part, based on what I hear at my company.
0 replies · 0 reposts · 2 likes · 298 views
Jordi Visser@jvisserlabs·
16/ The lesson: The signal on AI infrastructure is on X and in long-form podcasts. Baker's interview was 90 minutes of alpha hiding in plain sight. The All-In pod showed the thesis playing out in real-time. If you're waiting for sell-side research to tell you this, you're already late. Probabilities are changing at an exponential pace. Adapt or die.
26 replies · 13 reposts · 200 likes · 24.4K views
Jordi Visser@jvisserlabs·
The Edge AI Bear Case is Playing Out in Real-Time 1/ If you're not on X and listening to podcasts, you're behind on the speed of AI. Mainstream financial media is 6-12 months behind what's actually happening. Case in point: @GavinSBaker warned us about the "scariest bear case" for AI infrastructure less than 2 months ago. It's happening right now.
96 replies · 133 reposts · 1.1K likes · 489.2K views
Sean Bergman@sbergman·
@PieterMaes This interests me a lot. Do you have a public github repo for the code Claude Code generated to do some of this?
0 replies · 0 reposts · 0 likes · 23 views
Pieter Maes@PieterMaes·
Lie to yourself → Believe it → Lie to others → They believe you: trails.pieterma.es/trail/useful-l… I let Claude Code loose (Ralph-style) in a library of 100 books and had it find interesting connections. The result is this collection of syntopic trails: trails.pieterma.es
3 replies · 0 reposts · 22 likes · 463 views
Sean Bergman@sbergman·
@TheZvi Been reading so many threads on what people’s agents are doing. This is mind blowing. An agent getting phone access and a voice API and calling its human after they wake in the morning to communicate!?!
0 replies · 0 reposts · 2 likes · 507 views
Jordi Visser@jvisserlabs·
As X is flooded with people trying to pick the bottom of software and the end of this trade (SMH over IGV), they should be focused on buying the next leg of the scarcity over abundance AI trade and be long XLE over IGV. First inning. Many IGV rotation body blows to come.
[two images attached]
21 replies · 27 reposts · 325 likes · 27.6K views
Sean Bergman@sbergman·
@labenz Fascinating! Thanks for sharing this. Very hopeful for more Alzheimer’s discoveries using AI.
0 replies · 0 reposts · 1 like · 22 views
Sean Bergman@sbergman·
@DanielMiessler I really enjoyed this. Thank you for sharing. Got me to thinking about using Claude Code in new ways, plus I love reading and learning about books! Great combo. Fun area to explore.
0 replies · 0 reposts · 1 like · 950 views
Sean Bergman retweeted
Reads with Ravi@readswithravi·
“The Pathless Path” by Paul Millerd. An inspiring, encouraging, and life-changing read. This book is an invitation to a curious, creative, and meaningful life for anyone reflecting on the impact work has on their life. It offers new learnings and self-discovery. 5 lessons from the book:
[image attached]
16 replies · 175 reposts · 1.5K likes · 87.2K views
Sean Bergman@sbergman·
Increased context please, especially after compacting while trying to get a task done. Claude Code also tends to want to compact a lot sooner than it needs to. Compaction does seem to work better now, but it has too little room to work with afterward. Opus 4.5 is otherwise such a leap forward in CC!
0 replies · 0 reposts · 1 like · 36 views
Alex Albert@alexalbert__·
Reply with all your Opus 4.5 gripes so we can fix everything before our next model. The more specific (including prompts), the more likely we'll be able to fix it!
890 replies · 59 reposts · 2.1K likes · 302.6K views
Julien Bittel, CFA@BittelJulien·
I wanted to give everyone something meaningful, a gift…

This comes from Global Macro Investor (GMI) and a deep, long-running body of research developed by @RaoulGMI and myself. Many of you already know The Everything Code, which is our framework for understanding the macro landscape and why major central banks are debasing their currencies to manage aging demographics and overwhelming debt loads.

I call this a gift because these four charts, while only scratching the surface of The Everything Code, give you the big-picture context you actually need in moments like this. They stop you from getting lost in every Bitcoin pullback and explain why Raoul and I never panic, even when, to borrow one of his expressions, everyone’s acting like monkeys throwing poo at each other. Once you understand The Everything Code, you stop trading short-term noise and expand your time horizon. You cannot unsee it.

The starting point is what we call The Magic Formula: GDP growth = population growth + productivity growth + debt growth. Population growth and productivity growth have been falling for decades. Debt growth is the only thing filling the gap.

The private sector has been deleveraging since 2008, mainly households, but debt levels are still around 120% of GDP. The public sector sits at roughly the same level.

Here’s the problem… If the government is running debt at 100% of GDP and the private sector is sitting on another 100%, and for simple math we call rates 2% even though they are really closer to 4%, then the entire 2% trend growth of the economy is being consumed by servicing private-sector debts. That is a completely unproductive use of GDP. And then there’s the issue of public-sector debts. There’s just not enough organic growth to service the existing debt load.

To understand why this dynamic persists, you need demographics. Birth rates peaked in the late 1950s and have been declining ever since. This shows up about sixteen years later in the labor force participation rate as each generation enters the workforce (chart 1). That means the labor force participation rate is not going to rise any time soon. It is set to keep drifting lower.

This is a structural problem. Aging populations, falling birth rates, and rapidly expanding automation make the backdrop even more deflationary. AI and robotics are replacing humans at scale, and we are only at the beginning. This reinforces the need for ongoing stimulus to keep the system functioning. With weak population growth and sluggish productivity, the only way to keep GDP expanding is through debt.

Now here’s where it gets interesting… Government debt growth is completely offsetting the demographic decline and policymakers know exactly what they are doing (chart 2). And what happens next? All debt growth in excess of GDP gets monetized (chart 3). Basically, since 2008, magic money has effectively been paying the interest. Governments issue new debt to cover old interest, and once rates fall enough, central banks absorb it onto their balance sheets.

So to wrap this up, demographics drive the decline in the labor force. Governments offset that decline with more debt. That debt eventually gets monetized through quantitative easing (QE) style operations, not always directly by the Fed, but through the coordinated ecosystem of the Fed, the Treasury, and the banking system. And the bottom line is that there’s still a massive wall of interest that needs to be monetized, far more than GDP can ever cover. Liquidity is literally the only game in town.

And what thrives in a world of perpetual debasement? Bitcoin (chart 4).

I know this correction has been painful, but it’s all part of the journey. These periods feel brutal in the moment, then they fade and the trend resumes. This too shall pass… To quote Walter White from Breaking Bad, later echoed by @LynAldenContact, nothing stops this train. MOAR COWBELL (liquidity) = number go up over time. Zoom out and be more bullish…
[four images attached: charts 1-4]
194 replies · 562 reposts · 3.3K likes · 362.3K views
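The Magic Formula arithmetic in the thread can be checked in a few lines (the figures are the thread's own round numbers, not real data):

```python
# Sketch of the "Magic Formula" arithmetic from the thread:
#   GDP growth = population growth + productivity growth + debt growth,
# using the quoted simplifications: each sector's debt at ~100% of GDP
# and a 2% rate (the thread notes ~4% is closer to reality).

trend_growth = 0.02          # assumed real GDP trend growth
rate = 0.02                  # simplified interest rate
private_debt_to_gdp = 1.00   # private-sector debt ~100% of GDP
public_debt_to_gdp = 1.00    # public-sector debt ~100% of GDP

# Share of GDP consumed by interest in each sector:
private_service = private_debt_to_gdp * rate
public_service = public_debt_to_gdp * rate

# Private-sector service alone consumes the entire trend growth...
print(private_service == trend_growth)
# ...so public-sector interest has no organic growth left to cover it,
# and the shortfall must be filled by new debt (the "debt growth" term):
shortfall = private_service + public_service - trend_growth
print(shortfall)
```

This is exactly the thread's claim in code form: at these round numbers, 2% of GDP in interest must be covered by fresh debt issuance rather than growth.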
Sean Bergman@sbergman·
@jeremyphoward Yay! Will be interesting to see if its more efficient token usage ends up being a cost savings over Sonnet with the longer-context notebooks.
0 replies · 0 reposts · 0 likes · 173 views
Jeremy Howard@jeremyphoward·
We just decided to give all students in our "How to Solve it With Code" course free access to Opus 4.5 for the rest of this year. Although the course has started already, you can still sign up and catch up with the recordings here: solve.it.com
[image attached]
19 replies · 47 reposts · 553 likes · 95.7K views