JWack

19.8K posts

[banner image]

JWack
@JaredWackerly

@DynastyNerds Co-Owner | 💼 Sales Manager Wesco Distribution 🚚 | Dad to 3 👧👧👦 | Former D1 QB 🏈

Ohio, USA · Joined September 2018
1.3K Following · 10.8K Followers
JWack reposted
RCsWorld
RCsWorld@RCsWrld·
These 2 just went on a 10-0 run in a playoff game
[image]
140 · 9.7K · 106.9K · 2.2M
Joe Bryant
Joe Bryant@Football_Guys·
We all walk with a limp. Loss and heartbreak suck. Some thoughts.

Today would have been my little brother Rich's birthday. I have posted this in years past and have found it's helpful to some. And to be clear, I'm OK. I'm not special or unique in this. This isn't a "signal," and I don't mean to be dramatic. But it's also real, and it felt significant. I've said something like this on this day in past years, and I'm saying it again. And to be clear, this is not about me or my brother. This is maybe about you and overcoming loss or heartbreak.

One of the challenging things about this day was calling my Dad in the morning. He'd be expecting my call and he'd usually answer with a crack in his voice, saying, "Hey, Joe..." And then maybe he'd be able to get out something like, "It's difficult, isn't it?..." I usually was fine until I heard his voice, and then I wouldn't be able to say much besides, "Yeah, Dad. Yeah, it is." And then we'd hang up.

With my dad now gone, this is the third year on Rich's birthday where I haven't been able to make that call. But my feeling of loss is offset today by my faith that they are reunited now. When my Dad passed, I remember thinking his broken heart, never fully healed since his son died, would finally be whole again. And I'm happy for that.

I'm reminded of words from one of my favorite writers, Anne Lamott: "You will lose someone you can't live without, and your heart will be badly broken, and the bad news is that you never completely get over the loss of your beloved. But this is also the good news. They live forever in your broken heart that doesn't seal back up. And you come through. It's like having a broken leg that never heals perfectly – that still hurts when the weather gets cold, but you learn to dance with the limp."

I don't know how much I've learned. I am sure I have a limp. Most of us do. Again, I'm not special or unique in this. I don't mean to be dramatic here and I'm OK, but this is on my mind this morning. And especially on my mind are the folks out there reading this who are not as far down the road past a loss. You may, in fact, not be fine today. Maybe you're not yet to the point where the broken heart heals enough to just be a limp. And you're not ready to dance.

If that's you, I hope and pray you get there. I hope and pray you get to that next version of yourself that can dance as best you can with a limp. Peace and Grace to you.
[image]
15 · 5 · 149 · 12.6K
JWack
JWack@JaredWackerly·
Microsoft is the most confusing ecosystem and yet most large companies are all embedded in it.
0 · 0 · 2 · 972
JWack
JWack@JaredWackerly·
@bcherny can we get a token reset to explore
0 · 0 · 0 · 92
Boris Cherny
Boris Cherny@bcherny·
Opus 4.7 is in Claude Code today. It's more agentic, more precise, and a lot better at long-running work. It carries context across sessions and handles ambiguity much better.
Claude@claudeai

Introducing Claude Opus 4.7, our most capable Opus model yet. It handles long-running tasks with more rigor, follows instructions more precisely, and verifies its own outputs before reporting back. You can hand off your hardest work with less supervision.

382 · 182 · 3.1K · 231K
JWack
JWack@JaredWackerly·
@claudeai can we get an early token reset?
0 · 0 · 0 · 65
Claude
Claude@claudeai·
Introducing Claude Opus 4.7, our most capable Opus model yet. It handles long-running tasks with more rigor, follows instructions more precisely, and verifies its own outputs before reporting back. You can hand off your hardest work with less supervision.
[image]
4.8K · 10.3K · 81.3K · 13.7M
JWack
JWack@JaredWackerly·
@dkare1009 Exactly. It’s complicated for 98% of people. Microsoft is the most disjointed ecosystem.
0 · 0 · 0 · 55
Dhairya
Dhairya@dkare1009·
Microsoft is quietly taking over the Enterprise AI Agent stack. And most people have only seen ~10% of it.

Everyone talks about Copilot. Some know Azure. A few use GitHub Copilot. But underneath, Microsoft has built a full-stack AI ecosystem, from models → to agents → to governance. Here's the full breakdown 👇

📌 1. Models (the brain): Azure GPT-5.1, Phi-4, MAI-1, KOSMOS-2, Florence 2, MAI-Voice. This is the intelligence layer powering everything.
📌 2. Frameworks (the builder layer): Semantic Kernel, AutoGen, TaskWeaver, Agent Framework. These are what let you actually build AI agents.
📌 3. Responsible AI (the guardrails): Azure AI Content Safety, Purview, Defender, Entra. Security + governance baked in from day one.
📌 4. Productivity (the distribution): Excel, Teams, Outlook, PowerPoint. AI is not a feature; it's embedded in daily workflows.
📌 5. Image & Video (creative layer): Designer, Clipchamp, Copilot Image. Content creation, fully inside the ecosystem.
📌 6. Coding (developer layer): GitHub Copilot, VS Code, Azure AI Toolkit. From writing code to deploying, AI is everywhere.
📌 7. AI Agents: Microsoft Copilot, SharePoint Knowledge Agents, Copilot Studio, Dynamics 365, Power Platform, Viva Learning Agent, Edge Copilot, Security Copilot. The autonomous layer that ties the entire ecosystem together.

And this is just the outer surface of the Microsoft core offering. If we start to dive deeper into Azure AI, the layers go even deeper. This shows Microsoft's commitment to helping enterprises adopt agentic AI: not only do they make it very easy with no-code tools like Power Platform, they also allow you to customize and build custom agents using their agent frameworks and tools.

Save 💾 ➞ React 👍 ➞ Share ♻️
[GIF]
7 · 50 · 274 · 16.3K
Andrej Karpathy
Andrej Karpathy@karpathy·
LLM Knowledge Bases

Something I'm finding very useful recently: using LLMs to build personal knowledge bases for various topics of research interest. In this way, a large fraction of my recent token throughput is going less into manipulating code, and more into manipulating knowledge (stored as markdown and images). The latest LLMs are quite good at it. So:

Data ingest: I index source documents (articles, papers, repos, datasets, images, etc.) into a raw/ directory, then I use an LLM to incrementally "compile" a wiki, which is just a collection of .md files in a directory structure. The wiki includes summaries of all the data in raw/, backlinks, and then it categorizes data into concepts, writes articles for them, and links them all. To convert web articles into .md files I like to use the Obsidian Web Clipper extension, and then I also use a hotkey to download all the related images locally so that my LLM can easily reference them.

IDE: I use Obsidian as the IDE "frontend" where I can view the raw data, the compiled wiki, and the derived visualizations. Important to note that the LLM writes and maintains all of the data of the wiki; I rarely touch it directly. I've played with a few Obsidian plugins to render and view data in other ways (e.g. Marp for slides).

Q&A: Where things get interesting is that once your wiki is big enough (e.g. mine on some recent research is ~100 articles and ~400K words), you can ask your LLM agent all kinds of complex questions against the wiki, and it will go off, research the answers, etc. I thought I had to reach for fancy RAG, but the LLM has been pretty good about auto-maintaining index files and brief summaries of all the documents, and it reads all the important related data fairly easily at this ~small scale.

Output: Instead of getting answers in text/terminal, I like to have it render markdown files for me, or slide shows (Marp format), or matplotlib images, all of which I then view again in Obsidian. You can imagine many other visual output formats depending on the query. Often, I end up "filing" the outputs back into the wiki to enhance it for further queries, so my own explorations and queries always "add up" in the knowledge base.

Linting: I've run some LLM "health checks" over the wiki to e.g. find inconsistent data, impute missing data (with web searches), find interesting connections for new article candidates, etc., to incrementally clean up the wiki and enhance its overall data integrity. The LLMs are quite good at suggesting further questions to ask and look into.

Extra tools: I find myself developing additional tools to process the data. E.g. I vibe coded a small and naive search engine over the wiki, which I both use directly (in a web UI) and, more often, hand off to an LLM via CLI as a tool for larger queries.

Further explorations: As the repo grows, the natural desire is to also think about synthetic data generation + finetuning to have your LLM "know" the data in its weights instead of just context windows.

TLDR: raw data from a number of sources is collected, then compiled by an LLM into a .md wiki, then operated on by various CLIs by the LLM to do Q&A and to incrementally enhance the wiki, all of it viewable in Obsidian. You rarely ever write or edit the wiki manually; it's the domain of the LLM. I think there is room here for an incredible new product instead of a hacky collection of scripts.
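The post doesn't share its code, but the "small and naive search engine over the wiki" idea can be sketched in a few lines: build a bag-of-words index over a directory of .md files and rank documents by length-normalized query-term frequency. All names here (`build_index`, `search`, the `wiki_dir` layout) are illustrative assumptions, not from the post.

```python
"""Minimal sketch of a naive search engine over a markdown wiki.
Hypothetical implementation; the original post describes the idea only."""
import re
from collections import Counter
from pathlib import Path


def build_index(wiki_dir):
    """Map each .md file under wiki_dir to a Counter of its lowercase tokens."""
    index = {}
    for path in Path(wiki_dir).rglob("*.md"):
        tokens = re.findall(r"[a-z0-9]+", path.read_text(encoding="utf-8").lower())
        index[str(path)] = Counter(tokens)
    return index


def search(index, query, top_k=5):
    """Score each document by summed query-term frequency, normalized by
    document length, and return the top_k matching file paths."""
    terms = re.findall(r"[a-z0-9]+", query.lower())
    scored = []
    for path, counts in index.items():
        total = sum(counts.values()) or 1  # avoid dividing by zero on empty files
        score = sum(counts[t] for t in terms) / total
        if score > 0:
            scored.append((score, path))
    return [p for _, p in sorted(scored, reverse=True)[:top_k]]
```

Exposed as a small CLI, this is the kind of tool an LLM agent can call for "larger queries" instead of reading every file; swapping in TF-IDF or BM25 scoring would be the obvious next step at larger scale.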
2.8K · 6.9K · 57.7K · 20.7M
JWack reposted
Casual Sports Fan
Casual Sports Fan@bigsportscasual·
LeBron was really just a kid terrorizing the NBA
139 · 1.7K · 13.8K · 1.1M
JWack reposted
That Guy Rocked
That Guy Rocked@NBAGuyRocked·
Zydrunas Ilgauskas. That guy rocked.
137 · 850 · 7.8K · 937.9K
Robert Griffin III
Robert Griffin III@RGIII·
Our kids wanted to watch sports today and our 8 year old looks at me and says, “Daddy, I want to watch you.” 🥹 They proceeded to watch an hour straight of Daddy’s highlights on YouTube. It was a sweet moment that just might be the kick in the butt Daddy needed to make a comeback.
111 · 36 · 971 · 107.2K
JL Garofalo
JL Garofalo@JL_Garofalo·
Some JL news: I’m no longer in my CTO role at @bdge__. No drama here — AI is changing how we can build companies, and I’m taking that opportunity to get back to building my own brands and products, starting with the new @FrontYardFYF Jobs Board. Grateful to @nickercolano & the @bdge__ crew for a good time. We built some cool products together that made the business money.
[image]
9 · 6 · 72 · 11.8K
JWack reposted
Charles Lamanna
Charles Lamanna@clamanna·
Great to see the excitement around Copilot Cowork today. I have been using it in my own work for the past few weeks, and the best way to understand it is to see it in action. Sharing a short demo from my day to day here.
72 · 121 · 976 · 178.1K
JWack reposted
John Ziegler
John Ziegler@Zigmanfreud·
If you are longing for a simpler time that is now gone forever, this video will likely hit HARD… 🥲
1.1K · 13.3K · 41.1K · 5.7M
Ryan McDowell
Ryan McDowell@RyanMc23·
Just officially notified my school district that I am retiring.
[GIF]
73 · 1 · 447 · 20.9K
JWack reposted
JL Garofalo
JL Garofalo@JL_Garofalo·
If you work (or dreamed of working) in fantasy sports, betting, DFS, or sports tech, I made this to help you find opportunities across the industry. It's a curated job board, updated daily with new jobs & opportunities to shoot your shot on. FYF Jobs: jobs.frontyardfantasy.com
[image]
18 · 44 · 444 · 79.5K
JWack
JWack@JaredWackerly·
@RayGQue We used to go ham on no xplode before games and at halftime lmao
0 · 0 · 0 · 507