Kaostyl@kaostyl
My AI agent finds expired domains, scores them, builds entire websites, and deploys them to production. All while I sleep.
Here's the full pipeline 🧵
I wanted to build niche websites at scale. The bottleneck was never the content: it was finding good domains.
Expired domains with existing backlinks are gold. But every platform that tracks them either charges $200/month or hides the good data behind a UI with no API.
So I pointed my AI agent at one of these platforms and said: "Figure out how this works."
Step 1: API reverse-engineering
The agent opened the site in a headless browser, logged in, and intercepted every network request.
Within minutes it had mapped the entire hidden API:
→ Authentication flow (CSRF tokens, session cookies)
→ Search endpoints
→ Result polling mechanism
Then it wrote a standalone Python client. No browser needed anymore. Pure API calls.
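The client the agent produced looks roughly like this. A minimal sketch: the platform name, endpoint paths, and field names are placeholders (the thread doesn't name the real service), but the shape — session cookie, CSRF token from the login page, async search job, polling — matches the flow above.

```python
import re
import time
import requests

BASE = "https://domain-tool.example"  # placeholder; real platform not named


class DomainClient:
    """Sketch of the reverse-engineered client. Endpoints are assumptions."""

    def __init__(self, email: str, password: str):
        self.session = requests.Session()
        # 1. Fetch the login page: server sets the session cookie
        #    and embeds a CSRF token in the form
        resp = self.session.get(f"{BASE}/login")
        token = self._extract_csrf(resp.text)
        # 2. Authenticate; the token must match the session cookie
        self.session.post(
            f"{BASE}/login",
            data={"email": email, "password": password, "csrf_token": token},
        )

    @staticmethod
    def _extract_csrf(html: str) -> str:
        # Hypothetical: token sits in a hidden form field
        match = re.search(r'name="csrf_token" value="([^"]+)"', html)
        return match.group(1) if match else ""

    def search(self, keyword: str) -> str:
        # Kick off an async search; the platform returns a job id to poll
        resp = self.session.post(f"{BASE}/api/search", json={"q": keyword})
        return resp.json()["job_id"]

    def poll(self, job_id: str) -> list[dict]:
        # Result polling mechanism: loop until the job reports done
        while True:
            body = self.session.get(f"{BASE}/api/search/{job_id}").json()
            if body["status"] == "done":
                return body["results"]
            time.sleep(2)
```

Once this exists, the browser is dead weight — every later step talks to the API directly.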
Step 2: Automated domain hunting
I gave the agent a list of 30+ keywords across different niches.
Every night at 2 AM, it:
→ Searches all keywords in parallel
→ Deduplicates results across runs
→ Pulls backlink metrics for each domain
→ Cross-references with a second API for keyword difficulty and search volume
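The nightly loop is simple to sketch. `fetch_domains` below is a stub standing in for the Step 1 client, and the JSON file is an assumed dedupe store — the real agent may persist results differently — but the parallel fan-out and cross-run dedup work like this:

```python
import asyncio
import json
import pathlib

SEEN_FILE = pathlib.Path("seen_domains.json")  # hypothetical dedupe store


async def fetch_domains(keyword: str) -> list[str]:
    # Stub standing in for the reverse-engineered client;
    # a real call would search + poll the hidden API
    await asyncio.sleep(0.01)  # simulate network latency
    return [f"{keyword}-example.com"]


async def nightly_run(keywords: list[str]) -> set[str]:
    # Search all keywords in parallel
    batches = await asyncio.gather(*(fetch_domains(k) for k in keywords))
    found = {d for batch in batches for d in batch}
    # Deduplicate against every previous run
    seen = set(json.loads(SEEN_FILE.read_text())) if SEEN_FILE.exists() else set()
    fresh = found - seen
    SEEN_FILE.write_text(json.dumps(sorted(seen | fresh)))
    return fresh  # only never-before-seen domains move on to scoring
```

A cron entry like `0 2 * * * python3 /opt/agent/nightly.py` (path hypothetical) gives the "every night at 2 AM" trigger.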
Step 3: Scoring algorithm
Not all expired domains are equal. The agent scores each one:
→ Backlink quality (referring domains, authority)
→ Keyword relevance (does the domain match the niche?)
→ Price filter (auto-reject anything above my budget)
→ Red flags (spammy history, penalized domains)
Top domains get flagged. I review 5-10 candidates instead of scrolling through hundreds.
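The four criteria above compose naturally into a score with hard filters. The weights, field names, and thresholds here are illustrative assumptions — the thread doesn't publish the actual formula — but the structure (auto-reject on price or spam, then rank the survivors) is the one described:

```python
from dataclasses import dataclass, field


@dataclass
class Candidate:
    domain: str
    referring_domains: int
    authority: int                      # 0-100 metric from the backlink API
    niche_keywords: list = field(default_factory=list)
    price: float = 0.0
    spam_score: int = 0                 # 0-100; higher = spammier

MAX_PRICE = 50.0   # hypothetical budget cap
SPAM_LIMIT = 30    # hypothetical red-flag threshold


def score(c: Candidate) -> float:
    # Hard filters first: over budget or spammy history = auto-reject
    if c.price > MAX_PRICE or c.spam_score > SPAM_LIMIT:
        return 0.0
    # Backlink quality: blend authority with referring-domain count (capped)
    backlinks = 0.6 * c.authority + 0.4 * min(c.referring_domains, 100)
    # Keyword relevance: bonus when the domain itself matches the niche
    relevance = 25 if any(k in c.domain for k in c.niche_keywords) else 0
    return backlinks + relevance


def shortlist(candidates: list[Candidate], n: int = 10) -> list[Candidate]:
    # Flag only the top scorers: 5-10 to review instead of hundreds
    ranked = sorted(candidates, key=score, reverse=True)
    return [c for c in ranked if score(c) > 0][:n]
```

The hard filters doing the rejection (rather than just lowering the score) is what keeps the review queue short.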
Step 4: Domain purchase
Once I approve a domain, the agent:
→ Checks availability via registrar API
→ Purchases it (hard budget cap, no surprises)
→ Points DNS to Cloudflare automatically
→ Configures SSL, caching, security headers
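A sketch of that purchase path, with the budget cap enforced before any money moves. The registrar endpoint is entirely hypothetical; the Cloudflare zone-creation call matches Cloudflare's public v4 API, but treat the whole thing as a shape, not a drop-in script:

```python
import os
import requests

BUDGET_CAP = 50.0  # hard cap: the agent can never exceed this


def buy_domain(domain: str, price: float) -> bool:
    """Purchase + DNS handoff sketch. Registrar endpoint is a placeholder."""
    if price > BUDGET_CAP:
        return False  # no surprises: reject before any payment call
    # 1. Purchase via the registrar's API (hypothetical endpoint)
    requests.post(
        "https://registrar.example/api/purchase",
        json={"domain": domain},
        headers={"Authorization": f"Bearer {os.environ['REGISTRAR_KEY']}"},
        timeout=30,
    )
    # 2. Add the zone to Cloudflare so DNS, SSL, and caching live there
    #    (real endpoint: POST /client/v4/zones)
    requests.post(
        "https://api.cloudflare.com/client/v4/zones",
        json={"name": domain},
        headers={"Authorization": f"Bearer {os.environ['CF_TOKEN']}"},
        timeout=30,
    )
    return True
```

The cap check living in code, not in a prompt, is the point: the model can't talk itself into overspending.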
Step 5: Site deployment
This is where it gets crazy.
The agent:
→ Generates a full site structure (pages, content, internal linking)
→ Uploads everything to the server via API
→ Adds mandatory security headers (CSP, HSTS, X-Frame-Options)
→ Deploys a sitemap
→ Submits the site to Google Search Console
→ Verifies ownership automatically
From "domain purchased" to "live site indexed by Google": zero manual steps.
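One concrete piece of that deploy step, sketched with the standard library: generating the sitemap the agent uploads before submitting to Search Console. The page list is illustrative, but the XML follows the sitemaps.org protocol:

```python
from datetime import date
from xml.etree import ElementTree as ET


def build_sitemap(domain: str, paths: list[str]) -> str:
    # Generate the sitemap deployed alongside the site
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for path in paths:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = f"https://{domain}{path}"
        ET.SubElement(url, "lastmod").text = date.today().isoformat()
    return ET.tostring(urlset, encoding="unicode")
```

The security headers (CSP, HSTS, X-Frame-Options) are set server-side the same way: written into config by the agent, not added by hand.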
Step 6: Monitoring
After deployment, a separate cron job:
→ Checks each site is still live
→ Monitors indexing status
→ Tracks keyword rankings over time
→ Alerts me if anything breaks
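The liveness pass of that cron job can be as small as this. Site URLs are placeholders, and the HTTP function is injectable — an assumption made here so the check is testable without network, not necessarily how the author wired it:

```python
import requests


def check_sites(sites: list[str], fetch=requests.get) -> list[str]:
    """Return the subset of sites that are down or erroring."""
    down = []
    for url in sites:
        try:
            resp = fetch(url, timeout=10)
            if resp.status_code >= 400:  # 4xx/5xx counts as broken
                down.append(url)
        except Exception:
            down.append(url)  # DNS failure, timeout, TLS error, etc.
    return down
```

Anything in the returned list triggers the alert; indexing status and rankings are separate checks layered on the same loop.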
The result?
I went from manually building 1 site per week to deploying multiple sites per night.
The AI handles:
• Domain research
• Purchase
• DNS + SSL
• Content generation
• Deployment
• SEO submission
• Monitoring
I handle:
• Approving domain purchases
• Strategic decisions
• Coffee
The real insight:
Most people use AI to write blog posts faster.
I used it to build the entire infrastructure that finds, buys, builds, deploys, and monitors websites end to end.
The AI didn't just save me time. It made an entire business model viable that wasn't worth doing manually.
That's the difference between "using AI" and "building with AI."