
I Built an SEO Content Machine Using Claude Code Agents

How I used Claude Code agents, context files, and slash commands to automate SEO for my Astro portfolio site: keywords, linking, and content audits.

Tags: dev, ai, claude, seo, astro

I have a portfolio site at bdigitalmedia.io that brings in videography and photography clients here in Gilbert, AZ. Two blog posts. A handful of pages. Decent content, but no SEO strategy behind it. I was writing posts when I felt like it, linking where it seemed natural, and hoping Google would figure out the rest.

That’s not a strategy. That’s a hope.

So I built an SEO content machine: a set of context files, AI agents, and slash commands inside Claude Code that systematize everything from keyword targeting to internal linking to content audits. The whole system took an afternoon to build, and now every blog post I write gets tuned for search before I even think about publishing.

Analytics dashboard showing website traffic and SEO metrics on a computer screen

The Problem With Ad-Hoc SEO

Here’s what my blog workflow looked like before: I’d open a markdown file, write about something I did (a shoot, a product launch, a new feature I built) and publish it. Sometimes I’d remember to add a meta description. Sometimes I’d throw in a few internal links. Keyword research was nonexistent. I’d Google “SEO checklist” every time, skim it, forget half of it, and ship.

The result was predictable. My Wolfbox dashcam product video post has decent content but zero keyword strategy. The football video services post is 3,000 words of useful information with ad-hoc internal linking.

Good content doesn’t matter if nobody finds it. I needed a system.

What I Built

The whole thing is inspired by TheCraigHewitt/seomachine, an open-source project that uses Claude Code context files to build SEO workflows. I adapted the concept for my specific site: a videography/photography portfolio with storefronts for video clips and photo galleries.

The system has three layers:

Context files — four markdown files in a context/ directory that encode everything Claude Code needs to know about my brand voice, target keywords, internal link structure, and on-page SEO rules.

Agents — three specialized Claude Code agents that each handle one aspect of SEO analysis: scoring, keyword mapping, and internal linking.

Commands — two slash commands (/write and /optimize) that chain the agents into complete workflows for creating new posts and auditing existing ones.

Code displayed on a dark computer screen in a development environment

Context Files: Teaching Claude My Brand

This is the foundation. Without context files, Claude writes generic SEO content. With them, it writes like me, targets my actual keywords, and links to my actual pages.

Brand Voice

I read through my existing posts and extracted the patterns into context/brand-voice.md. First person. Direct. Short paragraphs. No buzzwords. Specific examples (names, locations, gear models, dollar amounts). Em dashes everywhere. No exclamation points.

The file has explicit DO/DON’T rules. “Don’t use ‘we’” because I’m a solo operator. “Don’t use stock-photo language like ‘capturing moments’” because it’s the opposite of how I write. “State prices directly on the page” because pricing transparency is a brand differentiator.
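To make the shape concrete, here's a condensed sketch of what a file like this can look like. The rules are paraphrased from what I describe above; the wording and structure of my actual file differ.

```markdown
# Brand Voice

## DO
- Write in first person ("I") — solo operator, never "we"
- Keep paragraphs short and direct
- Use specific details: names, locations, gear models, dollar amounts
- State prices directly on the page

## DON'T
- No buzzwords or stock-photo language ("capturing moments")
- No exclamation points
- No vague claims where a specific example would work
```

Plain markdown is enough. Claude Code doesn't need a schema here, just unambiguous rules it can check a draft against.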

Every post Claude writes now goes through this filter. The voice stays consistent whether I’m writing about a dashcam shoot or a football game.

Target Keywords

context/target-keywords.md organizes keywords by service category: Youth Sports Video, Event Photography, Brand Content, Drone/Aerial, Trail Running, Local/General, and Highlight Reels. Each category has primary keywords (high intent, for titles and H2s), secondary keywords (supporting, for body text), and long-tail keywords (low competition, natural in content flow).

The key insight here: my site serves multiple audiences. A parent looking for “buy football game clips” is a completely different search intent than a brand looking for “product video production Phoenix.” Organizing keywords by category means Claude can identify the right keyword set based on the post topic, not just spray generic terms everywhere.
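A sketch of one category section, using keywords mentioned in this post. The exact keyword lists in my real file are longer and specific to my market.

```markdown
# Target Keywords

## Youth Sports Video
- Primary (titles, H2s): youth sports videographer Gilbert AZ,
  football game video
- Secondary (body text): game film, highlight clips, team video packages
- Long-tail (natural in flow): buy football game clips of my kid

## Brand Content
- Primary: product video production Phoenix
- Secondary: brand video, commercial videographer
```

The category header is what lets the agents match a post topic to the right keyword set instead of pulling from the whole list.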

Internal Link Map

This one changed how I think about content. context/internal-links-map.md lists every linkable page on my site with the URL, a description, and suggested anchor text variations.

The real value is in the rules: every post needs at minimum one storefront link (/clips or /pics), one contact link (/contact), and one portfolio link (/ or /work). Link on first mention only. 3-5 internal links per post is the sweet spot. Deep link to specific galleries when the context supports it.

Before this file existed, I was linking randomly. Now there’s a structure, and the agents enforce it.
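Here's a condensed sketch of the structure. The page entries and anchor text below are illustrative; the real file covers every linkable page on the site.

```markdown
# Internal Links Map

## Required in every post
- One storefront link: /clips or /pics
- One contact link: /contact
- One portfolio link: / or /work

## Rules
- Link on first mention only
- 3-5 internal links per post
- Deep link to specific galleries when context supports it

## Pages
### /clips — video clip storefront
Anchor text: "browse the clip store", "game clips", "buy your clips"

### /contact — booking and inquiry form
Anchor text: "reach out", "get in touch", "book a shoot"
```

The "Required in every post" section is what the internal-linker agent enforces; everything else is guidance.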

SEO Guidelines

context/seo-guidelines.md is the on-page SEO rulebook, written for my Astro site. Title tag rules (50-60 characters, keyword near the front). Meta description rules (150-160 characters, primary keyword plus geographic modifier). Content structure (no H1 in body because the template handles it, 3-6 H2 sections, proper heading nesting). Keyword density targets (0.5-1.5% for the primary keyword). Schema markup rules (BlogPosting is auto-injected, add VideoObject for YouTube embeds manually).

This file is the checklist I used to Google every time, except now it’s permanent and Claude reads it automatically.
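Most of these rules are mechanical enough to check with a few lines of code. Here's a minimal sketch of the title and meta description length checks; the function name and return format are my own for illustration, not part of the system.

```python
def check_lengths(title: str, description: str) -> list[str]:
    """Check title and meta description against the on-page SEO length rules.

    Returns a list of human-readable issues; an empty list means both pass.
    Targets: title 50-60 chars, meta description 150-160 chars.
    """
    issues = []
    if not 50 <= len(title) <= 60:
        issues.append(f"title is {len(title)} chars (target 50-60)")
    if not 150 <= len(description) <= 160:
        issues.append(f"description is {len(description)} chars (target 150-160)")
    return issues
```

In practice the seo-optimizer agent does this check by reading the frontmatter directly, but the logic is the same.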

The Agents

Claude Code agents are markdown files that define a specialized task with specific tools. Each agent reads the relevant context files and produces a structured analysis.

SEO Optimizer

The seo-optimizer agent reads a blog post and all four context files, then scores the post on a 0-100 scale across ten weighted categories:

  • Title Tag (10 pts) — length, keyword presence, keyword position
  • Meta Description (10 pts) — length, keyword, geographic modifier, compelling snippet
  • Heading Structure (10 pts) — no body H1, 3-6 H2s, proper nesting, keywords in headings
  • Keyword Tuning (15 pts) — first paragraph placement, density, secondary keywords, geographic modifiers
  • Internal Links (10 pts) — minimum count, required links present, anchor text quality
  • External Links (5 pts) — count, authority of linked sources
  • Content Quality (15 pts) — word count, brand voice match, paragraph length, opening hook, closing CTA
  • Images & Media (10 pts) — OG image, alt text, lazy loading, CSS classes
  • Schema Markup (5 pts) — VideoObject for embeds, no duplicate BlogPosting
  • Tags & Slug (10 pts) — tag count, established tags, keyword-rich slug

Every issue comes with a specific line reference and a concrete fix. “Add ‘videographer Gilbert AZ’ to the opening paragraph (line 9),” not “consider adding more keywords.”
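The agent itself is just a markdown file with YAML frontmatter. A condensed sketch (the frontmatter fields follow Claude Code's agent format; the body wording here is illustrative):

```markdown
---
name: seo-optimizer
description: Scores a blog post 0-100 against the SEO context files
tools: Read, Glob, Grep, Bash
---

Read the target post and all four files in context/. Score each
category against its weight. For every issue found, cite the line
number in the post and give a concrete fix, not a vague suggestion.
Output the total score, a per-category breakdown, and a prioritized
fix list.
```

The specific instructions in the body are what separate "consider adding keywords" from "add this keyword on this line."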

Internal Linker

The internal-linker agent inventories every link in the post, checks against the required links from the links map, scans for unlinked mentions of things that have pages, and flags over-linking. It also checks other existing posts for cross-linking opportunities: places where the new post should be linked from older content.

Keyword Mapper

The keyword-mapper agent identifies the target service category, maps every keyword occurrence, checks placement in high-value positions (title, description, first paragraph, H2s, closing), calculates density, and flags both gaps and stuffing. If a post mentions “Gilbert AZ” once but needs it 2-3 times, the agent tells me exactly where to add it.
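The density calculation is the only math in the system. Here's a minimal sketch of one common way to compute it (occurrences of the keyword times words in the keyword, divided by total words); the function names are mine, and the agent's actual method may differ.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percent of words in `text` accounted for by occurrences of `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    # Match the full phrase against the normalized word stream,
    # with word boundaries so "az" can't match inside "lazy".
    pattern = r"\b" + re.escape(keyword.lower()) + r"\b"
    hits = len(re.findall(pattern, " ".join(words)))
    return 100.0 * hits * len(keyword.split()) / len(words)

def in_target_range(density: float, lo: float = 0.5, hi: float = 1.5) -> bool:
    """True if density falls inside the 0.5-1.5% band from seo-guidelines.md."""
    return lo <= density <= hi
```

For a 1,500-word post, the 0.5-1.5% band works out to roughly 4 to 11 occurrences of a two-word primary keyword, which is why the agent can say "add it twice more" with confidence.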

Lines of code on a computer screen in a dark development environment

The Commands

This is where it all comes together. Two slash commands that I can run from the terminal.

/write — Full Blog Post Generator

Running /write "drone videography for real estate" does this:

  1. Reads all four context files
  2. Identifies the service category (Drone/Aerial) and picks primary, secondary, and geographic keywords
  3. Checks existing posts to avoid duplication
  4. Runs a web search if the topic needs current information
  5. Writes a full 1,500-3,000 word blog post in my brand voice with proper frontmatter, internal links, and schema markup
  6. Saves the post as a draft
  7. Runs the SEO optimizer agent. If the score is below 85, applies the priority fixes automatically
  8. Runs the internal linker agent. Applies any missing required links
  9. Reports the file path, SEO score, word count, and a checklist of remaining manual tasks (like creating the OG image)
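The steps above live in the command file itself. Here's a condensed sketch of .claude/commands/write.md (the frontmatter fields and $ARGUMENTS placeholder follow Claude Code's command format; the wording is illustrative):

```markdown
---
description: Generate an SEO-optimized blog post draft
allowed-tools: Read, Glob, Grep, Write, Edit, WebSearch, Bash
---

Write a blog post about: $ARGUMENTS

1. Read all four files in context/.
2. Identify the service category; pick primary, secondary,
   and geographic keywords.
3. Check existing posts to avoid duplication.
4. Draft 1,500-3,000 words in the brand voice, with frontmatter,
   internal links, schema markup, and draft: true.
5. Run the seo-optimizer agent. If the score is below 85,
   apply the priority fixes.
6. Run the internal-linker agent. Add any missing required links.
7. Report the file path, score, word count, and remaining
   manual tasks.
```

The command is basically the workflow written down once, so I never have to hold it in my head again.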

The whole thing runs in a few minutes. I review the output, make a few voice tweaks, create the OG image, remove the draft: true flag, and publish.

/optimize — Audit Existing Posts

/optimize wolfbox-dashcam-product-video runs all three agents in parallel on an existing post, compiles a unified report with the score and prioritized fixes, then asks if I want to apply them. If I say yes, it edits the file and re-scores.

This is the one that immediately proved its value. I ran it on my existing posts and found: missing meta description keywords, no geographic modifiers, inconsistent internal linking, underused H2 keywords, and missing external authority links. All fixable in five minutes per post.

What This Changes

Before this system, I published blog posts and hoped for search traffic. Now I have a repeatable process:

  1. I pick a topic that targets a specific keyword category
  2. /write generates a draft with proper SEO mechanics baked in
  3. I review for voice and accuracy. The system handles the technical SEO; I handle the content truth
  4. /optimize catches anything the initial pass missed
  5. I create an OG image, remove the draft flag, and deploy

The context files are the real advantage. They encode decisions I’ve already made (what my brand sounds like, which keywords matter, how my site links together) so I don’t have to re-make those decisions on every post. Claude Code reads them automatically and applies them consistently.

The Technical Details

If you want to build something similar, here’s the structure:

your-site/
├── context/
│   ├── brand-voice.md          # Writing style rules
│   ├── target-keywords.md      # Keywords by category
│   ├── internal-links-map.md   # Linkable pages + anchor text
│   └── seo-guidelines.md       # On-page SEO rules
├── .claude/
│   ├── agents/
│   │   ├── seo-optimizer.md    # Scoring agent (0-100)
│   │   ├── internal-linker.md  # Cross-linking analysis
│   │   └── keyword-mapper.md   # Density + placement
│   └── commands/
│       ├── write.md            # Blog post generator
│       └── optimize.md         # SEO audit + fix

The context files are just markdown. No special format. The agents are markdown files with YAML frontmatter that specifies the agent name, description, and allowed tools (Read, Glob, Grep, and Bash for word count). The commands are similar but with a wider tool set (Write, Edit, WebSearch, and the ability to invoke agents).

Claude Code reads context files automatically when they’re in the context/ directory. Agents are invoked from commands or directly from the CLI. Commands are triggered with /<name>, like /write "topic" or /optimize slug.

The whole system is about 800 lines of markdown across 9 files. No dependencies. No build step. No API keys. Just structured instructions that Claude Code follows.

What I’d Build Next

Content calendar agent that reads target-keywords.md, checks which keyword categories have blog coverage and which don’t, and suggests topics to fill gaps. Right now I pick topics manually. This would make it systematic.

Publish-ready checker, a pre-deploy agent that verifies the OG image exists at the referenced path, checks that the slug resolves correctly in the Astro build, and confirms no draft posts are being published accidentally.

Competitor analysis to search for my target keywords, analyze what’s ranking, and identify content angles that aren’t covered. This is manual research I do occasionally but could be automated.

The Takeaway

SEO is a system problem, not a content problem. Most videographers and photographers I know write solid content about their work. What they don’t do is keyword research, internal link strategy, schema markup, or meta description tuning. It’s tedious, easy to forget, and hard to do consistently.

An AI content machine doesn’t replace the writing. It replaces the checklist. It enforces the rules you’ve already decided on. It catches the things you’d forget. And it does it the same way every time, which is exactly what SEO requires.

I wrote about building the clip storefront in an earlier post. That project is what made me realize I needed an SEO system behind it. If you’re building something similar, the seomachine project is a solid starting point. Adapt the context files to your brand, your keywords, and your site structure. The agents and commands are reusable patterns. The Astro framework’s content collections make the blog side trivial.

If you’re a videographer or photographer in the Phoenix area or anywhere else trying to figure out SEO for your portfolio site, or if you want to talk about how Claude Code fits into a creative workflow, reach out. I’m always happy to nerd out about this stuff.