Content Audit Workflow
Automatically audit your website content at scale
What you’ll build
A workflow that automatically audits all pages on your website, analyzing content quality, SEO elements, and providing AI-powered recommendations.
Time to build: 15 minutes
Nodes used: Web Scraper, Loop, LLM, Google Sheets
The problem
Manual content audits are time-consuming:
- Crawling hundreds or thousands of pages
- Extracting titles, meta descriptions, and headings
- Analyzing content quality
- Compiling results into a spreadsheet
This workflow automates the entire process.
Workflow overview
graph LR
A[URL List] --> B[Loop]
B --> C[Web Scraper]
C --> D[LLM Analysis]
D --> E[Google Sheets]
Step-by-step guide
Step 1: Prepare your URL list
First, gather the URLs you want to audit.
Option A: Use a Google Sheet with your URLs
Option B: Scrape your sitemap
For this guide, we’ll use a Google Sheet with URLs in column A.
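If you prefer Option B, one quick way to pull every URL out of a sitemap is a short script like the sketch below. It assumes the standard sitemap protocol and the requests package; the sitemap URL is a placeholder, not something the Studio requires.

```python
# Minimal sketch: collect page URLs from a sitemap.xml (Option B).
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder URL

def fetch_sitemap_urls(sitemap_url: str) -> list[str]:
    response = requests.get(sitemap_url, timeout=30)
    response.raise_for_status()
    root = ET.fromstring(response.content)
    # Page URLs live in <loc> elements under the sitemap namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns) if loc.text]

if __name__ == "__main__":
    for url in fetch_sitemap_urls(SITEMAP_URL):
        print(url)
```

Paste the resulting list into column A of your sheet and you are ready for the next step.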
Step 2: Create the workflow
- Click New Workflow in the Studio
- Name it “Content Audit”
Step 3: Add the data source
Add Google Sheets Reader
Add a Google Sheets node to fetch your URL list.
Configure:
- Spreadsheet ID: Your Google Sheet ID
- Range: Sheet1!A:A
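For comparison, here is a minimal sketch of the same read done outside the Studio, assuming the gspread package and a Google service-account credentials file (both assumptions, not Studio requirements). The spreadsheet ID is a placeholder.

```python
# Sketch: read the URL list from column A of the sheet.
import gspread

SPREADSHEET_ID = "your-spreadsheet-id"  # placeholder

gc = gspread.service_account(filename="credentials.json")
worksheet = gc.open_by_key(SPREADSHEET_ID).worksheet("Sheet1")

# Range Sheet1!A:A corresponds to every non-empty value in column A.
urls = [u for u in worksheet.col_values(1) if u.strip()]
print(f"Loaded {len(urls)} URLs to audit")
```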
Step 4: Set up the loop
Add a Loop node
Add a Loop node to process each URL.
Configure:
- Items: {{GoogleSheets_0.data}}
- Max iterations: 100 (or your page count)
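Conceptually, the Loop node just walks the list with a cap. A tiny sketch, with a placeholder list standing in for the sheet data:

```python
# Sketch of the Loop node's behaviour: visit each URL, capped at Max iterations.
MAX_ITERATIONS = 100
urls = ["https://example.com/blog/post-1", "https://example.com/blog/post-2"]  # stand-in for sheet rows

for index, url in enumerate(urls[:MAX_ITERATIONS]):
    # Downstream nodes (Web Scraper, LLM) run once per iteration with this URL.
    print(f"Auditing {index + 1}: {url}")
```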
Step 5: Scrape each page
Add Web Scraper
Inside the loop, add a Web Scraper node.
Configure:
- URL: {{Loop_0.currentItem.url}}
- Content Type: Article
The Web Scraper extracts:
- Page title
- Meta description
- All headings (H1-H6)
- Main content text
- Word count
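For reference, a rough stand-in for that extraction, written with the requests and beautifulsoup4 packages rather than the Web Scraper node itself, could look like this sketch:

```python
# Sketch: extract title, meta description, headings, content, and word count.
import requests
from bs4 import BeautifulSoup

def scrape_page(url: str) -> dict:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    meta = soup.find("meta", attrs={"name": "description"})
    headings = [h.get_text(strip=True)
                for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]
    text = soup.get_text(separator=" ", strip=True)

    return {
        "url": url,
        "title": soup.title.get_text(strip=True) if soup.title else "",
        "meta_description": meta.get("content", "") if meta else "",
        "headings": headings,
        "content": text,
        "word_count": len(text.split()),
    }
```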
Step 6: Analyze with AI
Add LLM node
Add an LLM node for AI analysis.
Configure:
- Model: GPT or Claude
- Instructions:
Analyze this webpage for SEO and content quality.
URL: {{$url}}
Title: {{$title}}
Meta Description: {{$metaDescription}}
Word Count: {{$wordCount}}
Content: {{$content}}
Provide:
1. SEO Score (1-10)
2. Content Quality Score (1-10)
3. Key Issues (bullet points)
4. Recommendations (bullet points)
Format as JSON:
{
"url": "",
"seo_score": 0,
"content_score": 0,
"issues": [],
"recommendations": []
}
Step 7: Save results
Add Google Sheets Writer
Add a Google Sheets node to save results.
Configure:
- Spreadsheet ID: Your results spreadsheet
- Range: Results!A:E
- Operation: Append
- Values:
[[
"{{url}}",
"{{seo_score}}",
"{{content_score}}",
"{{issues}}",
"{{$recommendations}}"
]] Complete workflow
Your final workflow should look like this:
Google Sheets (URLs)
→ Loop
→ Web Scraper
→ LLM Analysis
→ Google Sheets (Results)
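If you want to prototype the same pipeline outside the Studio, a condensed Python sketch could look like the following. It reuses the urls list and the scrape_page helper sketched above (illustrations, not Studio APIs), assumes the openai and gspread packages, and treats the model name and spreadsheet ID as placeholders.

```python
# Condensed sketch of the whole audit loop: scrape, analyze, append.
import json
import time

import gspread
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
gc = gspread.service_account(filename="credentials.json")
results_ws = gc.open_by_key("results-spreadsheet-id").worksheet("Results")

PROMPT = """Analyze this webpage for SEO and content quality.
URL: {url}
Title: {title}
Meta Description: {meta_description}
Word Count: {word_count}
Content: {content}

Return JSON with keys: url, seo_score, content_score, issues, recommendations."""

for url in urls[:100]:  # mirrors the Loop node's max iterations
    page = scrape_page(url)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        response_format={"type": "json_object"},
        messages=[{"role": "user", "content": PROMPT.format(**page)}],
    )
    audit = json.loads(response.choices[0].message.content)
    results_ws.append_row([
        audit.get("url", url),
        audit.get("seo_score"),
        audit.get("content_score"),
        "; ".join(audit.get("issues", [])),
        "; ".join(audit.get("recommendations", [])),
    ])
    time.sleep(2)  # be polite to the target site and the APIs
```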
Sample output
| URL | SEO Score | Content Score | Issues | Recommendations |
|---|---|---|---|---|
| /blog/post-1 | 7 | 8 | Missing H2s, thin meta | Add subheadings, expand meta |
| /blog/post-2 | 9 | 6 | Content too short | Add 500+ words |
| /product/x | 5 | 7 | No schema, missing alt | Add product schema |
Customization options
Add more metrics
Extract additional data:
- H1 count: {{$h1Count}}
- Image count: {{$imageCount}}
- Internal links: {{$internalLinks}}
- External links: {{$externalLinks}}
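If you parse the HTML yourself (for example with BeautifulSoup, as in the Step 5 sketch), the extra counts are one small helper. The internal/external split below is a naive assumption based on the link's hostname.

```python
# Sketch of the additional metrics, assuming a BeautifulSoup-parsed page.
from urllib.parse import urlparse

def extra_metrics(soup, page_url: str) -> dict:
    site = urlparse(page_url).netloc
    hrefs = [a["href"] for a in soup.find_all("a", href=True)]
    internal = [h for h in hrefs if h.startswith("/") or site in h]
    return {
        "h1_count": len(soup.find_all("h1")),
        "image_count": len(soup.find_all("img")),
        "internal_links": len(internal),
        "external_links": len(hrefs) - len(internal),
    }
```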
Use competitor analysis
Add a Semrush node to compare with top-ranking pages:
graph LR
A[Web Scraper] --> C[LLM Compare]
B[Semrush Top URLs] --> C
C --> D[Results]
Check YourTextGuru scores
Add content optimization scoring:
Loop → Web Scraper → YourTextGuru Score → LLM → Sheets
Best practices
Rate limiting
When scraping many pages:
- Add delays between requests (2-3 seconds), as sketched below
- Respect robots.txt
- Process in batches of 50-100
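A minimal rate-limiting sketch covering all three points, reusing the scrape_page helper from Step 5 (the site URL is a placeholder):

```python
# Sketch: respect robots.txt, wait between requests, and work in batches.
import time
from urllib import robotparser

BATCH_SIZE = 50
DELAY_SECONDS = 2.5  # 2-3 seconds between requests

robots = robotparser.RobotFileParser()
robots.set_url("https://example.com/robots.txt")  # placeholder site
robots.read()

for start in range(0, len(urls), BATCH_SIZE):
    for url in urls[start:start + BATCH_SIZE]:
        if not robots.can_fetch("*", url):
            continue  # skip pages the site asks crawlers to avoid
        scrape_page(url)  # helper sketched in Step 5
        time.sleep(DELAY_SECONDS)
    # Pause (or schedule separate runs) between batches if needed.
```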
Error handling
Add a Conditional node to handle:
- 404 pages
- Timeout errors
- Empty content
If WebScraper_0.status != 200
→ Log error and continue
Else
→ Process normally
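Outside the Studio, the same guard is a small wrapper around the HTTP call. This sketch assumes the requests package and simply logs and skips problem pages:

```python
# Sketch of the conditional: log and skip 404s, timeouts, and empty pages.
import requests

def fetch_or_log(url: str) -> str | None:
    """Return the page HTML, or None when the page should be skipped."""
    try:
        response = requests.get(url, timeout=30)
    except requests.exceptions.Timeout:
        print(f"Timeout, skipping: {url}")
        return None
    if response.status_code != 200:   # e.g. 404 pages
        print(f"HTTP {response.status_code}, skipping: {url}")
        return None
    if not response.text.strip():     # empty content
        print(f"Empty page, skipping: {url}")
        return None
    return response.text
```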
Scheduling
Set up weekly audits:
- Go to Workflow Settings
- Enable Schedule
- Set to run weekly (e.g., Monday 6 AM)
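If you ever need to express that schedule outside the Studio, for example in a cron-based runner, Monday at 6 AM corresponds to the cron expression 0 6 * * 1.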
Results you can expect
| Metric | Manual | With Draft & Goal |
|---|---|---|
| Time per 100 pages | 8 hours | 15 minutes |
| Consistency | Variable | 100% |
| AI insights | None | Automatic |