What you’ll build

A workflow that automatically audits every page on your website, analyzing content quality and SEO elements and producing AI-powered recommendations.

Time to build: 15 minutes
Nodes used: Web Scraper, Loop, LLM, Google Sheets

The problem

Manual content audits are time-consuming:
  • Crawling hundreds or thousands of pages
  • Extracting titles, meta descriptions, and headings
  • Analyzing content quality
  • Compiling results into a spreadsheet
This workflow automates the entire process.

Workflow overview

The workflow chains five nodes: a Google Sheets reader supplies the URLs, a Loop walks through them one at a time, a Web Scraper extracts each page, an LLM scores it, and a second Google Sheets node stores the results.

Step-by-step guide

Step 1: Prepare your URL list

First, gather the URLs you want to audit:
  • Option A: Use a Google Sheet with your URLs
  • Option B: Scrape your sitemap (see the sketch below)
For this guide, we’ll use a Google Sheet with URLs in column A.
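If you go with Option B, a short script can pull the URL list straight from your XML sitemap. This is a minimal sketch in Python, assuming a standard sitemap.xml at the site root; the sitemap URL is a placeholder:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # replace with your sitemap

def fetch_sitemap_urls(sitemap_url: str) -> list[str]:
    """Download a standard XML sitemap and return its <loc> URLs."""
    resp = requests.get(sitemap_url, timeout=10)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    # Sitemap elements live in the sitemaps.org namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns)]

if __name__ == "__main__":
    for url in fetch_sitemap_urls(SITEMAP_URL):
        print(url)  # paste these into column A of your sheet
```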

Step 2: Create the workflow

  1. Click New Workflow in the Studio
  2. Name it “Content Audit”

Step 3: Add the data source

Add Google Sheets Reader

Add a Google Sheets node to fetch your URL list. Configure:
  • Spreadsheet ID: Your Google Sheet ID
  • Range: Sheet1!A:A
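
If you want to test the same read outside the Studio, Google's Python client can fetch the column directly. A sketch, assuming a service-account credentials file; the spreadsheet ID is a placeholder:

```python
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

SPREADSHEET_ID = "your-spreadsheet-id"  # placeholder

creds = Credentials.from_service_account_file(
    "service-account.json",  # placeholder credentials file
    scopes=["https://www.googleapis.com/auth/spreadsheets.readonly"],
)
service = build("sheets", "v4", credentials=creds)

# Read every non-empty cell in column A of Sheet1.
result = service.spreadsheets().values().get(
    spreadsheetId=SPREADSHEET_ID, range="Sheet1!A:A"
).execute()
urls = [row[0] for row in result.get("values", []) if row]
```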

Step 4: Set up the loop

Add a Loop node

Add a Loop node to process each URL. Configure:
  • Items: {{GoogleSheets_0.data}}
  • Max iterations: 100 (or your page count)

Step 5: Scrape each page

Add Web Scraper

Inside the loop, add a Web Scraper node. Configure:
  • URL: {{Loop_0.currentItem.url}}
  • Content Type: Article
The Web Scraper extracts:
  • Page title
  • Meta description
  • All headings (H1-H6)
  • Main content text
  • Word count
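
Outside the Studio, the same fields can be pulled with requests and BeautifulSoup. A minimal sketch; the selectors and field names are my assumptions, not the Web Scraper node's actual internals:

```python
import requests
from bs4 import BeautifulSoup

def scrape_page(url: str) -> dict:
    """Fetch a page and extract the same fields the Web Scraper node returns."""
    resp = requests.get(url, timeout=15)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")

    meta = soup.find("meta", attrs={"name": "description"})
    headings = [h.get_text(strip=True)
                for level in range(1, 7)
                for h in soup.find_all(f"h{level}")]
    text = soup.get_text(" ", strip=True)

    return {
        "url": url,
        "title": soup.title.get_text(strip=True) if soup.title else "",
        "meta_description": meta["content"] if meta and meta.has_attr("content") else "",
        "headings": headings,
        "content": text,
        "word_count": len(text.split()),
    }
```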

Step 6: Analyze with AI

Add LLM node

Add an LLM node for AI analysis. Configure:
  • Model: GPT or Claude
  • Instructions:
Analyze this webpage for SEO and content quality.

URL: {{$url}}
Title: {{$title}}
Meta Description: {{$metaDescription}}
Word Count: {{$wordCount}}
Content: {{$content}}

Provide:
1. SEO Score (1-10)
2. Content Quality Score (1-10)
3. Key Issues (bullet points)
4. Recommendations (bullet points)

Format as JSON:
{
  "url": "",
  "seo_score": 0,
  "content_score": 0,
  "issues": [],
  "recommendations": []
}
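
Equivalently, in code: the sketch below sends the same prompt through the OpenAI Python SDK and parses the JSON reply. The model name is just an example, and the template keys match the scrape_page sketch above rather than the Studio's {{$...}} variables:

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = """Analyze this webpage for SEO and content quality.

URL: {url}
Title: {title}
Meta Description: {meta_description}
Word Count: {word_count}
Content: {content}

Return JSON with keys: url, seo_score, content_score, issues, recommendations."""

def analyze_page(page: dict) -> dict:
    """Score one scraped page and return the parsed JSON analysis."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # example model; use whatever your account offers
        messages=[{"role": "user", "content": PROMPT.format(**page)}],
        response_format={"type": "json_object"},  # ask for strict JSON back
    )
    return json.loads(resp.choices[0].message.content)
```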

Step 7: Save results

Add Google Sheets Writer

Add a Google Sheets node to save results. Configure:
  • Spreadsheet ID: Your results spreadsheet
  • Range: Results!A:E
  • Operation: Append
  • Values:
[[
  "{{url}}",
  "{{seo_score}}",
  "{{content_score}}",
  "{{issues}}",
  "{{recommendations}}"
]]
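
The code equivalent appends one row per page, reusing the Sheets service client from the Step 3 sketch. Joining the issues and recommendations arrays into single cells is my choice, not something the node mandates:

```python
def append_result(service, spreadsheet_id: str, analysis: dict) -> None:
    """Append one audit row to the Results sheet."""
    row = [
        analysis["url"],
        analysis["seo_score"],
        analysis["content_score"],
        "; ".join(analysis["issues"]),
        "; ".join(analysis["recommendations"]),
    ]
    service.spreadsheets().values().append(
        spreadsheetId=spreadsheet_id,
        range="Results!A:E",
        valueInputOption="RAW",  # write values as-is, no formula parsing
        body={"values": [row]},
    ).execute()
```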

Complete workflow

Your final workflow should look like this:
Google Sheets (URLs) 
    → Loop 
        → Web Scraper 
        → LLM Analysis 
        → Google Sheets (Results)
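
For reference, here's how the sketches above chain together, mirroring the Loop node's max-iterations cap from Step 4. It reuses fetch_sitemap_urls, scrape_page, analyze_page, append_result, and the service client defined earlier; RESULTS_SPREADSHEET_ID is a placeholder:

```python
MAX_ITERATIONS = 100  # mirrors the Loop node's "Max iterations" setting

urls = fetch_sitemap_urls(SITEMAP_URL)  # or read them from the sheet as in Step 3
for url in urls[:MAX_ITERATIONS]:       # the Loop node caps iterations the same way
    page = scrape_page(url)
    analysis = analyze_page(page)
    append_result(service, RESULTS_SPREADSHEET_ID, analysis)
```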

Sample output

| URL | SEO Score | Content Score | Issues | Recommendations |
|---|---|---|---|---|
| /blog/post-1 | 7 | 8 | Missing H2s, thin meta | Add subheadings, expand meta |
| /blog/post-2 | 9 | 6 | Content too short | Add 500+ words |
| /product/x | 5 | 7 | No schema, missing alt | Add product schema |

Customization options

Add more metrics

Extract additional data:
H1 count: {{$h1Count}}
Image count: {{$imageCount}}
Internal links: {{$internalLinks}}
External links: {{$externalLinks}}
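
If you're reproducing this in code, the same counts fall out of the parsed HTML. A sketch extending the scrape_page soup; the internal/external split assumes a link is internal when its host matches the page's own domain:

```python
from urllib.parse import urlparse

def link_metrics(soup, page_url: str) -> dict:
    """Count headings, images, and internal vs. external links on a parsed page."""
    site = urlparse(page_url).netloc
    internal = external = 0
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host in ("", site):  # relative links count as internal
            internal += 1
        else:
            external += 1
    return {
        "h1_count": len(soup.find_all("h1")),
        "image_count": len(soup.find_all("img")),
        "internal_links": internal,
        "external_links": external,
    }
```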

Use competitor analysis

Add a Semrush node to compare your pages with top-ranking competitors.

Check YourTextGuru scores

Add content optimization scoring:
Loop → Web Scraper → YourTextGuru Score → LLM → Sheets

Best practices

Rate limiting

When scraping many pages:
  1. Add delays between requests (2-3 seconds)
  2. Respect robots.txt
  3. Process in batches of 50-100
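
A simple way to get both the delay and the batching if you're running the audit yourself in code; the delay and batch size just follow the numbers above:

```python
import time

DELAY_SECONDS = 2.5  # 2-3 seconds between requests
BATCH_SIZE = 50      # process pages in batches of 50-100

def audit_in_batches(urls: list[str], audit_fn) -> None:
    """Run audit_fn over urls batch by batch, pausing between requests."""
    for start in range(0, len(urls), BATCH_SIZE):
        for url in urls[start:start + BATCH_SIZE]:
            audit_fn(url)
            time.sleep(DELAY_SECONDS)  # throttle to stay polite
```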

Error handling

Add a Conditional node to handle:
  • 404 pages
  • Timeout errors
  • Empty content
If WebScraper_0.status != 200
  → Log error and continue
Else
  → Process normally
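
In code, the same guard wraps the scrape in a try/except and skips the page instead of aborting the run. A sketch reusing scrape_page from Step 5:

```python
import logging
import requests

def safe_scrape(url: str) -> dict | None:
    """Scrape one page; log and skip failures instead of stopping the audit."""
    try:
        page = scrape_page(url)  # raises on non-200 via raise_for_status()
    except requests.exceptions.RequestException as exc:  # 404s, timeouts, etc.
        logging.warning("Skipping %s: %s", url, exc)
        return None
    if not page["content"].strip():
        logging.warning("Skipping %s: empty content", url)
        return None
    return page
```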

Scheduling

Set up weekly audits:
  1. Go to Workflow Settings
  2. Enable Schedule
  3. Set to run weekly (e.g., Monday 6 AM)

Results you can expect

| Metric | Manual | With Draft & Goal |
|---|---|---|
| Time per 100 pages | 8 hours | 15 minutes |
| Consistency | Variable | 100% |
| AI insights | None | Automatic |

Next steps