What you’ll build
A workflow that automatically audits every page on your website, analyzing content quality and SEO elements and providing AI-powered recommendations.

Time to build: 15 minutes
Nodes used: Web Scraper, Loop, LLM, Google Sheets

The problem
Manual content audits are time-consuming:
- Crawling hundreds or thousands of pages
- Extracting titles, metas, headings
- Analyzing content quality
- Compiling results into a spreadsheet
Workflow overview
Step-by-step guide
Step 1: Prepare your URL list
First, gather the URLs you want to audit.

Option A: Use a Google Sheet with your URLs
Option B: Scrape your sitemap

For this guide, we’ll use a Google Sheet with URLs in column A.

Step 2: Create the workflow
- Click New Workflow in the Studio
- Name it “Content Audit”
Step 3: Add the data source
Add Google Sheets Reader

Add a Google Sheets node to fetch your URL list.

Configure:
- Spreadsheet ID: Your Google Sheet ID
- Range: Sheet1!A:A
Step 4: Set up the loop
Add a Loop node

Add a Loop node to process each URL.

Configure:
- Items: {{GoogleSheets_0.data}}
- Max iterations: 100 (or your page count)
Step 5: Scrape each page
Add Web Scraper

Inside the loop, add a Web Scraper node.

Configure:
- URL: {{Loop_0.currentItem.url}}
- Content Type: Article

The scraper extracts:
- Page title
- Meta description
- All headings (H1-H6)
- Main content text
- Word count
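If you want to sanity-check the same extraction outside the Studio, a small standalone Python sketch using only the standard library can pull out the title, meta description, headings, and word count. This mimics what the Web Scraper node returns; it is an illustration, not the node's actual implementation:

```python
from html.parser import HTMLParser

# Void elements never get a closing tag, so they must not go on the stack.
VOID = {"meta", "link", "img", "br", "hr", "input"}

class PageAudit(HTMLParser):
    """Collects roughly what the Web Scraper node extracts from a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.headings = []   # (tag, text) pairs for h1-h6
        self.words = 0       # body word count, excluding title/script/style
        self._stack = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        if tag not in VOID:
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        current = self._stack[-1] if self._stack else ""
        if current == "title":
            self.title += text
        elif current in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self.headings.append((current, text))
        if current not in {"title", "script", "style"}:
            self.words += len(text.split())

audit = PageAudit()
audit.feed("""<html><head><title>My Post</title>
<meta name="description" content="A short meta."></head>
<body><h1>Hello World</h1><p>Some body text here.</p></body></html>""")
print(audit.title, audit.meta_description, audit.headings, audit.words)
```

Heading text is counted toward the word total, while `<title>`, `<script>`, and `<style>` content is excluded, which roughly matches what "main content text" means for an audit.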
Step 6: Analyze with AI
Add LLM node

Add an LLM node for AI analysis.

Configure:
- Model: GPT or Claude
- Instructions:
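The instruction prompt is left for you to write. As a starting point, here is one example you could paste into the field and adapt; the field names are suggestions, not a fixed schema, and should match the columns of your results sheet:

```text
You are an SEO content auditor. Given a page's title, meta description,
headings, and body text, return:
- seo_score (1-10): title/meta quality, heading structure, keyword use
- content_score (1-10): depth, clarity, usefulness
- issues: a short comma-separated list of problems found
- recommendations: one concrete fix per issue
Respond as a single row of values in that order.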
Step 7: Save results
Add Google Sheets Writer

Add a Google Sheets node to save results.

Configure:
- Spreadsheet ID: Your results spreadsheet
- Range: Results!A:F
- Operation: Append
- Values:
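The Values mapping is left for you to fill in. Following the `{{Node_0.field}}` convention used elsewhere in this guide, a plausible mapping might look like the fragment below; note that the `LLM_0.*` field names are placeholders and depend entirely on how your LLM node structures its output:

```text
{{Loop_0.currentItem.url}}, {{LLM_0.seo_score}}, {{LLM_0.content_score}},
{{LLM_0.issues}}, {{LLM_0.recommendations}}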
Complete workflow
Your final workflow should look like this:

Sample output
| URL | SEO Score | Content Score | Issues | Recommendations |
|---|---|---|---|---|
| /blog/post-1 | 7 | 8 | Missing H2s, thin meta | Add subheadings, expand meta |
| /blog/post-2 | 9 | 6 | Content too short | Add 500+ words |
| /product/x | 5 | 7 | No schema, missing alt | Add product schema |
Customization options
Add more metrics
Extract additional data:

Use competitor analysis

Add a Semrush node to compare with top-ranking pages:

Check YourTextGuru scores

Add content optimization scoring:

Best practices
Rate limiting
When scraping many pages:
- Add delays between requests (2-3 seconds)
- Respect robots.txt
- Process in batches of 50-100
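If you ever run the crawl yourself rather than through the Studio, the pacing rules above are easy to sketch in Python. This is an illustration under stated assumptions: `fetch` stands in for whatever request function you use (it is injected so the pacing logic is testable without a network), and a robots.txt check via the stdlib `urllib.robotparser` would sit in front of `fetch` in a real crawler:

```python
import time
from itertools import islice

def batches(urls, size=50):
    """Yield the URL list in fixed-size chunks (50-100 keeps runs manageable)."""
    it = iter(urls)
    while chunk := list(islice(it, size)):
        yield chunk

def crawl(urls, fetch, delay=2.5, batch_size=50):
    """Fetch each URL with a 2-3 second pause between requests.

    `fetch` is any callable taking a URL; injecting it keeps the
    pacing and batching logic independent of the HTTP library.
    """
    results = []
    for chunk in batches(urls, batch_size):
        for url in chunk:
            results.append(fetch(url))
            time.sleep(delay)
        # Natural point to checkpoint progress between batches.
    return results
```

Batching also gives you a clean place to persist partial results, so a failure on page 900 does not cost you the first 899.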
Error handling
Add a Conditional node to handle:
- 404 pages
- Timeout errors
- Empty content
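The routing the Conditional node performs amounts to a small classification function. A minimal sketch, assuming a timeout surfaces as a missing status code (how your scraper reports timeouts may differ):

```python
def classify(status_code, content):
    """Route a scrape result the way the Conditional node would."""
    if status_code == 404:
        return "not_found"   # page is gone: drop it from the audit sheet
    if status_code is None:
        return "timeout"     # request never completed: retry or log it
    if not content or not content.strip():
        return "empty"       # nothing scraped: skip the LLM analysis step
    return "ok"
```

Routing failures into a separate sheet tab, instead of silently skipping them, makes the audit itself auditable.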
Scheduling
Set up weekly audits:
- Go to Workflow Settings
- Enable Schedule
- Set to run weekly (e.g., Monday 6 AM)
Results you can expect
| Metric | Manual | With Draft & Goal |
|---|---|---|
| Time per 100 pages | 8 hours | 15 minutes |
| Consistency | Variable | 100% |
| AI insights | None | Automatic |

