What does this node do?

The Loop node processes arrays by repeating a sequence of nodes for each item. It’s essential for batch operations like scraping multiple URLs, processing spreadsheet rows, or handling API response lists. Common uses:
  • Process a list of URLs
  • Iterate through spreadsheet rows
  • Handle API response arrays
  • Batch process data items
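
For example, given a three-item array, the nodes inside the loop run three times, and each pass sees one item as the current item (the variables are documented under Loop variables below):

// Items
["https://a.com", "https://b.com", "https://c.com"]

// Pass 1: {{Loop_0.currentItem}} = "https://a.com", {{Loop_0.index}} = 0
// Pass 2: {{Loop_0.currentItem}} = "https://b.com", {{Loop_0.index}} = 1
// Pass 3: {{Loop_0.currentItem}} = "https://c.com", {{Loop_0.index}} = 2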

Quick setup

1. Add the Loop node: find it under Tools → Loop.
2. Connect your data source: connect a node that outputs an array.
3. Add nodes inside the loop: these will run once for each item.
4. Set loop limits: configure max iterations and delays.

Configuration

Required fields

items (array, required)
The array to iterate over. Examples:
  • From Sheets: {{GoogleSheets_0.data}}
  • From API: {{HTTP_0.body.results}}
  • Manual: ["url1", "url2", "url3"]

Optional fields

max_iterations (number, default: 100)
Maximum number of iterations, to prevent runaway loops. Always set this as a safety measure.

delay_ms (number, default: 0)
Delay in milliseconds between iterations. Recommended:
  • API calls: 1000-2000 ms
  • Web scraping: 2000-3000 ms
  • No external calls: 0 ms
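
Putting the fields together, a typical configuration might look like the sketch below. The JSON layout is just a compact way to show the values side by side; the values themselves are illustrative.

// Example Loop configuration (illustrative values)
{
  "items": "{{HTTP_0.body.results}}",  // required: the array to iterate over
  "max_iterations": 100,               // safety limit against runaway loops
  "delay_ms": 1000                     // 1-second pause between iterations
}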

Loop variables

Inside a loop, access these special variables:
Variable | Description | Example
{{Loop_0.currentItem}} | Current item | "https://example.com"
{{Loop_0.index}} | Current index (0-based) | 0, 1, 2
{{Loop_0.totalItems}} | Total item count | 10
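
Any text field of a node inside the loop can reference these variables. For example, a log or message field might combine all three (the wording is illustrative):

Processed {{Loop_0.index + 1}} of {{Loop_0.totalItems}}: {{Loop_0.currentItem}}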

Accessing item properties

If items are objects:
// Items array
[
  {"url": "https://a.com", "name": "Site A"},
  {"url": "https://b.com", "name": "Site B"}
]
Access properties:
{{Loop_0.currentItem.url}}   → "https://a.com"
{{Loop_0.currentItem.name}}  → "Site A"

Output

After completion, the loop provides:
{
  "results": [
    // Array of results from each iteration
  ],
  "totalItems": 10,
  "processedItems": 10,
  "errors": []
}
Access results: {{Loop_0.results}}

Examples

Scrape multiple URLs

Input: List of URLs from Google Sheets
Configuration:
  • Items: {{GoogleSheets_0.data}}
  • Max iterations: 50
  • Delay: 2000ms
Inside loop:
  • Web Scraper URL: {{Loop_0.currentItem.url}}
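
This assumes each row coming from Google Sheets is an object with a url property, roughly:

// Assumed shape of {{GoogleSheets_0.data}}
[
  {"url": "https://example.com/page1"},
  {"url": "https://example.com/page2"}
]

If the sheet instead produces a flat array of URL strings, reference the item directly with {{Loop_0.currentItem}}.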

Process API results

Input: API response with array of records
Configuration:
  • Items: {{HTTP_0.body.results}}
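
The items expression points at the array inside the response body. For instance, if the API returned a body like the sketch below (field names are illustrative), {{HTTP_0.body.results}} selects the records to iterate over:

// Example {{HTTP_0.body}}
{
  "results": [
    {"id": 1, "status": "active"},
    {"id": 2, "status": "pending"}
  ]
}

// Inside the loop:
{{Loop_0.currentItem.id}}  → 1, then 2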

Email multiple recipients

Input: Contact list
Inside loop:
  • Email to: {{Loop_0.currentItem.email}}
  • Name variable: {{Loop_0.currentItem.name}}
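
This assumes the contact list is an array of objects with email and name properties, for example:

// Example contact list
[
  {"email": "ana@example.com", "name": "Ana"},
  {"email": "ben@example.com", "name": "Ben"}
]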

Nested loops

Process multi-dimensional data.
Configuration:
  • Outer loop: {{categories}}
  • Inner loop: {{OuterLoop_0.currentItem.products}}
Nested loops multiply iterations. 10 categories × 50 products = 500 iterations. Set reasonable limits.
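
A data shape that fits this setup might look like the sketch below (property names are illustrative). The outer loop iterates over the categories, and the inner loop's items field points at the current category's products array:

// Example {{categories}}
[
  {"name": "Electronics", "products": [{"sku": "E-1"}, {"sku": "E-2"}]},
  {"name": "Books", "products": [{"sku": "B-1"}]}
]

// Inner loop items: {{OuterLoop_0.currentItem.products}}
// Inside the inner loop (here named InnerLoop_0): {{InnerLoop_0.currentItem.sku}}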

Best practices

Always set max iterations

Prevent infinite or runaway loops:
Max iterations: 100
Even if you expect fewer items, set a limit.

Add delays for external calls

When calling APIs or scraping:
Delay: 2000 ms (2 seconds)
This prevents rate limiting and server overload.

Handle errors in the loop

Add error handling for each iteration: use Conditional nodes to check for failures.
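
A sketch of the pattern, in the same flow notation as Common patterns below (the exact success check depends on what the node inside your loop reports):

Loop through items
  → Scrape or call API
  → Conditional: did this iteration succeed?
      → Yes: continue processing
      → No: log the failed item and move on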

Batch large datasets

For thousands of items:
  1. Split into batches of 50-100
  2. Process each batch in a loop
  3. Add delays between batches
  4. Track progress in a log
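
One way to picture the batching: the items array becomes an array of batches, an outer loop iterates over the batches, and an inner loop processes the items in the current batch (a sketch; how you split the data depends on your workflow):

// Items split into batches of 2 (illustrative)
[
  ["item1", "item2"],
  ["item3", "item4"],
  ["item5"]
]

// Outer loop items: the batched array above
// Inner loop items: {{OuterLoop_0.currentItem}}  (the current batch)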

Use meaningful node names

Inside loops, rename nodes clearly:
❌ WebScraper_0
✅ ScrapeCurrentURL

❌ LLM_0
✅ AnalyzeContent

Common patterns

Collect and aggregate

Loop through items
  → Process each
  → Add to results array

After loop:
  → Aggregate results
  → Generate summary
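
For the aggregation step, the collected output described under Output above can be passed straight into a downstream node, for example as part of an LLM prompt (the wording is illustrative):

Summarize the following results: {{Loop_0.results}}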

Filter while processing

Loop through items
  → Check condition
  → If matches, process
  → If not, skip

Progress tracking

Loop through items
  → Process
  → Log progress: "Processed {{Loop_0.index + 1}} of {{Loop_0.totalItems}}"

Common issues

Loop takes too long or never finishes
  • Reduce max iterations
  • Add shorter delays
  • Process in smaller batches
  • Check for infinite loop conditions

Hitting rate limits
  • Increase delay between iterations
  • Reduce batch size
  • Implement exponential backoff
  • Check API rate limits

Some iterations fail
  • Add Conditional node to handle errors
  • Log failed items for review
  • Use retry logic for transient failures

Results are empty
  • Ensure nodes inside loop produce output
  • Check that results are being collected
  • Verify input array has items