
Split List

Split a list into smaller batches of a given size

What does this node do?

The Split List node splits a list into smaller batches of a given size. It returns an array of arrays, where each inner array contains up to batch_size items.
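The batching behavior can be sketched in a few lines of Python. This is an illustration of the logic, not the node's actual implementation:

```python
def split_list(items, batch_size=2):
    """Split a list into consecutive batches of up to batch_size items."""
    if batch_size < 1:
        raise ValueError("batch_size must be at least 1")
    # Slice the list in steps of batch_size; the final slice may be shorter.
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

batches = split_list(["apple", "banana", "cherry", "date", "elderberry"], 2)
# batches == [["apple", "banana"], ["cherry", "date"], ["elderberry"]]
```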

Common uses:

  • Batch API calls to respect rate limits
  • Chunk data for parallel processing
  • Paginate results for display or downstream nodes

Quick setup

Add the Split List node

Drag it onto the canvas from the Tools panel.

Connect a list input

Connect the output of any node that produces a list (e.g., Create List, Google Sheets) to the list input.

Set the batch size

Configure batch_size to control how many items each batch should contain.

Use the output

Connect the batches output to the next node in your workflow (typically a Loop node).

Configuration

Input

list json required

The list to split into batches. Accepts any valid JSON array.

Parameters

batch_size number default: 2

The number of items per batch. Minimum: 1. Maximum: 100.

The last batch may contain fewer items if the list size is not evenly divisible by the batch size.
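In other words, the number of batches is the ceiling of the list length divided by the batch size. A quick worked illustration (not part of the node's configuration):

```python
import math

def batch_count(list_length, batch_size):
    # Ceiling division: a partial final batch still counts as one batch.
    return math.ceil(list_length / batch_size)

print(batch_count(5, 2))    # 3 batches: two full batches, one with a single item
print(batch_count(100, 10)) # 10 full batches
```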

Output

The node produces a single batches value: an array of arrays, in the original item order.

{
  "batches": [
    ["apple", "banana"],
    ["cherry", "date"],
    ["elderberry"]
  ]
}

Examples

Batch API calls to respect rate limits

graph LR
    A[Create List] --> B[Split List]
    B --> C[Loop]
    C --> D[API Connector]

Configuration:

  • batch_size: 10

You have a list of 100 URLs to process. Split List divides them into batches of 10, and the Loop node sends each batch to the API one at a time, staying within rate limits.

Chunk data for parallel processing

graph LR
    A[Google Sheets] --> B[Split List]
    B --> C[Loop]
    C --> D[LLM: Analyze batch]
    D --> E[Merge Lists]

Configuration:

  • batch_size: 5

A spreadsheet contains 50 rows. Split List creates 10 batches of 5 rows each. Each batch is sent to an LLM for analysis, and the results are merged back together.

Best practices

  • Choose batch size based on downstream limits. If your API allows 10 items per request, set batch_size to 10. If your LLM has a token limit, adjust the batch size to keep each batch within that limit.
  • Combine with Loop and Merge Lists. The typical pattern is: Split List to create batches, Loop to iterate over them, then Merge Lists to reassemble the results.

Common issues

Output is a single batch containing all items

Check that batch_size is smaller than the total number of items in the list. If batch_size is greater than or equal to the list length, the output will be a single batch containing the entire list.
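This mirrors ordinary list-slicing behavior. A minimal Python illustration of the edge case, using the same batching sketch as above (not the node's actual implementation):

```python
def split_list(items, batch_size):
    # Slicing past the end of a list simply stops at the last item.
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# batch_size (10) >= list length (3), so everything lands in one batch.
print(split_list(["a", "b", "c"], 10))  # [['a', 'b', 'c']]
```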

Input is not recognized as a list

The list input must be a valid JSON array (e.g., ["a", "b", "c"]). If you pass a plain string or an object, the node will not work correctly. Make sure the upstream node outputs a proper list format.