When You Outgrow Simple Workflows
Most n8n tutorials show workflows processing one item at a time. But real production scenarios involve thousands or millions of records. Processing a CSV with 50,000 rows, syncing an entire database, or migrating data between platforms requires different techniques.
The Split In Batches Node
The n8n Split In Batches node is your primary tool for working through large datasets. It processes items in configurable batch sizes, preventing memory exhaustion and API rate-limit violations. A batch size of 50-100 suits most API operations; 500-1000 works well for database operations.
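Under the hood, the idea is simple: slice the incoming item array into fixed-size chunks and hand one chunk downstream per loop iteration. A minimal sketch of that slicing logic (function and variable names here are illustrative, not n8n internals):

```javascript
// Split an array of items into fixed-size batches, the core
// operation the Split In Batches node performs on each loop pass.
function toBatches(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}
```

With `batchSize` set to 2, five items become three batches: two full batches of two, plus a final partial batch.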
Memory Management
Large datasets can exhaust n8n's memory. Three strategies help: process data in chunks rather than loading everything at once, remove unnecessary fields early in the workflow with the Set node, and handle heavier transformations in the Code node.
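Dropping bulky fields early means every downstream node carries less data per item. A sketch of that trimming step, as you might write it in a Code node (the field names are hypothetical):

```javascript
// Keep only the fields downstream nodes actually need; large unused
// payloads are discarded before they travel through the workflow.
function trimFields(records, keep) {
  return records.map(record => {
    const slim = {};
    for (const key of keep) {
      if (key in record) slim[key] = record[key];
    }
    return slim;
  });
}
```

For example, keeping only `id` and `email` from a record that also carries a large `rawResponse` blob shrinks each item to the two fields the rest of the workflow uses.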
Rate Limiting
Most APIs enforce rate limits. Add Wait nodes between batches, or use the HTTP Request node's built-in batching options (items per batch and batch interval). A 1-2 second delay between batches prevents 429 errors without significantly slowing the workflow.
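The throttling pattern itself is just "process a batch, pause, repeat". A hedged sketch of that loop in plain JavaScript, assuming a `handler` function you supply and an illustrative 1-second default delay:

```javascript
// Promise-based pause, equivalent in spirit to an n8n Wait node.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

// Run handler on each batch with a delay in between to stay
// under the API's rate limit and avoid HTTP 429 responses.
async function processWithDelay(batches, handler, delayMs = 1000) {
  const results = [];
  for (const batch of batches) {
    results.push(await handler(batch));
    await sleep(delayMs); // throttle between batches
  }
  return results;
}
```

Tune `delayMs` to the API's documented limit; many APIs publish a requests-per-second ceiling you can back-calculate from.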
Error Recovery
When you process 10,000 items, some will fail. Use the error output on your processing nodes (or enable their continue-on-fail setting) to catch failures, log them to a separate sheet or database, and keep processing the remaining items. Once the main run completes, retry the failed items.
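The catch-log-continue pattern can be sketched as a loop that never aborts on a single bad item, collecting failures for a later retry pass (the `handler` and item shapes are illustrative):

```javascript
// Process every item; a failure is recorded, not fatal.
// The failed list feeds a retry run after the main pass completes.
async function processWithRecovery(items, handler) {
  const succeeded = [];
  const failed = [];
  for (const item of items) {
    try {
      succeeded.push(await handler(item));
    } catch (err) {
      failed.push({ item, error: err.message }); // log for the retry pass
    }
  }
  return { succeeded, failed };
}
```

Writing `failed` to a sheet or table gives you an audit trail and a ready-made input for the retry workflow.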
Database Operations
For large database migrations, use n8n's database nodes with bulk operations: insert 500 rows at a time rather than one by one, and wrap batches in transactions where possible to keep data consistent.
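A bulk insert sends one statement with many value tuples instead of one statement per row. A sketch of building such a parameterized multi-row INSERT for one batch (the table and column names are hypothetical; pair this with batches of around 500 rows):

```javascript
// Build a single parameterized INSERT covering a whole batch of rows,
// using $1, $2, ... placeholders so values stay out of the SQL string.
function buildBulkInsert(table, columns, rows) {
  const placeholders = rows
    .map((_, r) =>
      `(${columns.map((_, c) => `$${r * columns.length + c + 1}`).join(', ')})`)
    .join(', ');
  const values = rows.flatMap(row => columns.map(col => row[col]));
  return {
    sql: `INSERT INTO ${table} (${columns.join(', ')}) VALUES ${placeholders}`,
    values,
  };
}
```

Two rows with two columns produce one statement with four placeholders, so the database does one round trip per batch instead of one per row.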
For more advanced techniques, see my n8n review or start with the beginner guide.