Loop through each item in a large dataset
Comments
Hi,
Thanks for reaching out.
To efficiently process 1 million records in Decisions without overloading the system:
- Use the Run Flow For List [Batch Processing] step to process records in batches rather than all at once. This enables parallel processing and keeps each run within the recursion limit.
- Avoid a standard ForEach loop for very large lists: Flows have a default recursion limit of 20,000 iterations, so looping over 1 million records one at a time will exceed it and can cause performance issues.
- Test with both a typical and a large dataset to evaluate performance, and adjust the batch size as needed.
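Decisions Flows are configured visually rather than in code, but the batching idea the steps above describe can be sketched in Python. The batch size, record count, and `process_record` helper below are illustrative, not part of the Decisions API:

```python
from typing import Iterator, List


def batches(items: List[int], size: int) -> Iterator[List[int]]:
    """Yield successive fixed-size batches from a list."""
    for start in range(0, len(items), size):
        yield items[start:start + size]


def process_record(record: int) -> int:
    # Hypothetical stand-in for the work a sub-flow would do per record.
    return record * 2


records = list(range(1_000_000))
processed = 0
# 100 batch invocations instead of 1,000,000 individual loop iterations,
# so no single run approaches the 20,000-iteration recursion limit.
for batch in batches(records, 10_000):
    for record in batch:
        process_record(record)
    processed += len(batch)

print(processed)  # 1000000
```

In Decisions itself, each batch would be handed to a sub-flow by the Run Flow For List [Batch Processing] step rather than an inner `for` loop.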
Best practices and details:
- Batch processing: https://documentation.decisions.com/docs/best-practices
- Recursion and looping: https://documentation.decisions.com/docs/designing-flows and https://documentation.decisions.com/docs/best-practices#loopinglists:~:text=in%20the%20Flow.-,Looping/Lists,-Practice
Best Regards,
Manisha