Limit concurrent processes based on Onify Node memory limitations
I use a wrapper to process items (often customers, but sometimes something else):
const result = await processItems({
  items: myItems,
  callback: functionToGetAndProcessItemsInDifferentWays,
  itemIdKeyName: "id",
  concurrentProcesses: 0
})
For smaller datasets, concurrentProcesses can be set to 0, which processes all customers in parallel at blazing speed without problems. Other times I limit it by setting it to a static value based on previous runs and results.
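As a side note, one way to ground that static value is to record peak memory during a run. Here is a minimal sketch using only Node's standard process.memoryUsage(); nothing in it is Onify-specific:

// Minimal sketch: track peak heap usage while a batch runs,
// so the next run's static concurrency limit can be based on real numbers
function startMemorySampler(intervalMs = 1000) {
  let peakHeapMB = 0
  const timer = setInterval(() => {
    const heapMB = process.memoryUsage().heapUsed / 1024 / 1024
    if (heapMB > peakHeapMB) peakHeapMB = heapMB
  }, intervalMs)
  return {
    stop() {
      clearInterval(timer)
      return Math.round(peakHeapMB)
    }
  }
}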
But since neither the number of customers nor the size of their data is static, and both keep increasing, I figure it would be better to set the concurrency dynamically. For example, by sending in the parameter
expectedAverageMBPerItem: 15,
with different values per callback.
Then calculate a reasonable number of parallel processes with something like
concurrentProcesses = Math.floor((ONIFY_NODE_MAX_MEMORY_USE_MB - ONIFY_NODE_MEMORY_SAFETY_MARGIN_MB) / expectedAverageMBPerItem)
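A minimal sketch of how the wrapper could do this internally, assuming the two memory values are exposed as environment variables with the names from my formula (those names are placeholders of mine, not confirmed Onify settings):

// Sketch only: ONIFY_NODE_MAX_MEMORY_USE_MB and ONIFY_NODE_MEMORY_SAFETY_MARGIN_MB
// are placeholder names from the formula above, not confirmed Onify configuration keys
function calcConcurrentProcesses(expectedAverageMBPerItem) {
  const maxMB = Number(process.env.ONIFY_NODE_MAX_MEMORY_USE_MB ?? 1024)
  const marginMB = Number(process.env.ONIFY_NODE_MEMORY_SAFETY_MARGIN_MB ?? 256)
  // Never go below 1 concurrent process, so the batch still makes progress
  return Math.max(1, Math.floor((maxMB - marginMB) / expectedAverageMBPerItem))
}

const result = await processItems({
  items: myItems,
  callback: functionToGetAndProcessItemsInDifferentWays,
  itemIdKeyName: "id",
  concurrentProcesses: calcConcurrentProcesses(15)
})

That way the only per-callback tuning knob would be expectedAverageMBPerItem.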
What Node limitations within Onify do I need to take into consideration in such a calculation?
And do they apply per task, per flow, per Onify instance, etc.?