In Nected, workflows are flexible and support multiple patterns for executing REST APIs concurrently. Whether you’re making API calls to fetch data from external sources, trigger third-party services, or synchronize platforms, executing these calls in parallel significantly improves performance and reduces latency.
There are three supported approaches to achieve parallel API execution within a workflow:
1. Using a Single Rule (Recommended for API-only Flows)
This method is ideal when you need to call multiple APIs together and process their responses jointly.
Steps to Execute APIs in Parallel Using a Single Rule:
Navigate to Rule Editor:
Open an existing rule or create a new Simple Rule.
Configure Input Attributes:
Add each REST API you want to call as an input attribute. These will act as parallel API sources.
For each API, configure the input as an API fetcher that calls the respective endpoint.
Define Output (Result):
In the 'Result' section, reference all the API response values you configured in input attributes.
You can combine them into one object or return them individually.
Use the Rule in Workflow:
Go to your workflow.
Add a Rule Node and select the configured rule.
All APIs configured in the input attributes will now be triggered in parallel when the workflow reaches this node.
Input attributes are evaluated in parallel by default in Nected, so this setup ensures that all APIs run concurrently.
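To make the behavior concrete, here is a minimal conceptual sketch (plain TypeScript, not Nected syntax) of what such a rule does: each input attribute behaves like an independent fetch, all requests start at the same time, and the 'Result' section plays the role of the combined return value. The endpoint URLs and field names below are purely illustrative.

```typescript
// Conceptual equivalent of a Simple Rule with three API input attributes.
// Each attribute is an independent fetch; all three requests start together.
async function runRule() {
  const [pricing, inventory, reviews] = await Promise.all([
    fetch("https://api.example.com/pricing").then((r) => r.json()),
    fetch("https://api.example.com/inventory").then((r) => r.json()),
    fetch("https://api.example.com/reviews").then((r) => r.json()),
  ]);

  // The 'Result' section corresponds to this return value:
  // one combined object, or the individual responses if you prefer.
  return { pricing, inventory, reviews };
}
```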
When is this method useful?
The Single Rule method works best when you have a fixed set of independent APIs that need to execute simultaneously without complex processing logic between them. This approach excels in scenarios where you want to aggregate data from multiple sources quickly and efficiently, treating each API as a simple data fetcher rather than requiring sophisticated business logic or conditional processing for individual calls.
Real Estate Property Details Aggregation
Imagine a real estate platform displaying comprehensive property information. When a user views a property listing, the system needs to simultaneously fetch property details from the MLS database API, neighborhood crime statistics from the local government API, school ratings from the education department API, and recent comparable sales from the market analysis API. Since these are all independent data sources that collectively create the complete property profile, the Single Rule method provides the most straightforward implementation. You configure each API as an input attribute and combine all responses into a unified property information object, delivering a complete view to users with minimal latency.
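As a rough illustration, the unified property object mentioned above might be shaped like this; the field names are hypothetical and only meant to show how the four responses combine into a single result:

```typescript
// Hypothetical shape of the combined 'Result' object for the property page.
interface PropertyProfile {
  listing: { id: string; price: number; bedrooms: number };        // MLS database API
  crimeStats: { incidentsPerYear: number; safetyIndex: number };   // local government API
  schoolRatings: { district: string; averageRating: number };      // education department API
  comparableSales: Array<{ address: string; soldPrice: number }>;  // market analysis API
}
```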
E-commerce Product Recommendation Engine
Consider an e-commerce product page that displays personalized recommendations alongside product details. The system must fetch the main product information from the catalog API, user browsing history from the analytics API, inventory levels from the warehouse API, and dynamic pricing from the pricing engine API. These APIs operate independently but their combined data creates the personalized shopping experience. The Single Rule method handles this perfectly because there's no complex decision-making between API calls: you simply need all the data to render the page effectively.
Customer Support Dashboard Data Compilation
A customer support representative needs instant access to comprehensive customer information when handling inquiries. The dashboard pulls customer profile data from the CRM API, recent order history from the order management API, support ticket history from the helpdesk API, and account status from the billing API. Since support representatives need this complete picture immediately, and none of these APIs depend on each other's responses, the Single Rule method ensures all information loads concurrently, providing the fastest possible response time for critical customer service scenarios.
Financial Portfolio Overview Display
Investment platforms often need to present users with a complete financial snapshot that includes current account balances from banking APIs, portfolio performance from investment management APIs, market news from financial data providers, and regulatory updates from compliance systems. These data sources are entirely independent, yet together they form the comprehensive view investors need to make informed decisions. The Single Rule method eliminates unnecessary complexity while ensuring all critical financial information loads simultaneously.
2. Using a RuleSet (Recommended When Each API Has Its Own Logic)
This method is best suited when each API requires independent pre-processing or post-processing logic.
Steps to Use RuleSet for Parallel API Execution:
Create Individual Rules:
Create a separate rule for each API call.
Each rule should handle just one API (and optionally its transformation).
Create a RuleSet:
Navigate to Rule Editor and choose RuleSet.
Add all the rules you created.
The RuleSet will now invoke all the selected rules in parallel.
Use the RuleSet in Workflow:
In your workflow, add a Rule Node.
Choose the RuleSet from the dropdown.
On execution, all rules inside the RuleSet—and hence all APIs—will be called in parallel.
The collective output of all rules in the RuleSet will be available in a single response object.
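For example, assuming the RuleSet contains three rules named fetchOrders, fetchTickets, and fetchBilling (the names and fields are hypothetical), the combined response might look roughly like this:

```typescript
// Hypothetical combined response from a RuleSet with three rules.
// Each key holds one rule's output; all three outputs were produced concurrently.
const ruleSetOutput = {
  fetchOrders:  { orders: [{ id: "ORD-1", total: 499 }] },
  fetchTickets: { openTickets: 2, lastTicketId: "TCK-88" },
  fetchBilling: { status: "active", balanceDue: 0 },
};
```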
When is this method useful?
The RuleSet method becomes essential when your parallel API calls require different business logic, validation rules, or data transformation processes for each endpoint. Unlike the Single Rule approach where APIs act as simple data fetchers, RuleSet excels when each API call needs its own intelligent processing layer. This method provides modular separation of concerns, allowing you to maintain, version, and modify the logic for each API independently while still executing them concurrently.
Multi-Platform Social Media Content Distribution
Consider a social media management platform that publishes content across Facebook, Twitter, LinkedIn, and Instagram simultaneously. Each platform has unique content requirements that demand specialized processing logic. Facebook posts might need hashtag optimization and link preview formatting, Twitter requires character count validation and thread splitting for longer content, LinkedIn demands professional tone analysis and company mention processing, while Instagram needs image resizing and caption enhancement. Each platform's rule contains its own validation logic, content transformation algorithms, and error handling procedures. The RuleSet method allows you to maintain separate, focused rules for each platform while publishing content across all channels concurrently, ensuring that platform-specific requirements don't interfere with each other.
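A minimal sketch of this pattern, assuming hypothetical per-platform helpers rather than real platform APIs, shows how each rule can keep its own processing logic while every platform is published to concurrently:

```typescript
// Sketch of per-platform rules with their own logic, executed in parallel
// (the helpers below are hypothetical stand-ins, not Nected or platform APIs).
async function publishToFacebook(post: string) {
  const formatted = post; // hashtag optimization and link-preview formatting would go here
  return { platform: "facebook", formatted };
}

async function publishToTwitter(post: string) {
  const formatted = post.slice(0, 280); // character-count validation / thread splitting
  return { platform: "twitter", formatted };
}

async function publishToLinkedIn(post: string) {
  const formatted = post; // tone analysis and company-mention processing would go here
  return { platform: "linkedin", formatted };
}

async function distributeContent(post: string) {
  // Each "rule" owns its logic, but all of them run at the same time.
  return Promise.all([
    publishToFacebook(post),
    publishToTwitter(post),
    publishToLinkedIn(post),
  ]);
}
```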
Customer Identity Verification Across Multiple Services
Financial institutions often need to verify customer identities through various specialized services during account opening. The Aadhaar verification rule might include OCR processing and document validation logic, the PAN verification rule could contain income assessment algorithms and tax validation checks, the credit bureau rule might implement risk scoring calculations and historical analysis, while the sanctions screening rule requires complex name matching algorithms and watchlist processing. Each verification service demands its own sophisticated business logic, data formatting requirements, and compliance checks. Using RuleSet allows each verification type to maintain its specialized processing logic while running all verifications concurrently, ensuring comprehensive identity validation without sacrificing the unique requirements of each service.
E-commerce Inventory Synchronization Across Multiple Warehouses
Large e-commerce platforms must synchronize inventory levels across different warehouse management systems that each have unique data formats and business rules. The main warehouse rule might include bulk inventory processing and automated reorder logic, the dropshipper rule could contain supplier reliability scoring and lead time calculations, the third-party logistics rule might implement capacity planning and shipping optimization algorithms, while the retail store rule requires real-time stock allocation and reservation processing. Each warehouse system operates with different data structures, update frequencies, and business constraints. RuleSet enables each warehouse integration to maintain its specific processing logic while updating inventory levels across all systems simultaneously, ensuring accurate stock levels without compromising the unique operational requirements of each fulfillment channel.
Insurance Claim Processing Across Different Policy Types
Insurance companies process claims through multiple specialized systems depending on policy types, each requiring distinct evaluation criteria and processing workflows. The auto insurance rule might include damage assessment algorithms and repair cost estimation logic, the health insurance rule could contain medical code validation and provider network verification, the property insurance rule might implement loss adjustment calculations and coverage limitation checks, while the life insurance rule requires beneficiary validation and actuarial assessment processing. Each claim type follows different regulatory requirements, evaluation methodologies, and approval workflows. The RuleSet approach allows each policy type to maintain its specialized claim processing logic while evaluating multiple claims concurrently, ensuring that industry-specific requirements and compliance standards are met without creating conflicts between different claim processing methodologies.
3. Using Loop Node + Switch Block (Advanced Configuration)
This method is useful when you need to conditionally trigger different APIs in parallel based on dynamic inputs.
Steps to Use Loop + Switch for Parallel API Calls:
Create a Helper Workflow with Switch Block:
Open Workflow Editor.
Add a Switch Block node.
Create a case for each API call (e.g., case = "PARALLEL_1", case = "PARALLEL_2").
Inside each case, add a REST API Node to call the respective API.
Publish this helper workflow.
Use Loop Node in Main Workflow:
In your primary workflow, add a Loop Node.
Set the loop input as a list of case values (e.g., ["PARALLEL_1", "PARALLEL_2", "PARALLEL_3"]).
Set concurrency equal to the number of parallel APIs you wish to run.
Invoke Switch Workflow via Loop:
Inside the loop, add a Workflow Node.
Select the helper workflow with the Switch Block.
Pass the case input to the switch logic using a mapped variable (e.g., apiName).
The Loop Node will concurrently run multiple instances of the Switch workflow, executing all matched APIs in parallel.
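Conceptually, the Loop Node behaves like a concurrency-limited map over the case list, with each iteration invoking the helper workflow whose Switch Block routes the case value to the matching API. Here is a rough TypeScript sketch of that flow, with the endpoints as placeholders:

```typescript
// Conceptual sketch of the Loop Node + Switch Block pattern (not Nected syntax).
// Each loop iteration invokes the helper workflow with one case value;
// the Switch Block then routes that value to the matching REST API call.
async function helperWorkflow(apiName: string): Promise<unknown> {
  switch (apiName) {
    case "PARALLEL_1":
      return fetch("https://api.example.com/one").then((r) => r.json());
    case "PARALLEL_2":
      return fetch("https://api.example.com/two").then((r) => r.json());
    case "PARALLEL_3":
      return fetch("https://api.example.com/three").then((r) => r.json());
    default:
      throw new Error(`Unknown case: ${apiName}`);
  }
}

async function mainWorkflow() {
  const cases = ["PARALLEL_1", "PARALLEL_2", "PARALLEL_3"];
  // With concurrency set equal to the number of cases, all iterations run at once.
  const results = await Promise.all(cases.map((c) => helperWorkflow(c)));
  return results;
}
```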
When is this method useful?
The Loop Node + Switch Block method becomes invaluable when your parallel API execution requirements depend on dynamic conditions, variable inputs, or complex decision-making logic that determines which APIs should be called and how many times. This advanced approach excels in scenarios where the execution pattern isn't fixed but instead adapts based on runtime data, user attributes, or business rules. Unlike the previous methods that work with predetermined API sets, this method provides sophisticated control over parallel execution flow, making it ideal for applications that need intelligent, conditional API orchestration.
Dynamic Partner Integration Based on User Location and Preferences
Consider a travel booking platform that connects users with various service providers based on their specific location, travel dates, and preferences. When a user searches for accommodations in a particular city, the system needs to dynamically determine which partner APIs to call based on multiple factors. If the user is searching in Europe, the system might call Booking.com, Expedia, and local European hotel chains, but if they're searching in Asia, it would call Agoda, local Asian platforms, and regional bed-and-breakfast networks. Additionally, business travelers might trigger calls to corporate housing APIs, while leisure travelers activate vacation rental and boutique hotel APIs. The Loop + Switch method handles this complexity perfectly because it can evaluate the user's profile, destination, and preferences to create a dynamic list of relevant partner APIs, then execute them all in parallel while maintaining the flexibility to add or remove partners based on real-time business rules and availability.
Conditional Data Enrichment Based on Customer Tier and Product Interest
Enterprise software platforms often need to enrich customer data differently depending on the customer's subscription tier and specific product interests. For premium enterprise customers showing interest in advanced analytics features, the system might simultaneously call external data providers like Salesforce, HubSpot integration APIs, market intelligence services, and industry-specific data enrichment platforms. However, for mid-tier customers interested in basic CRM functionality, it would only call essential contact enrichment APIs and basic company information services. Startup customers might trigger calls to free public APIs and basic validation services only. The Loop + Switch approach allows the system to dynamically build a list of appropriate enrichment APIs based on customer attributes, then execute all relevant calls in parallel, ensuring that each customer receives the appropriate level of data enrichment without wasting resources on unnecessary API calls or overwhelming lower-tier customers with premium features they cannot access.
Multi-Tenant API Routing Based on Organization Configuration
Software-as-a-Service platforms serving multiple enterprise clients often need to route API calls to different backend services based on each organization's specific configuration and integration requirements. When processing a business workflow for Organization A, the system might need to call their custom Salesforce instance, their specific accounting software API, their preferred notification service, and their industry-specific compliance checking APIs. Organization B might require calls to their Microsoft Dynamics integration, their custom inventory management system, their specialized reporting APIs, and their unique audit trail services. Each organization's configuration dictates not only which APIs to call but also how many and in what combination. The Loop + Switch method excels here because it can dynamically read each organization's configuration, build an appropriate list of API calls, and execute them all in parallel while maintaining the flexibility to accommodate new integrations and changing business requirements without modifying the core workflow logic.
Event-Driven Notification Distribution with Dynamic Channel Selection
Modern communication platforms need to deliver notifications across multiple channels based on user preferences, message urgency, and content type. When a critical security alert needs to be sent, the system might determine that a particular user should receive notifications via email, SMS, Slack, and push notifications simultaneously. However, for routine updates, the same user might only receive notifications through email and in-app messaging. Business users might have different notification channels configured for different types of content: financial alerts through SMS and email, system updates through Slack and Teams, and marketing communications through email and push notifications only. The Loop + Switch method provides the perfect framework for this dynamic notification routing because it can evaluate the message type, user preferences, and urgency level to create a customized list of notification channels, then execute all selected delivery APIs in parallel, ensuring that users receive timely communications through their preferred channels without being overwhelmed by unnecessary notifications.
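A small sketch of this dynamic routing, with the channel names and delivery helper as assumptions, shows how the Loop Node's input list can be built at runtime before every selected channel is invoked in parallel:

```typescript
// Hypothetical sketch: build the Loop Node's input list from message urgency and
// user preferences, then deliver on every selected channel concurrently.
type Channel = "email" | "sms" | "slack" | "push";

function selectChannels(urgency: "critical" | "routine", preferred: Channel[]): Channel[] {
  // Critical alerts go to every preferred channel; routine updates to email only.
  return urgency === "critical" ? preferred : preferred.filter((c) => c === "email");
}

async function deliver(channel: Channel, message: string) {
  // Stand-in for the per-channel REST API call inside the Switch Block.
  return { channel, message, delivered: true };
}

async function notify(urgency: "critical" | "routine", message: string) {
  const channels = selectChannels(urgency, ["email", "sms", "slack", "push"]);
  return Promise.all(channels.map((c) => deliver(c, message)));
}
```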
A Parallel Branch node is coming soon.
Summary
Parallel API execution in Nected workflows dramatically improves performance by eliminating sequential wait times and optimizing resource utilization. Whether you're aggregating data from multiple sources, processing independent business logic, or implementing dynamic conditional routing, choosing the right parallelization method ensures optimal workflow efficiency.
Here is a brief comparison of the three methods and what each is best suited for:
| Method | Suitable For | Configuration Complexity | Parallelism Scope |
| --- | --- | --- | --- |
| Single Rule | Simple API fan-out scenarios | Low | All APIs as input attributes |
| RuleSet | Independent logic or pre-processing for each API | Medium | Rules executed in parallel |
| Loop + Switch Block | Dynamic execution based on conditions or arrays | High | Workflows run in parallel |
Each method provides significant performance gains over sequential execution while maintaining clean, maintainable workflow architecture. Start with the simplest approach that meets your requirements, and scale to more sophisticated patterns as your integration complexity grows.