USMAN’S INSIGHTS
AI ARCHITECT

© 2026 Muhammad Usman Akbar. All rights reserved.

Workflow Patterns: Chaining & Fan-Out

You've built workflows with sequential activities and learned to manage their lifecycle. Now a business requirement arrives: process a batch of 50 task items overnight. Each item needs validation, enrichment, and scoring. Running them one at a time would take hours. Running them in parallel would take minutes.

This is where workflow patterns become essential. The right pattern transforms a multi-hour batch job into a quick parallel operation. The wrong pattern creates race conditions or ignores critical dependencies.

Dapr Workflows provides two fundamental patterns that cover most orchestration needs:

  • Task Chaining: When step B requires step A's output (sequential dependencies)
  • Fan-Out/Fan-In: When steps are independent and can run simultaneously (parallel processing)

Understanding when to use each pattern is the difference between an efficient workflow and a slow, brittle one.


Task Chaining: Sequential Data Pipelines

Task chaining executes activities in sequence, passing each activity's output as the next activity's input. Think of it as an assembly line: raw material enters, gets transformed at station 1, the transformed output moves to station 2, and so on until the final product emerges.

```text
Input ──> [Step 1] ──> result1 ──> [Step 2] ──> result2 ──> [Step 3] ──> Final Output
```

When to Use Chaining

Use task chaining when:

  • Each step depends on the previous step's output
  • Order matters (you can't score before you validate)
  • Early failures should prevent later work
  • You need a clear audit trail of transformations

Implementing Task Chaining

Here's a complete task processing pipeline that validates, enriches, and scores tasks in sequence:

```python
import dapr.ext.workflow as wf
from dataclasses import dataclass


@dataclass
class TaskInput:
    task_id: str
    title: str
    description: str


@dataclass
class TaskOutput:
    task_id: str
    status: str
    score: int
    processed_at: str


def task_pipeline_workflow(ctx: wf.DaprWorkflowContext, task: TaskInput):
    """Process task through validation, enrichment, and scoring stages.

    The activities (validate_task, enrich_task, score_task) are defined
    and registered elsewhere in the application.
    """
    # Step 1: Validate - check task meets requirements
    validation_result = yield ctx.call_activity(validate_task, input=task)
    if not validation_result["is_valid"]:
        return TaskOutput(
            task_id=task.task_id,
            status="rejected",
            score=0,
            processed_at=ctx.current_utc_datetime.isoformat(),
        )

    # Step 2: Enrich - add metadata using validation result
    enriched_task = yield ctx.call_activity(
        enrich_task,
        input={"task": task, "validation": validation_result},
    )

    # Step 3: Score - calculate priority using enriched data
    score_result = yield ctx.call_activity(score_task, input=enriched_task)

    return TaskOutput(
        task_id=task.task_id,
        status="completed",
        score=score_result["score"],
        processed_at=ctx.current_utc_datetime.isoformat(),
    )
```

Output:

```text
Workflow started: task-pipeline-abc123
Step 1: Validating task task-001
  -> Validation passed
Step 2: Enriching task task-001
  -> Enriched with tags: ["high-priority"]
Step 3: Scoring task task-001
  -> Score calculated: 85
Workflow completed: {"task_id": "task-001", "status": "completed", "score": 85}
```

Fan-Out/Fan-In: Parallel Processing

Fan-out/fan-in schedules multiple activities simultaneously, waits for all to complete, then aggregates results. Think of it as a team working on independent tasks: everyone works in parallel, and you combine the results at the end.

```text
            ┌──> [Process Item 1] ──┐
            │                       │
Input ──────┼──> [Process Item 2] ──┼──> Aggregate ──> Output
            │                       │
            └──> [Process Item 3] ──┘
```

Implementing Fan-Out/Fan-In

Here's a batch processor that analyzes multiple tasks in parallel:

```python
import random

import dapr.ext.workflow as wf


def batch_analysis_workflow(ctx: wf.DaprWorkflowContext, task_ids: list[str]):
    """Analyze multiple tasks in parallel and aggregate scores."""
    # Fan-out: Schedule all tasks in parallel
    parallel_tasks = [
        ctx.call_activity(analyze_task, input=task_id)
        for task_id in task_ids
    ]

    # Fan-in: Wait for ALL tasks to complete
    results = yield wf.when_all(parallel_tasks)

    # Aggregate results
    total_score = sum(r["score"] for r in results)
    avg_score = total_score / len(results) if results else 0

    return {
        "processed_count": len(results),
        "total_score": total_score,
        "average_score": round(avg_score, 2),
        "individual_results": results,
    }


def analyze_task(ctx, task_id: str) -> dict:
    """Analyze a single task - can run in parallel with others."""
    # This runs independently of other instances
    score = random.randint(60, 100)
    return {"task_id": task_id, "score": score}
```
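Since the Dapr version needs a running sidecar, here is a plain-asyncio sketch of the same fan-out/fan-in shape that you can run anywhere. `asyncio.gather` plays the role of `wf.when_all` (schedule everything, then await all results); the per-task scoring is a fake stand-in for real analysis work:

```python
import asyncio


async def analyze_task(task_id: str) -> dict:
    """Stand-in for a real analysis activity."""
    await asyncio.sleep(0)                 # placeholder for real async work
    score = 60 + (hash(task_id) % 41)      # fake score in 60..100
    return {"task_id": task_id, "score": score}


async def batch_analysis(task_ids: list[str]) -> dict:
    coros = [analyze_task(t) for t in task_ids]   # fan-out: schedule all
    results = await asyncio.gather(*coros)        # fan-in: wait for all
    total = sum(r["score"] for r in results)
    return {
        "processed_count": len(results),
        "total_score": total,
        "average_score": round(total / len(results), 2) if results else 0,
    }


summary = asyncio.run(batch_analysis([f"task-{i:03d}" for i in range(5)]))
```

The important difference: Dapr's version is durable (if the host crashes mid-batch, completed activity results are replayed from state, not re-executed), while the asyncio version restarts from scratch.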

Time Savings from Parallelism

For a batch of 50 items averaging 2 seconds each:

| Pattern | Execution Time |
| --- | --- |
| Sequential (chaining) | 100 seconds |
| Parallel (fan-out) | ~2 seconds |
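The arithmetic behind the table is worth making explicit, including a middle option the table omits: batched fan-out (assuming each item takes 2 seconds and the fully parallel case has enough workers to run every item at once):

```python
import math

items, secs_per_item = 50, 2.0

sequential = items * secs_per_item                 # one after another
parallel = secs_per_item                           # all 50 at once
batched = math.ceil(items / 10) * secs_per_item    # fan-out 10 at a time

print(sequential, parallel, batched)  # 100.0 2.0 10.0
```

Batched fan-out trades some speed for bounded concurrency, which matters once downstream services have rate limits (see the safety note at the end of this chapter).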

Pattern Selection Decision Framework

| Question | If Yes | If No |
| --- | --- | --- |
| Does step B need step A's output? | Chain them | Consider parallel |
| Are items independent of each other? | Fan-out | Chain or separate workflows |
| Do you need aggregated results? | Fan-in with when_all | Return individual results |
| Must all items complete for success? | when_all + error handling | when_any or individual try/catch |
| Is order significant? | Chain | Fan-out (order undefined) |
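The first two questions in the framework can be encoded as a tiny helper to make the decision mechanical; the function name and rules here are illustrative, not part of any Dapr API:

```python
def choose_pattern(needs_prior_output: bool, items_independent: bool) -> str:
    """Pick between the two patterns in this chapter."""
    if needs_prior_output:
        return "chain"            # e.g., Reserve -> Pay -> Ship
    if items_independent:
        return "fan-out"          # e.g., thumbnail 100 independent images
    return "separate workflows"   # coupled but not sequential: split them


print(choose_pattern(needs_prior_output=True, items_independent=False))   # chain
print(choose_pattern(needs_prior_output=False, items_independent=True))   # fan-out
```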

Real-World Pattern Applications

| Scenario | Pattern | Reasoning |
| --- | --- | --- |
| Order: Reserve → Pay → Ship | Chain | Each step depends on previous success |
| Process 100 images for thumbnails | Fan-out | Images are independent |
| ETL: Extract → Transform → Load | Chain | Transform needs extracted data |
| Send notifications to 50 users | Fan-out | Each notification is independent |
| Validate form → Save → Confirm | Chain | Each step depends on previous |
| Run tests across 10 environments | Fan-out | Environments are independent |

Reflect on Your Skill

You extended your dapr-deployment skill in Module 7.0. Does it explain workflow patterns and when to use each?

Test Your Skill

```text
Using my dapr-deployment skill, help me design a workflow for processing
customer orders. Each order needs: (1) inventory check, (2) payment
processing, (3) shipping label creation. Should these be chained or parallel?
```

If your skill recommends task chaining and recognizes the sequential dependency, it's working correctly.


Try With AI

Prompt 1: Design a Chained Pipeline

```text
Help me implement a Dapr workflow for document processing. The pipeline needs:
1. Parse the document (extract text and metadata)
2. Classify the document type using the parsed content
3. Route to appropriate handler based on classification
Show me the complete workflow function and activities with proper data passing.
```

Prompt 2: Implement Parallel Processing

```text
I need to analyze sentiment for 20 customer feedback items. Each analysis is
independent and takes about 2 seconds. Show me how to:
1. Fan-out: Schedule all 20 analyses in parallel
2. Fan-in: Wait for all to complete with when_all
3. Aggregate: Calculate average sentiment score
```

Prompt 3: Pattern Selection Challenge

```text
I'm building a workflow for an e-commerce checkout. The steps are:
- Validate cart items are in stock
- Calculate shipping cost
- Apply discount codes
- Process payment
- Send confirmation email
- Update inventory
Help me decide: which steps should be chained, and which can be parallel?
```

Safety Note: When implementing fan-out patterns, be aware of resource limits. Scheduling 10,000 parallel tasks at once could overwhelm downstream services or exhaust workflow engine resources. For large batches, consider batching your fan-out (process 100 at a time).
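One way to implement that batching is to chunk the input list and fan out one chunk at a time, so at most `batch_size` activities are ever in flight; `chunked` below is a plain helper, not a Dapr API, and the workflow loop is sketched in comments under that assumption:

```python
def chunked(items: list, batch_size: int) -> list[list]:
    """Split a list into consecutive chunks of at most batch_size items."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]


# Inside a workflow, this would wrap the fan-out from earlier:
#
#   results = []
#   for batch in chunked(task_ids, 100):
#       tasks = [ctx.call_activity(analyze_task, input=t) for t in batch]
#       results += yield wf.when_all(tasks)   # at most 100 in flight

batches = chunked(list(range(250)), 100)
print([len(b) for b in batches])  # [100, 100, 50]
```

Each `when_all` inside the loop completes before the next batch is scheduled, capping concurrency at the batch size at the cost of some wall-clock time.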