Data migration is often complex and error-prone—especially when dealing with large volumes, legacy systems, or external APIs. Azure Durable Functions offer a reliable, scalable, and stateful approach to orchestrating such migrations.
In this guide, we’ll walk through a practical example of using Durable Function Orchestrators for a clean, efficient migration. You’ll see step-by-step instructions and code samples you can adapt to your own systems.
Why Use Durable Functions?
Common data migration tasks include:
- Reading data in batches
- Transforming records
- Writing to a new system
- Retrying failed steps
- Tracking progress
Durable Functions make this easier by:
- Reusing existing APIs for data fetching and transformation
- Coordinating steps in a defined sequence
- Automatically saving progress between executions
- Handling retries and exceptions gracefully
- Enabling parallel processing when needed
- Scaling automatically to match workload demand
Architecture Overview
[HTTP or Timer Trigger]
↓
[Orchestrator Function]
↓
[Fetch Batch] → [Migrate Batch]
Use Case: Migrating Customers in Batches
In this example, we’ll migrate customer data from a legacy API to an Azure SQL Database in batches of 100 records.
Step-by-Step Implementation
1. Install the Durable Task Extension

dotnet add package Microsoft.Azure.WebJobs.Extensions.DurableTask

This package targets the in-process .NET model, which the attribute-based samples below assume.
2. Start Migration via HTTP Trigger
[FunctionName("StartMigration")]
public static async Task<IActionResult> StartMigration(
    [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
    [DurableClient] IDurableOrchestrationClient starter,
    ILogger log)
{
    var instanceId = await starter.StartNewAsync("MigrateOrchestrator", null);
    log.LogInformation($"Started migration orchestration with ID = '{instanceId}'.");
    return starter.CreateCheckStatusResponse(req, instanceId);
}
3. Define the Orchestrator Function
[FunctionName("MigrateOrchestrator")]
public static async Task RunOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    int page = 0;
    const int pageSize = 100;

    while (true)
    {
        // Fetch the next page of customers from the legacy API.
        var batch = await context.CallActivityAsync<List<CustomerDto>>(
            "FetchCustomersBatch", new { Page = page, PageSize = pageSize });

        // An empty batch means we've reached the end of the data.
        if (batch == null || batch.Count == 0) break;

        await context.CallActivityAsync("MigrateCustomerBatch", batch);
        page++;
    }
}
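When batches are independent, the orchestrator above can also fan out and migrate several batches in parallel, then fan in with Task.WhenAll — this is the standard fan-out/fan-in pattern. The sketch below reuses the FetchCustomersBatch and MigrateCustomerBatch activities from this guide; it is a variant, not a replacement, and assumes batch order does not matter for the target system:

```csharp
[FunctionName("MigrateOrchestrator_FanOut")]
public static async Task RunFanOutOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    const int pageSize = 100;
    var migrationTasks = new List<Task>();

    // Fetch pages sequentially, but let the migrations run in parallel.
    for (int page = 0; ; page++)
    {
        var batch = await context.CallActivityAsync<List<CustomerDto>>(
            "FetchCustomersBatch", new { Page = page, PageSize = pageSize });
        if (batch == null || batch.Count == 0) break;

        // Don't await yet — schedule the migration and keep fetching.
        migrationTasks.Add(context.CallActivityAsync("MigrateCustomerBatch", batch));
    }

    // Fan-in: wait for all in-flight batch migrations to complete.
    await Task.WhenAll(migrationTasks);
}
```

Parallel writes increase load on the target database, so this variant trades throughput for pressure on the destination — measure before adopting it.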
4. Fetch Customers from Legacy API
// A typed input avoids the pitfalls of dynamic deserialization.
public class FetchBatchRequest
{
    public int Page { get; set; }
    public int PageSize { get; set; }
}

[FunctionName("FetchCustomersBatch")]
public static async Task<List<CustomerDto>> FetchCustomersBatch(
    [ActivityTrigger] FetchBatchRequest input)
{
    return await LegacyApi.GetCustomersAsync(input.Page, input.PageSize);
}
5. Migrate Customers to New System
[FunctionName("MigrateCustomerBatch")]
public static async Task MigrateCustomerBatch(
    [ActivityTrigger] List<CustomerDto> customers, ILogger log)
{
    foreach (var customer in customers)
    {
        try
        {
            await NewDatabase.InsertCustomerAsync(customer);
        }
        catch (Exception ex)
        {
            // Log and continue so one bad record doesn't abort the whole batch.
            log.LogError(ex, "Failed to migrate customer {CustomerId}", customer.Id);
        }
    }
}
Enhancements to Consider
- Retry policies: Handle transient errors without manual restarts
- Dead-lettering: Log failed records for future reprocessing
- Notifications: Send progress updates via email or SignalR
- Sub-orchestration: Break down complex logic into smaller, manageable workflows
- API reuse: Use existing data/service APIs to reduce new code and avoid duplication
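As a sketch of the first enhancement: the Durable Task extension supports automatic retries through CallActivityWithRetryAsync and RetryOptions, so transient failures (network blips, throttling) are retried without restarting the orchestration. The values below are illustrative, not recommendations:

```csharp
// Inside the orchestrator: retry a flaky activity up to 3 times,
// starting with a 5-second delay and backing off exponentially.
var retryOptions = new RetryOptions(
    firstRetryInterval: TimeSpan.FromSeconds(5),
    maxNumberOfAttempts: 3)
{
    BackoffCoefficient = 2.0
};

await context.CallActivityWithRetryAsync("MigrateCustomerBatch", retryOptions, batch);
```

RetryOptions also exposes properties such as MaxRetryInterval and a Handle predicate for deciding which exceptions are worth retrying.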
Monitoring and Observability
You can monitor function execution and progress through:
- Azure Portal: Go to your Function App → Durable Functions
- Application Insights: View logs, telemetry, and custom events
- Status Endpoint: Use the HTTP response from the starter function to track execution
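Progress can also be surfaced through the orchestration's custom status, which the status endpoint includes in its response. A minimal sketch, added inside the orchestrator's batch loop:

```csharp
// Record how far the migration has gotten after each batch.
// The value appears in the "customStatus" field of the status endpoint's JSON.
context.SetCustomStatus(new { LastCompletedPage = page, PageSize = pageSize });
```

Callers polling the statusQueryGetUri returned by CreateCheckStatusResponse can then see live progress without any extra infrastructure.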
Summary
| Feature | Durable Function Support |
|---|---|
| Retries | Yes |
| Stateful execution | Yes |
| Parallel processing | Yes |
| Sub-orchestration | Yes |
| Long-running workflows | Yes |
| API reuse | Yes |