NashTech Blog

Rate Limiting Middleware in a Web Framework Using .NET


As the demand for APIs grows, developers face the challenge of managing request traffic effectively to prevent server overloads and abuse. Rate limiting is a common solution that ensures fair resource distribution among users while protecting servers from being overwhelmed by excessive requests. In this blog, we’ll explore the concept of rate limiting and demonstrate how to implement it in a .NET application using middleware.

What Is Rate Limiting?

Rate limiting controls how many requests a client can send to a server within a specific timeframe. If the client exceeds this limit, the server either delays further requests or rejects them with an appropriate response.

Why Use Rate Limiting?
  • Prevent Abuse: Stops malicious users or bots from spamming the server with requests.
  • Ensure Fair Usage: Distributes server resources fairly across all users.
  • Optimize Server Performance: Keeps the server responsive for legitimate users by managing traffic spikes.

Common Rate Limiting Algorithms

  1. Fixed Window: Counts requests within fixed intervals. It’s simple, but a client can send a full quota at the end of one window and another at the start of the next, doubling the burst at the boundary.
  2. Sliding Window: Tracks requests over a rolling window, which smooths out those boundary bursts.
  3. Token Bucket: Allows short bursts of traffic up to a capacity, then throttles requests to the steady token refill rate.
  4. Leaky Bucket: Smooths the request flow by draining queued requests at a constant rate.
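
For contrast with the fixed window approach, here is a minimal sketch of the token bucket idea (the class name and parameters are illustrative, not part of the demo project):

```csharp
using System;

// Illustrative token bucket: allows short bursts up to Capacity,
// then throttles to the steady refill rate.
public class TokenBucket
{
    private readonly int _capacity;
    private readonly double _refillPerSecond;
    private double _tokens;
    private DateTime _lastRefill = DateTime.UtcNow;
    private readonly object _lock = new();

    public TokenBucket(int capacity, double refillPerSecond)
    {
        _capacity = capacity;
        _refillPerSecond = refillPerSecond;
        _tokens = capacity; // start full so an initial burst is allowed
    }

    public bool TryConsume()
    {
        lock (_lock)
        {
            // Refill tokens proportionally to the time elapsed, capped at capacity.
            var now = DateTime.UtcNow;
            _tokens = Math.Min(_capacity,
                _tokens + (now - _lastRefill).TotalSeconds * _refillPerSecond);
            _lastRefill = now;

            if (_tokens < 1) return false; // throttled
            _tokens -= 1;
            return true;
        }
    }
}
```

A bucket of capacity 5 refilled at 1 token per second would allow a burst of 5 requests, then roughly one request per second afterwards.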

For this blog, we’ll implement a Fixed Window rate limiter using middleware in .NET.

Implementing Rate Limiting Middleware in .NET

Middleware in ASP.NET Core is a perfect place to implement rate limiting. Middleware intercepts HTTP requests, processes them, and either forwards them to the next component in the pipeline or generates a response.

Step 1: Create a New ASP.NET Core Project

To begin, create a new ASP.NET Core Web API project:

dotnet new webapi -n RateLimitingDemo
cd RateLimitingDemo

Step 2: Implement the Rate Limiting Middleware

Add a new class named RateLimitingMiddleware:

using Microsoft.AspNetCore.Http;
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class RateLimitingMiddleware
{
    private readonly RequestDelegate _next;
    private static readonly ConcurrentDictionary<string, (int Count, DateTime StartTime)> RequestCounts = new();
    private const int MaxRequests = 5; // Maximum requests allowed
    private const int TimeWindowInSeconds = 60; // Time window in seconds

    public RateLimitingMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        var clientIP = context.Connection.RemoteIpAddress?.ToString();
        if (string.IsNullOrEmpty(clientIP))
        {
            await _next(context);
            return;
        }

        var currentTime = DateTime.UtcNow;

        // Atomically increment the counter, or start a new window if the
        // previous one has expired. A single AddOrUpdate avoids the race
        // of a separate read-check-write sequence under concurrent requests.
        var requestInfo = RequestCounts.AddOrUpdate(
            clientIP,
            _ => (1, currentTime),
            (_, existing) =>
                (currentTime - existing.StartTime).TotalSeconds < TimeWindowInSeconds
                    ? (existing.Count + 1, existing.StartTime)
                    : (1, currentTime));

        if (requestInfo.Count > MaxRequests)
        {
            context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
            await context.Response.WriteAsync("Too many requests. Please try again later.");
            return;
        }

        await _next(context);
    }
}

Step 3: Register Middleware in the Pipeline

Open the Program.cs file and add the middleware to the request pipeline:

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Add rate limiting middleware
app.UseMiddleware<RateLimitingMiddleware>();

app.MapGet("/", () => "Welcome to the rate-limited API!");

app.Run();

Step 4: Test the Middleware

Run the application using:

dotnet run

Send multiple requests to the API endpoint using a tool like curl or Postman (the port may differ from 5000; use the URL printed when the app starts):

curl http://localhost:5000/

After sending more than five requests within the 60-second window, you’ll receive a 429 Too Many Requests response.
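
The burst can also be scripted with a small shell loop (assuming the app is listening on port 5000):

```shell
# Fire 7 requests back-to-back; with a limit of 5 per minute,
# the last two should report HTTP 429.
for i in $(seq 1 7); do
  curl -s -o /dev/null -w "request $i -> %{http_code}\n" http://localhost:5000/
done
```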

Enhancing the Middleware

The above implementation is basic. Let’s look at ways to enhance it for production use:

1. Using Redis for Distributed Rate Limiting

In distributed systems, you’ll need a shared storage mechanism to enforce rate limits across multiple servers. Redis is a great choice for this.

Install Redis Client

Add the Redis package to your project:

dotnet add package StackExchange.Redis

Implement Redis-Based Middleware

Here’s an example of using Redis for rate limiting:

using Microsoft.AspNetCore.Http;
using StackExchange.Redis;
using System;
using System.Threading.Tasks;

public class RedisRateLimitingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly IConnectionMultiplexer _redis;
    private const int MaxRequests = 5;
    private const int TimeWindowInSeconds = 60;

    public RedisRateLimitingMiddleware(RequestDelegate next, IConnectionMultiplexer redis)
    {
        _next = next;
        _redis = redis;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        var clientIP = context.Connection.RemoteIpAddress?.ToString();
        if (string.IsNullOrEmpty(clientIP))
        {
            await _next(context);
            return;
        }

        var db = _redis.GetDatabase();
        var key = $"rate_limit:{clientIP}";

        // INCR is atomic in Redis, so concurrent requests hitting different
        // servers are still counted correctly against the shared key.
        var requestCount = await db.StringIncrementAsync(key);
        if (requestCount == 1)
        {
            // First request in this window: start the expiry timer.
            await db.KeyExpireAsync(key, TimeSpan.FromSeconds(TimeWindowInSeconds));
        }

        if (requestCount > MaxRequests)
        {
            context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
            await context.Response.WriteAsync("Too many requests. Please try again later.");
            return;
        }

        await _next(context);
    }
}

Register Redis Middleware

In Program.cs, configure Redis and register the middleware:

using StackExchange.Redis;

var builder = WebApplication.CreateBuilder(args);

// Connect to a local Redis instance; replace with your own connection string.
builder.Services.AddSingleton<IConnectionMultiplexer>(
    ConnectionMultiplexer.Connect("localhost")
);

var app = builder.Build();

app.UseMiddleware<RedisRateLimitingMiddleware>();

app.MapGet("/", () => "Welcome to the Redis-backed rate-limited API!");

app.Run();

2. Customizing Rate Limits

You can allow different rate limits based on user roles, API keys, or endpoints. For example, you might allow premium users to make more requests than free-tier users.
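
As a sketch of that idea (the tier names and limits below are made up for illustration; the key-to-tier mapping would normally come from a database or configuration), the per-client limit could be resolved before the count is checked:

```csharp
using System.Collections.Generic;

// Illustrative tier lookup: resolve a request limit per window from a tier name.
public static class TierLimits
{
    private static readonly Dictionary<string, int> LimitsByTier = new()
    {
        ["free"] = 5,       // 5 requests per window
        ["premium"] = 100,  // 100 requests per window
    };

    public static int ResolveLimit(string? tier) =>
        tier != null && LimitsByTier.TryGetValue(tier, out var limit)
            ? limit
            : LimitsByTier["free"]; // unknown or missing tier falls back to free
}
```

The middleware would then compare the client’s count against `TierLimits.ResolveLimit(...)` instead of a single `MaxRequests` constant.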

3. Monitoring and Logging

Integrate logging to monitor rate-limiting events and analyze usage patterns. This can help identify abusive clients or adjust rate limits based on traffic trends.
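
For example, the middleware could take an `ILogger` via constructor injection and record each rejection (a sketch; the class name and message template are illustrative):

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using System.Threading.Tasks;

// Sketch: the same middleware shape as above, with an ILogger injected
// so each rejection is recorded with the client IP.
public class LoggingRateLimitSketch
{
    private readonly RequestDelegate _next;
    private readonly ILogger<LoggingRateLimitSketch> _logger;

    public LoggingRateLimitSketch(RequestDelegate next,
        ILogger<LoggingRateLimitSketch> logger)
    {
        _next = next;
        _logger = logger;
    }

    private Task RejectAsync(HttpContext context, string clientIP, int count)
    {
        // Structured logging makes it easy to aggregate rejections by ClientIP.
        _logger.LogWarning(
            "Rate limit exceeded for {ClientIP} ({Count} requests in window)",
            clientIP, count);
        context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
        return context.Response.WriteAsync("Too many requests. Please try again later.");
    }
}
```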

Conclusion

Rate limiting is a crucial feature for managing API traffic effectively. In this blog, we implemented a basic in-memory rate limiter and extended it with Redis for distributed systems. With these techniques, you can protect your APIs from abuse, optimize performance, and ensure fair resource allocation.

By customizing the middleware to suit your application’s needs, you can create a robust solution that scales with your system. Explore other algorithms like token bucket or sliding window for advanced use cases.

teeshajain73125e8884
