NashTech Blog

Implementing Rate Limiting in ASP.NET Core Web API


Rate limiting is essential for building strong and secure web APIs. It enables developers to manage the volume of requests made to an API within a set timeframe. Integrating rate limiting into your .NET Core Web API safeguards server resources, deters abuse, and promotes equitable usage among clients. In this guide, we’ll delve into rate limiting, its advantages, and how to integrate it into a .NET Core Web API using C#.

What is rate limiting?

Rate limiting manages the flow of requests to an API by placing restrictions on the number of requests a client or group can make in a specified timeframe. This technique helps prevent abuse, safeguard server resources, and maintain a seamless experience for all users.
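To make the idea concrete, here is a minimal, illustrative fixed-window counter in C#. This is not the AspNetCoreRateLimit package used later in this guide (which adds multiple periods, client IDs, and distributed stores); the class name and API here are hypothetical, just a sketch of the core idea:

```csharp
using System;
using System.Collections.Generic;

// Illustrative fixed-window rate limiter: each client gets a counter
// that resets when a new time window begins.
public class FixedWindowRateLimiter
{
    private readonly int _limit;
    private readonly TimeSpan _window;
    private readonly Dictionary<string, (DateTime WindowStart, int Count)> _counters = new();

    public FixedWindowRateLimiter(int limit, TimeSpan window)
    {
        _limit = limit;
        _window = window;
    }

    // Returns true if the client may proceed, false if it is throttled.
    public bool TryAcquire(string clientId, DateTime now)
    {
        if (!_counters.TryGetValue(clientId, out var entry) || now - entry.WindowStart >= _window)
        {
            // No counter yet, or the previous window expired: start a new window.
            _counters[clientId] = (now, 1);
            return true;
        }

        if (entry.Count >= _limit)
            return false; // limit reached within the current window

        _counters[clientId] = (entry.WindowStart, entry.Count + 1);
        return true;
    }
}
```

With a limit of 2 per second, for example, a third call from the same client within the same window is rejected, while other clients (and the same client in the next window) are unaffected.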

Why is Rate Limiting Important?

Rate limiting is essential for preserving the performance, stability, and security of your web API. Here’s why it matters:

  1. Protecting Server Resources: Rate limits shield your server from being overwhelmed by a flood of requests, keeping performance steady and the API available to all users.
  2. Defending Against Abuse and Attacks: Rate limiting is a first line of defence against malicious traffic such as distributed denial-of-service (DDoS) and brute-force attacks. Capping the number of requests each client can make limits the damage such traffic can do.
  3. Promoting Fair Usage: Sensible limits prevent any single client from monopolising server resources, giving every consumer fair access to the API.

Implementing Rate Limiting in .NET Core Web API

Now, let’s start with the implementation details of rate limiting in a .NET Core Web API. We will use C# code examples to illustrate the concepts.

Step 1: Install the Required Packages

Before diving in, you’ll want to install the AspNetCoreRateLimit NuGet package, a middleware library for implementing rate limiting in .NET Core Web APIs. Open your project in Visual Studio or your favourite editor and run the following command in the Package Manager Console (or use “dotnet add package AspNetCoreRateLimit” from the .NET CLI):

Install-Package AspNetCoreRateLimit

Step 2: Configure Rate Limiting Options

In your API’s Program.cs file (at the top level in the .NET 6+ minimal hosting model, or inside the Main method in older templates), add the following code to configure the rate limiting options:

using AspNetCoreRateLimit;

// Other configurations...
builder.Services.AddMemoryCache();
builder.Services.Configure<IpRateLimitOptions>(builder.Configuration.GetSection("IpRateLimiting"));
builder.Services.Configure<IpRateLimitPolicies>(builder.Configuration.GetSection("IpRateLimitPolicies"));
builder.Services.AddSingleton<IIpPolicyStore, MemoryCacheIpPolicyStore>();
builder.Services.AddSingleton<IRateLimitCounterStore, MemoryCacheRateLimitCounterStore>();
builder.Services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();
// Required on AspNetCoreRateLimit 4.x and later:
builder.Services.AddSingleton<IProcessingStrategy, AsyncKeyLockProcessingStrategy>();
// Other configurations...

This snippet registers the services rate limiting needs and binds the IP rate limiting options from configuration. Policies and counters are kept in an in-memory cache, which is simple but means limits reset when the app restarts and are not shared across multiple server instances; the package also ships distributed-cache-backed stores for multi-instance deployments.

Step 3: Define Rate Limiting Policies

In your appsettings.json file, add the following configuration to define your rate limiting policies:

"IpRateLimiting": {
    "EnableEndpointRateLimiting": true,
    "StackBlockedRequests": false,
    "RealIpHeader": "X-Real-IP",
    "ClientIdHeader": "X-ClientId",
    "HttpStatusCode": 429,
    "GeneralRules": [{
            "Endpoint": "*",
            "Period": "1s",
            "Limit": 10
        }
    ],
    "ClientRules": []
},
"IpRateLimitPolicies": {
    "EndpointRateLimitPolicy": {
        "ClientIdHeader": "X-ClientId",
        "Period": "1s",
        "Limit": 5,
        "Rules": [
            {
                "Endpoint": "*",
                "Period": "1s",
                "Limit": 5
            },
            {
                "Endpoint": "*",
                "Period": "1m",
                "Limit": 50
            }
        ]
    }
}

In the configuration above, we’ve established a rate limiting policy named “EndpointRateLimitPolicy“. This policy dictates that within a 1-second timeframe, a client can send a maximum of 5 requests to any endpoint. Furthermore, within a 1-minute timeframe, a client is allowed up to 50 requests to any endpoint. These values can be tailored to fit your application’s specific needs.

Step 4: Add Rate Limiting Middleware

Next up, register the rate limiting middleware in your Program.cs file, early in the pipeline (before routing and endpoint execution) so that throttled requests are rejected before they reach your controllers:

// Other middleware configurations...
app.UseIpRateLimiting();
// Other middleware configurations...

By invoking the “UseIpRateLimiting” method, you activate the rate limiting middleware for your application.
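Putting the registration and middleware steps together, a complete minimal Program.cs might look like the sketch below. This is a hedged example, not a definitive setup: the exact API surface varies by AspNetCoreRateLimit version, and seeding the policy store is needed so the “IpRateLimitPolicies” section from appsettings.json is actually loaded:

```csharp
using AspNetCoreRateLimit;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddMemoryCache();

// Bind the options defined in appsettings.json.
builder.Services.Configure<IpRateLimitOptions>(builder.Configuration.GetSection("IpRateLimiting"));
builder.Services.Configure<IpRateLimitPolicies>(builder.Configuration.GetSection("IpRateLimitPolicies"));

// In-memory stores for policies and counters.
builder.Services.AddSingleton<IIpPolicyStore, MemoryCacheIpPolicyStore>();
builder.Services.AddSingleton<IRateLimitCounterStore, MemoryCacheRateLimitCounterStore>();
builder.Services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();
// Required on AspNetCoreRateLimit 4.x and later.
builder.Services.AddSingleton<IProcessingStrategy, AsyncKeyLockProcessingStrategy>();

var app = builder.Build();

// Load the IP policies from configuration into the policy store.
var ipPolicyStore = app.Services.GetRequiredService<IIpPolicyStore>();
await ipPolicyStore.SeedAsync();

// Rate limiting runs before the endpoints it protects.
app.UseIpRateLimiting();
app.MapControllers();
app.Run();
```

Because this is application bootstrap code tied to the ASP.NET Core runtime and the AspNetCoreRateLimit package, treat it as a wiring sketch to adapt rather than a drop-in file.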

Step 5: Handling Rate Limit Exceeded Requests

AspNetCoreRateLimit enforces its rules in the middleware pipeline, so your controllers need no special attributes: any request that exceeds a matching rule is rejected before it ever reaches your action. A rate-limited controller therefore looks like any other:

[ApiController]
[Route("api/[controller]")]
public class MyController : ControllerBase
{
    [HttpGet]
    public IActionResult MyAction()
    {
        // This logic runs only for requests within the rate limit.
        return Ok();
    }
}

When a client exceeds the limit, the middleware returns the status code configured under “HttpStatusCode” (429 Too Many Requests by default) along with a Retry-After header. You can customise the body and status of that response through the “QuotaExceededResponse” option in configuration, or derive from the package’s middleware for fully custom behaviour.
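One way to customise the throttled response is declaratively, in appsettings.json. A sketch of the relevant fragment, which sits alongside the other settings in the “IpRateLimiting” section (based on the package’s documented options; the “{0}”, “{1}”, and “{2}” placeholders are filled with the limit, the period, and the retry-after seconds respectively):

```json
"QuotaExceededResponse": {
    "Content": "{{ \"message\": \"Rate limit exceeded. Maximum allowed: {0} per {1}. Try again in {2} second(s).\" }}",
    "ContentType": "application/json",
    "StatusCode": 429
}
```

This keeps the error payload consistent with the rest of your JSON API without writing any custom middleware.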

Conclusion

Rate limiting serves as a crucial method for managing request flow within a .NET Core Web API. By incorporating rate limiting, you safeguard your server resources, prevent potential misuse, and ensure equitable usage across clients. In this piece, we’ve delved into the concept of rate limiting, underscored its significance, and illustrated how to implement it within a .NET Core Web API using the AspNetCoreRateLimit package. By adhering to the outlined steps, you can fortify the security, reliability, and efficiency of your web API.

It’s worth noting that while rate limiting is pivotal, it’s just one facet of constructing a secure and scalable API. As developers, it’s imperative to remain abreast of best practices and continually monitor and refine your API’s performance and security to furnish users with a seamless and dependable experience.


Vipul Kumar
