How to Rate Limit Requests in ASP.NET Core

Performance is always a concern when you're building a website or public API. Rate limiting is fairly simple to set up for Blazor/ASP.NET Core applications, and it can help prevent abuse and simple DDoS attacks.

Why Rate Limit Requests?

There are several reasons to rate limit requests. No sane human is going to make 100 requests per second for ten minutes straight, so most services should have some kind of limit in place. By default your application will answer every request it receives, so it's a good idea to set a reasonable cap.

Your cloud provider may also offer DDoS protection, which usually provides a strong defense against Layer 3 and Layer 4 attacks aimed at your server. Even so, you should make sure the server itself takes every reasonable precaution against abuse.

For public APIs, you can also set the limit much lower. For example, a particular endpoint might take a long time to process each request; to reduce the load on your server and database, you could restrict that endpoint so that no single IP address can make more than a few requests every few seconds.

Setting Up Rate Limiting In ASP.NET Core

Blazor is built on top of ASP.NET Core, which handles the low-level details of hosting an HTTP server and processing requests, so rate limiting is configured at the ASP.NET Core level. If you're not using Blazor, the steps are the same.

Unfortunately, rate limiting isn't included out of the box in ASP.NET Core. There is, however, a very popular NuGet package, AspNetCoreRateLimit, that does the job admirably. Installing it is as simple as right-clicking your project in Visual Studio, choosing "Manage NuGet Packages…", searching for AspNetCoreRateLimit, and installing it.

There are a few different approaches to rate limiting. If you're running an API that requires a key, we'd recommend rate limiting based on the API key, since that covers every scenario. For most people, though, rate limiting by IP address is probably fine, and it's what AspNetCoreRateLimit does by default.
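If you do go the API-key route, AspNetCoreRateLimit has a parallel set of client-ID options. The following is only a rough sketch, assuming your clients send their key in the ClientIdHeader configured later and that you add a matching "ClientRateLimiting" section to appsettings.json; the registration otherwise mirrors the IP-based setup shown in the next section.

// Sketch only: client-ID (API key) based limiting instead of IP-based.
// Binds a "ClientRateLimiting" section that you would add to appsettings.json.
services.Configure<ClientRateLimitOptions>(Configuration.GetSection("ClientRateLimiting"));
services.AddInMemoryRateLimiting();
services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();

// And in Configure(), use the client middleware instead of the IP one:
app.UseClientRateLimiting();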

The package has to be registered with ASP.NET Core as a service. Services are configured in Startup.cs, inside the ConfigureServices(IServiceCollection services) method.

There are a few services to register. The first call allows options to be loaded from your configuration file, and you'll also need Microsoft's memory cache if you don't have it already. After that, you bind the IpRateLimiting section of the JSON file to the options and add the rate limiter itself.

// requires "using AspNetCoreRateLimit;" at the top of Startup.cs

// needed to load configuration from appsettings.json
services.AddOptions();
// needed to store rate limit counters and ip rules
services.AddMemoryCache();
// load general configuration from the IpRateLimiting section of appsettings.json
services.Configure<IpRateLimitOptions>(Configuration.GetSection("IpRateLimiting"));
// inject counter and rules stores
services.AddInMemoryRateLimiting();
// configuration (resolvers, counter key builders)
services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();

Also in Startup.cs, you need to enable the middleware on the application builder, inside the Configure method:

app.UseIpRateLimiting();
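For context, that call goes in the Configure method, and placing it near the top of the pipeline means over-limit requests are rejected before they ever reach routing or your endpoints. A minimal sketch, assuming a controller-based API (the rest of your pipeline will vary):

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // reject over-limit requests as early as possible
    app.UseIpRateLimiting();

    app.UseRouting();

    app.UseEndpoints(endpoints =>
    {
        endpoints.MapControllers();
    });
}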

Keep in mind that this uses per-instance, in-memory rate limiting. The package also supports distributed stores such as Redis, which you'll need if you're load balancing your application across multiple instances, as sketched below.
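Here is a minimal sketch of the distributed option, assuming you bring in a Redis-backed IDistributedCache (for example via the Microsoft.Extensions.Caching.StackExchangeRedis package) and have a Redis instance reachable at localhost:6379; the exact extension methods can vary between package versions:

// Sketch only: share rate limit counters across load-balanced instances
// by storing them in Redis instead of per-instance memory.
services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // assumed Redis endpoint
});

// Replaces services.AddInMemoryRateLimiting() from the earlier snippet.
services.AddDistributedRateLimiting();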

Configuring Rate Limiting

Once the package is wired into ASP.NET Core, the actual configuration lives in your appsettings.json file. A typical configuration looks like this:

"IpRateLimiting": {
"EnableEndpointRateLimiting": false,
"StackBlockedRequests": true,
"RealIpHeader": "X-Real-IP",
"ClientIdHeader": "X-ClientId",
"HttpStatusCode": 429,
"IpWhitelist": [ "127.0.0.1", "::1/10", "192.168.0.0/24" ],
"EndpointWhitelist": [ "get:/api/license", "*:/api/status" ],
"ClientWhitelist": [ "dev-id-1", "dev-id-2" ],
"GeneralRules": [
{
"Endpoint": "*",
"Period": "1s",
"Limit": 2
},
{
"Endpoint": "*",
"Period": "15m",
"Limit": 100
},
{
"Endpoint": "*",
"Period": "12h",
"Limit": 1000
},
{
"Endpoint": "*",
"Period": "7d",
"Limit": 10000
}
]
}

First, if you plan to rate limit specific endpoints differently from one another, you'll need to set EnableEndpointRateLimiting to true; it's false by default.

StackBlockedRequests controls whether blocked requests count toward the limit. With it turned off, someone hammering the API will still get X successful responses per time period, no matter how many extra requests they make. With it turned on, blocked requests also increment the counter, so they'll quickly burn through their allowance and then receive nothing until the period resets.

RealIpHeader and ClientIdHeader are used in the common setup where your server sits behind a reverse proxy. Because the proxy is the machine actually making the requests, it sets a header containing the real client's IP address or ID. If that header isn't configured correctly, the rate limiter will see every request as coming from the same address, so check how your proxy is set up.
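For instance, if your proxy forwards the original client address in an X-Forwarded-For header rather than X-Real-IP, you'd point the limiter at that header instead (a sketch; use whichever header your proxy actually sets):

"IpRateLimiting": {
  "RealIpHeader": "X-Forwarded-For"
}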

Next come the three whitelists: one each for IP addresses, client IDs, and endpoints. Anything on a whitelist is never rate limited; if you don't need them, you can remove them.

Finally, you configure the rules themselves: each entry specifies an endpoint, a period, and a limit. If EnableEndpointRateLimiting is false, only the wildcard (*) endpoint will actually take effect. Otherwise, endpoints are written as {HTTP_VERB}{PATH}, and wildcards are allowed; for example, *:/api/values matches GET, POST, and every other verb sent to /api/values.
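For instance, with EnableEndpointRateLimiting turned on, a rule set that throttles one endpoint more aggressively than the rest might look like this (the specific path and numbers are just an illustration):

"GeneralRules": [
  {
    "Endpoint": "post:/api/values",
    "Period": "10s",
    "Limit": 2
  },
  {
    "Endpoint": "*",
    "Period": "1s",
    "Limit": 10
  }
]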

Make sure your endpoint pattern refers to a file rather than a directory. In my case, *:/download/*/*/ was not a valid endpoint because of the trailing slash, but *:/download/*/* was.

If you're testing locally, remember to comment out the localhost entries in the IP whitelist from the default configuration above, or your own requests will never be limited. You can then verify the setup by setting a very low limit, such as 5 requests per minute, and firing off more requests than that. Once you hit the limit you should see the "API calls quota exceeded" error, which means everything is working as intended.
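A throwaway rule for that kind of test might look like the following; remove it (and restore the whitelist) once you're done:

{
  "Endpoint": "*",
  "Period": "1m",
  "Limit": 5
}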