Throttling Concurrent Outgoing HTTP Requests in .NET Core
Use a semaphore in C# to limit the maximum concurrent tasks

In my last post, we implemented rate limiting for an API endpoint on the server side. Server-side rate limiting helps API providers protect system performance and enforce business rules. On the client side, API consumers should throttle their concurrent HTTP requests to comply with the endpoints' rate limits and to moderate their use of client-side resources.
This post will go over how to make concurrent outgoing HTTP requests on the client side. The goal is to let the HttpClient send concurrent requests at the maximum rate the server allows, for example, a maximum of 2 requests per second.
We will use a semaphore in C# to limit the maximum number of concurrent tasks. The demo project is a .NET Core console application, which takes advantage of the native dependency injection (DI) system and the HttpClient factory features introduced in .NET Core 2.1. The full project is located in this GitHub repository.
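Before diving into the app, here is the core idea in a minimal, self-contained sketch. A SemaphoreSlim initialized with a count of 2 lets at most two tasks run at once; every additional task awaits until a slot is released. The Task.Delay below is a stand-in for the real HTTP call:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public static class Throttler
{
    // Allow at most 2 tasks to run concurrently.
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(2);

    public static async Task<int> RunAsync(int id)
    {
        await Gate.WaitAsync();      // waits here while 2 tasks are already running
        try
        {
            await Task.Delay(100);   // stand-in for an outgoing HTTP request
            return id;
        }
        finally
        {
            Gate.Release();          // free a slot for the next waiting task
        }
    }

    public static async Task Main()
    {
        // Launch 6 tasks; the semaphore ensures only 2 are in flight at a time.
        var results = await Task.WhenAll(Enumerable.Range(1, 6).Select(RunAsync));
        Console.WriteLine(string.Join(",", results)); // prints "1,2,3,4,5,6"
    }
}
```

Note that Task.WhenAll preserves the input order of its tasks regardless of the order in which they complete, and that Release() is called in a finally block so a faulted request cannot permanently consume a slot.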
Setting Up a .NET Core Console App with a Typed HttpClient
For demo purposes, we are going to create a small .NET Core console app. Two NuGet packages are necessary: Microsoft.Extensions.DependencyInjection and Microsoft.Extensions.Http. The first one allows us to take advantage of the DI system, and the second one provides the HttpClient factory.
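From the project directory, both packages can be added with the .NET CLI:

```shell
dotnet add package Microsoft.Extensions.DependencyInjection
dotnet add package Microsoft.Extensions.Http
```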
We modify the Main() method as below. Notice that this Main() method is async and returns a Task, which requires configuring the project to build with C# 7.1 or later.
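The original gist is in the GitHub repository; a rough reconstruction of the Main() method described below might look like the following (the line numbers in the next paragraph refer to the original gist, not this sketch, and ThrottledHttpClient is the implementation covered in the next section):

```csharp
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;

public class Program
{
    public static async Task Main(string[] args)
    {
        // Register dependencies and build a service provider.
        var serviceProvider = new ServiceCollection()
            .AddHttpClient<IThrottledHttpClient, ThrottledHttpClient>()
            .Services
            .BuildServiceProvider();

        // Resolve an instance of the typed HttpClient.
        var client = serviceProvider.GetRequiredService<IThrottledHttpClient>();

        // Kick off the throttled, concurrent requests
        // (GetProductsAsync is a hypothetical method name for illustration).
        await client.GetProductsAsync();
    }
}
```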
Lines 3 to 5 register all dependencies and build a service provider, which we can then use to resolve the services we need. Line 7 produces an instance of a typed HttpClient, IThrottledHttpClient, which will be implemented in the next section. Line 9 then instructs the…