
The Case of the Queue in the Pool

Here's a neat example of a bug that weaseled its way into a connection pool, and the effect it had.


Recently we found a bug in the way our connection pool works. The code in question?

public HttpClient GetClient(TimeSpan timeout, OperationCredentials credentials, Func<HttpMessageHandler> handlerFactory)
{
    var key = new HttpClientCacheKey(timeout, credentials);
    var queue = cache.GetOrAdd(key, i => new ConcurrentQueue<Tuple<long, HttpClient>>());

    Tuple<long, HttpClient> client;
    while (queue.TryDequeue(out client))
    {
        // Dispose clients that have been sitting idle for too long.
        if (Stopwatch.GetTimestamp() - client.Item1 >= _maxIdleTimeInStopwatchTicks)
        {
            client.Item2.Dispose();
            continue;
        }

        // Found a fresh pooled client - reset it and hand it out.
        client.Item2.CancelPendingRequests();
        client.Item2.DefaultRequestHeaders.Clear();
        return client.Item2;
    }

    // Nothing usable in the pool - create a brand new client.
    return new HttpClient(handlerFactory())
    {
        Timeout = timeout
    };
}

public void ReleaseClient(HttpClient client, OperationCredentials credentials)
{
    var key = new HttpClientCacheKey(client.Timeout, credentials);
    var queue = cache.GetOrAdd(key, i => new ConcurrentQueue<Tuple<long, HttpClient>>());

    // Return the client to the pool, stamped with the time it was released.
    queue.Enqueue(Tuple.Create(Stopwatch.GetTimestamp(), client));
}

Can you see the bug?

I'll let you think about it for a while.

Done?

Okay, here we go!

The basic idea is that we'll keep a bunch of clients connected, and dispose of them if they have been idle for too long.
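For context, the two methods above are meant to be used as a rent/return pair around each request. A hypothetical caller (inside an async method; the pool instance, credentials, and URL here are stand-ins, not part of the original code) would look something like this:

var client = pool.GetClient(TimeSpan.FromSeconds(30), credentials, () => new HttpClientHandler());
try
{
    // Use the pooled client for a single operation...
    var response = await client.GetAsync("http://localhost:8080/some/endpoint");
    // ... and process the response here.
}
finally
{
    // Hand the client back so the next request can reuse the open connection.
    pool.ReleaseClient(client, credentials);
}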

The issue is with the use of the concurrent queue. Consider a sudden spike in traffic where we suddenly need 100 connections. This code will create them, and once the requests complete, put them all back in the pool.

Traffic then drops back to a normal level of only 20 concurrent requests. But because the pool is a queue, each request takes the oldest client from the front and returns it to the back with a fresh timestamp, so all 100 clients keep cycling through the queue. None of them is ever idle long enough to be expired, and all of them stay open.
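If it helps to see the numbers, here is a small, self-contained simulation of that scenario (the figures are made up: 100 pooled clients, 20 requests per tick, and a 30-tick idle timeout standing in for _maxIdleTimeInStopwatchTicks). It mimics the FIFO dequeue/enqueue cycle of the code above with plain tuples instead of HttpClient instances:

using System;
using System.Collections.Generic;
using System.Linq;

class QueuePoolSimulation
{
    static void Main()
    {
        const int poolSize = 100;       // connections created during the traffic spike
        const int requestsPerTick = 20; // steady-state concurrency afterwards
        const int maxIdleTicks = 30;    // stand-in for _maxIdleTimeInStopwatchTicks

        // Each entry is (clientId, lastUsedTick), mirroring Tuple<long, HttpClient>.
        var queue = new Queue<(int Id, int LastUsed)>(
            Enumerable.Range(0, poolSize).Select(id => (id, 0)));

        int disposed = 0, worstIdle = 0;

        for (int tick = 1; tick <= 1000; tick++)
        {
            var inUse = new List<(int Id, int LastUsed)>();

            // GetClient: take from the front of the queue, dropping anything too old.
            for (int r = 0; r < requestsPerTick; r++)
            {
                while (queue.Count > 0)
                {
                    var client = queue.Dequeue();
                    int idle = tick - client.LastUsed;
                    worstIdle = Math.Max(worstIdle, idle);

                    if (idle >= maxIdleTicks) { disposed++; continue; }

                    inUse.Add(client);
                    break;
                }
            }

            // ReleaseClient: put every client back at the tail with a fresh timestamp.
            foreach (var client in inUse)
                queue.Enqueue((client.Id, tick));
        }

        Console.WriteLine($"disposed: {disposed}, still pooled: {queue.Count}, worst idle gap: {worstIdle} ticks");
        // Prints: disposed: 0, still pooled: 100, worst idle gap: 5 ticks
    }
}

Even after a thousand ticks of light traffic, the worst idle gap any client ever accumulates is five ticks, so the expiry check in GetClient never fires and all 100 connections stay open.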


Published at DZone with permission of Oren Eini, DZone MVB. See the original article here.
