
Short Walks — Object Locking in C#


Read this short, to-the-point article about how to implement a thread locking mechanism that doesn't queue.


While playing with Azure Event Hubs, I decided that I wanted to implement a thread locking mechanism that didn't queue. That is, I wanted to try to get a lock on the resource and, if it was currently in use, just forget it and move on. The default behavior in C# is to wait for the resource. For example, consider my method:

static async Task MyProcedure()
{
    Console.WriteLine($"Test1 {DateTime.Now}");
    await Task.Delay(5000);
    Console.WriteLine($"Test2 {DateTime.Now}");
}

I could execute this several times in parallel like so (note that Parallel.For's upper bound is exclusive, so this actually runs four iterations):

static async Task Main(string[] args)
{
    Parallel.For(1, 5, (a) =>
    {
        MyProcedure();
    });

    Console.ReadLine();
}

If I wanted to lock this (just bear with me and assume that makes sense for a minute), I might do this:

private static object _lock = new object();        

static async Task Main(string[] args)
{
    Parallel.For(1, 5, (a) =>
    {
        //MyProcedure();
        Lock();
    });

    Console.ReadLine();
}

static void Lock()
{
    Task.Run(() =>
    {
        lock (_lock)
        {
            MyProcedure().GetAwaiter().GetResult();
        }
    });
}

I re-jigged the code a bit because you can't await inside a lock statement, and obviously, just making the method call synchronous would not be locking the asynchronous call.

So now, I've successfully made my asynchronous method synchronous. Each execution of "MyProcedure" will happen sequentially, and that's because "lock" queues the locking calls behind one another.
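
For context, the lock statement itself is just compiler shorthand for Monitor.Enter/Monitor.Exit wrapped in a try/finally, and that's where the queuing comes from. Here's a rough sketch of what the Lock method above expands to (the method name is mine, purely illustrative):

static void LockExpanded()
{
    Task.Run(() =>
    {
        // Roughly what the compiler emits for lock (_lock) { ... }
        bool lockTaken = false;
        try
        {
            // Monitor.Enter blocks until the lock is free, so callers queue up here
            Monitor.Enter(_lock, ref lockTaken);
            MyProcedure().GetAwaiter().GetResult();
        }
        finally
        {
            if (lockTaken)
            {
                Monitor.Exit(_lock);
            }
        }
    });
}

Keep that Monitor pattern in mind; the alternative further down simply swaps Enter for TryEnter.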

However, imagine the Event Hub scenario that's referenced in the post above. I have, for example, a game, and it's sending a large volume of telemetry up to the cloud. In my particular case, I'm sending a player's current position. If I have a locking mechanism whereby the locks are queued, then I could potentially get behind; and if that happens, then, at best, the data sent to the cloud will be outdated and, at worst, it will use up game resources, potentially causing lag.

After a bit of research, I found an alternative:

private static object _lock = new object();        

static async Task Main(string[] args)
{
    Parallel.For(1, 5, (a) =>
    {
        //MyProcedure();
        //Lock();
        TestTryEnter();
    });

    Console.ReadLine();
}

static void TestTryEnter()
{
    Task.Run(() =>
    {
        bool lockTaken = false;

        try
        {
            // Try to take the lock, but give up immediately (zero timeout) if it's already held
            Monitor.TryEnter(_lock, 0, ref lockTaken);

            if (lockTaken)
            {
                // Monitor locks are thread-affine, so we can't await while holding one;
                // block on the call instead, as in Lock() above
                MyProcedure().GetAwaiter().GetResult();
            }
            else
            {
                Console.WriteLine("Could not get lock");
            }
        }
        finally
        {
            if (lockTaken)
            {
                Monitor.Exit(_lock);
            }
        }
    });
}

Here, I try to get the lock, and if the resource is already locked, I simply give up and go home. (Note that, as in the Lock method earlier, the work runs synchronously while the lock is held: Monitor is thread-affine, so you can't safely await between TryEnter and Exit.) There are obviously a very limited number of uses for this; however, my Event Hub scenario, described above, is one of them. Depending on the type of data that you're transmitting, it may make much more sense to have a go and, if you're in the middle of another call, simply abandon the current one.
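
If you'd rather stay fully asynchronous instead of blocking while the lock is held, a SemaphoreSlim with a zero timeout gives the same try-and-give-up semantics and, unlike Monitor, isn't tied to a particular thread. A minimal sketch of that variation (the field and method names here are mine, not from the code above):

private static readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);

static async Task TestTryEnterAsync()
{
    // WaitAsync(0) returns immediately: true if we took the single slot, false if it's in use
    if (!await _semaphore.WaitAsync(0))
    {
        Console.WriteLine("Could not get lock");
        return;
    }

    try
    {
        // SemaphoreSlim isn't thread-affine, so awaiting here is safe
        await MyProcedure();
    }
    finally
    {
        _semaphore.Release();
    }
}

The trade-off is that SemaphoreSlim needs an explicit Release, but it slots straight into async code without the GetAwaiter().GetResult() workaround.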



