How to design a Priority Queue in C#

What is a Priority Queue?

To be a bit formal, a priority queue is an abstract data type (ADT), similar to a regular queue, but where each element has a priority associated with it. An element with high priority is served before an element with low priority. Elements that have the same priority are usually served in the order they were added, but that is not a formal requirement.

In other words, it’s like the queue to the fancy club downtown, where all of us regular people have to wait in line outside while the VIPs just walk right in.

Choosing the internal collection type

Since we are designing a queue, we want a method for enqueuing elements and a method for dequeuing elements. We also need to keep track of the elements in the queue somehow, and the straightforward way to do that is to use a collection type. One commonly used collection type that supports both adding and removing elements is the generic List<T> from the System.Collections.Generic namespace. Sounds like a good fit? Yes! Let’s write some code:

public class PriorityQueue<T>
{
  private readonly List<T> _pq = new List<T>();
}

Making sure enqueued elements can be prioritized

In order to know which element has the highest priority, we need to be able to compare elements against each other in some way. But how can we put such a restriction on the elements from within the PriorityQueue class? Fortunately, C# lets us put constraints on generic types (in our case, the generic type T). We can inform the compiler that objects of type T must be comparable to each other so that they can be ordered; in other words, T must implement IComparable<T> (part of the .NET framework). Let’s add that constraint to our PriorityQueue:

public class PriorityQueue<T> where T : IComparable<T>
{
  private readonly List<T> _pq = new List<T>();
}
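To make the constraint concrete, here is a minimal sketch of a type that could be stored in such a queue. SupportTicket and its fields are illustrative only, not part of the queue design; a lower Severity compares as smaller, which the min-heap ordering used later treats as higher priority:

```csharp
using System;

// Illustrative example type (not part of the queue itself):
// tickets compare by severity, where a lower number is more urgent.
public class SupportTicket : IComparable<SupportTicket>
{
    public int Severity { get; }
    public string Title { get; }

    public SupportTicket(int severity, string title)
    {
        Severity = severity;
        Title = title;
    }

    // CompareTo defines the ordering the priority queue will rely on.
    public int CompareTo(SupportTicket other)
    {
        if (other == null) return 1; // by convention, any instance sorts after null
        return Severity.CompareTo(other.Severity);
    }
}
```

With this in place, new PriorityQueue<SupportTicket>() compiles, while trying to use a type that does not implement IComparable<T> is rejected at compile time.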

Enqueuing and Dequeuing elements

A queue is useless if you can’t enqueue and dequeue items. It will get a bit tricky later, when we need to ensure elements end up in the right position with regard to priority, but let’s start with something simple. Let’s pretend that we don’t care about priority and just want a regular queue. That can be achieved by adding elements to the end of the List and removing them from the beginning:

public class PriorityQueue<T> where T : IComparable<T>
{
  private readonly List<T> _pq = new List<T>();
  
  public void Enqueue(T item)
  {
    _pq.Add(item);
  }

  public T Dequeue()
  {
    var item = _pq[0];
    _pq.RemoveAt(0);
  
    return item;
  }
}
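A quick sanity check of this intermediate version (assuming the PriorityQueue class exactly as written above is in scope) shows plain FIFO behavior, since nothing reorders the list yet:

```csharp
using System;

var queue = new PriorityQueue<int>();
queue.Enqueue(3);
queue.Enqueue(1);
queue.Enqueue(2);

// No prioritization yet: items come out in insertion order.
Console.WriteLine(queue.Dequeue()); // 3
Console.WriteLine(queue.Dequeue()); // 1
Console.WriteLine(queue.Dequeue()); // 2
```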

Cool, now we have a regular queue, but how do we ensure we always Dequeue the item with the highest priority? One way is to sort the list each time we add a new item:

public class PriorityQueue<T> where T : IComparable<T>
{
  private readonly List<T> _pq = new List<T>();
  
  public void Enqueue(T item)
  {
    _pq.Add(item);
    _pq.Sort();
  }

  public T Dequeue()
  {
    var item = _pq[0];
    _pq.RemoveAt(0);
    
    return item;
  }
}

This should work, and it’s a decent solution. However, sorting the List<T> every time an element is enqueued is not optimal. We can do better.

Making it scale

Now we have to dive into the realm of Computer Science. If concepts like Big O notation and binary heaps are just strange words to you, I recommend reading up on them first and then returning here. You can find an introduction to Big O notation here and a good explanation of binary min and max heaps here.

All ready to go? Great! With the solution above, each Enqueue costs O(n log n) because of the sort that runs after every addition (and Dequeue costs O(n), since RemoveAt(0) shifts every remaining element). However, if we keep the data in the List<T> ordered as a binary min-heap, both the Enqueue and Dequeue operations can be done in O(log n), which scales much better.
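The trick is that the heap lives implicitly inside the List<T>: index 0 is the root, and parent/child relationships are pure index arithmetic. A small sketch of the formulas the upcoming BubbleUp and SinkDown code relies on:

```csharp
using System;

// For an element stored at index i in the backing list:
static int Parent(int i) => (i - 1) / 2;      // integer division rounds down
static int LeftChild(int i) => 2 * i + 1;
static int RightChild(int i) => 2 * i + 2;

// Example: the root at index 0 has children at indices 1 and 2,
// whose own children sit at indices 3-4 and 5-6.
Console.WriteLine(Parent(4));     // 1
Console.WriteLine(LeftChild(1));  // 3
Console.WriteLine(RightChild(1)); // 4
```

Because the tree is complete (filled level by level, left to right), the list has no gaps and Add/RemoveAt at the end stay cheap.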

I will not explain in detail how the insert and delete operations work in a binary min-heap. You can find a good explanation, with fancy animations, by following the link above. Instead, let’s look at the resulting code:

public class PriorityQueue<T> where T : IComparable<T>
{
  private readonly List<T> _pq = new List<T>();

  public void Enqueue(T item)
  {
    _pq.Add(item);
    BubbleUp();
  }
  
  public T Dequeue()
  {
    if (_pq.Count == 0)
      throw new InvalidOperationException("The queue is empty");

    var item = _pq[0];
    MoveLastItemToTheTop();
    SinkDown();
    return item;
  }

  private void BubbleUp() // Implementation of the Min Heap Bubble Up operation
  {
    var childIndex = _pq.Count - 1;
    while (childIndex > 0)
    {
      var parentIndex = (childIndex - 1) / 2;
      if (_pq[childIndex].CompareTo(_pq[parentIndex]) >= 0)
        break;
      Swap(childIndex, parentIndex);
      childIndex = parentIndex;
    }
  }

  private void MoveLastItemToTheTop()
  {
    var lastIndex = _pq.Count - 1;
    _pq[0] = _pq[lastIndex];
    _pq.RemoveAt(lastIndex);
  }

  private void SinkDown() // Implementation of the Min Heap Sink Down operation
  {
    var lastIndex = _pq.Count - 1;
    var parentIndex = 0;
    
    while (true)
    {
      var firstChildIndex = parentIndex * 2 + 1;
      if (firstChildIndex > lastIndex)
      {
        break;
      }
      var secondChildIndex = firstChildIndex + 1;
      if (secondChildIndex <= lastIndex && _pq[secondChildIndex].CompareTo(_pq[firstChildIndex]) < 0)
      {
        firstChildIndex = secondChildIndex;
      }
      if (_pq[parentIndex].CompareTo(_pq[firstChildIndex]) < 0)
      {
        break;
      }
      Swap(parentIndex, firstChildIndex);
      parentIndex = firstChildIndex;
    }
  }

  private void Swap(int index1, int index2)
  {
    var tmp = _pq[index1];
    _pq[index1] = _pq[index2];
    _pq[index2] = tmp;
  }
}

There you have it! A fully working Priority Queue implementation in C# that scales.
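As a final check, here is a short usage example (assuming the finished PriorityQueue class above is in scope). Since int implements IComparable<int>, smaller numbers compare as smaller and therefore come out first:

```csharp
using System;

var pq = new PriorityQueue<int>();
pq.Enqueue(5);
pq.Enqueue(1);
pq.Enqueue(4);
pq.Enqueue(2);

// Elements are dequeued in priority (ascending) order,
// regardless of the order they were enqueued in.
Console.WriteLine(pq.Dequeue()); // 1
Console.WriteLine(pq.Dequeue()); // 2
Console.WriteLine(pq.Dequeue()); // 4
Console.WriteLine(pq.Dequeue()); // 5
```

If you need max-first ordering instead, wrap your elements in a type whose CompareTo reverses the comparison.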

You can find a very similar, though not quite identical, implementation on my GitHub page: https://github.com/Backhage/PriorityQueue