I’ve been using queues a lot lately in my front-end code. They’ve been great for breaking apart complex actions and providing some stability to volatile operations.
A real-life example from G2Crowd is in our metrics gathering code. When a user interacts with our interface, we send ajax calls to our analytics system. These metrics are used to track adoption of our features, and help us respond to problem areas in the application.
This can be tricky, because the events that we are tracking are pretty volatile. For instance, when a user clicks on a link, the browser will tear down the current page and cancel any active ajax requests. As a result, we end up dropping quite a few data points.
Enter the queue. Instead of directly running our function when the user clicks on the link, we can instead push some data into a list that will be processed later. Not only is this a great tool for pushing complex interactions into the background, it is also good for dealing with unexpected failures and timeouts.
With a little bit of planning, our queue can even be serialized into LocalStorage or Cookies. This means that our code will work across page refreshes!
So let’s start with the queue. A queue is simply a first-in-first-out list, so we’ll just wrap up an array with an ‘enqueue’ method, a ‘dequeue’ method, and a few hooks for convenience:
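A minimal sketch of such a queue follows. The factory name `createQueue` and the hook names `onEnqueue`/`onDequeue` are illustrative choices, not a published API:

```javascript
// A first-in-first-out list: an array wrapped with enqueue/dequeue,
// plus hooks that fire whenever the list changes.
function createQueue() {
  var items = [];

  return {
    enqueue: function (item) {
      items.push(item);
      if (this.onEnqueue) { this.onEnqueue(item); }
    },
    dequeue: function () {
      var item = items.shift();
      if (this.onDequeue) { this.onDequeue(item); }
      return item;
    },
    peek: function () { return items[0]; },       // look without removing
    length: function () { return items.length; },
    // Convenience hooks: assign functions here to be notified of changes.
    onEnqueue: null,
    onDequeue: null
  };
}
```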
This basic queue is pretty easy to use:
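For instance (with the hypothetical `createQueue` factory repeated inline so the example stands on its own):

```javascript
// Minimal queue, inlined so this usage example runs by itself.
function createQueue() {
  var items = [];
  return {
    enqueue: function (item) {
      items.push(item);
      if (this.onEnqueue) { this.onEnqueue(item); }
    },
    dequeue: function () { return items.shift(); },
    length: function () { return items.length; },
    onEnqueue: null
  };
}

var greetings = createQueue();
greetings.onEnqueue = function (name) {
  console.log('queued a greeting for ' + name);
};

greetings.enqueue('Alice');   // logs: queued a greeting for Alice
greetings.enqueue('Bob');     // logs: queued a greeting for Bob

greetings.dequeue();          // => 'Alice'
greetings.length();           // => 1
```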
The Queue Manager
So now we have our list; next we can create some handlers to process it. Let’s build a queue manager that will deal with each item in the list in order.
This module is a bit longer than the previous one, and may be a little daunting. After the full listing, I will break it down function by function.
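Here is a sketch of the full module, reconstructed from the description that follows. The hook names (‘onFailure’, ‘onFlushEnd’) and the exact signatures are assumptions, so treat this as one plausible shape rather than the production code:

```javascript
function queueManager(queue, callback) {
  var flushing = false;   // are we currently draining the queue?
  var failures = 0;       // failed attempts for the current item

  var api = {
    flush: flush,
    onFailure: null,   // hook: called with the item and its failure count
    onFlushEnd: null   // hook: called once the queue is drained
  };

  function flush() {
    if (flushing) { return; }   // safe to call repeatedly
    flushing = true;
    process();
  }

  function process() {
    if (queue.length() > 0) {
      // Hand the callback the current item plus the functions it
      // needs to advance or retry the queue.
      callback(queue.peek(), next, fail);
    } else {
      flushing = false;         // nothing left: stop and clean up
      if (api.onFlushEnd) { api.onFlushEnd(); }
    }
  }

  function next() {
    failures = 0;
    queue.dequeue();            // the current item is finished
    process();                  // move on to the next one
  }

  function fail() {
    failures += 1;
    if (api.onFailure) { api.onFailure(queue.peek(), failures); }
    process();                  // the item was not removed, so retry it
  }

  // Start flushing whenever something is enqueued, taking care not to
  // overwrite a previously assigned onEnqueue handler.
  var previousOnEnqueue = queue.onEnqueue;
  queue.onEnqueue = function (item) {
    if (previousOnEnqueue) { previousOnEnqueue(item); }
    flush();
  };

  return api;
}
```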
First, the module takes two parameters: a queue that it will be managing, and a callback function for processing each individual item in the list:
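In outline (internals elided for now):

```javascript
function queueManager(queue, callback) {
  var flushing = false;  // closure state shared by the functions below

  // ... flush, process, next, and fail are defined in this closure ...
}
```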
The ‘flush’ function is what we call to start clearing out the queue. We want to make sure that we can always call this function safely, so we use a flag to avoid flushing multiple times:
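Sketched in isolation (with ‘process’ stubbed out so the fragment runs on its own), the flag pattern looks like this:

```javascript
var flushing = false;
var started = 0;
function process() { started += 1; }   // stand-in for the real worker

function flush() {
  if (flushing) { return; }   // already draining; extra calls are no-ops
  flushing = true;
  process();
}
```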
The ‘process’ function is what does most of the work for us. If there are any items in the list, then we call our handling callback with everything needed to process the current item. Otherwise we stop flushing and clean up.
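A standalone sketch of that logic, with the queue and callback stubbed:

```javascript
var flushing = true;
var items = ['event'];
var queue = {
  length: function () { return items.length; },
  peek: function () { return items[0]; }
};
var handled = [];
function next() {}   // no-op stubs for this fragment
function fail() {}
function callback(item, next, fail) { handled.push(item); }

function process() {
  if (queue.length() > 0) {
    // give the callback the current item and the means to advance or retry
    callback(queue.peek(), next, fail);
  } else {
    flushing = false;   // queue empty: stop flushing and clean up
  }
}
```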
The ‘next’ function is called to advance the queue forward, and start working on the next item. We pass this function to our callback, to give us a way to complete items asynchronously.
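In fragment form (queue and loop stubbed so it runs alone):

```javascript
var items = ['done', 'pending'];
var queue = { dequeue: function () { return items.shift(); } };
var processCalls = 0;
function process() { processCalls += 1; }   // stand-in for the real loop

function next() {
  queue.dequeue();   // the current item is finished; discard it
  process();         // begin working on whatever is now in front
}
```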
The ‘fail’ function will simply try to process the current item again. Notice that we keep count of failures, and pass the number to our hooks. This gives us an easy way to manage bad queue values if we need to later. This function is also passed to our handler callback.
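A standalone sketch (hooks and the processing loop are stubbed; the hook name is an assumption):

```javascript
var failures = 0;      // count of failed attempts for the current item
var reported = [];
var api = { onFailure: function (count) { reported.push(count); } };
function process() {}  // stand-in: would hand the item back to the callback

function fail() {
  failures += 1;
  if (api.onFailure) { api.onFailure(failures); }  // let hooks see the count
  process();   // the item was never dequeued, so this retries it
}
```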
In order to kick off our queue processing, we use the ‘onEnqueue’ hook to begin flushing as soon as any item is added to our queue. Notice that we are careful to avoid overwriting any previous onEnqueue handlers that have been assigned:
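The chaining trick, shown standalone:

```javascript
var calls = [];
var queue = {
  onEnqueue: function () { calls.push('previous handler'); }
};
function flush() { calls.push('flush'); }

// Chain rather than replace: remember the old handler and call it first.
var previousOnEnqueue = queue.onEnqueue;
queue.onEnqueue = function (item) {
  if (previousOnEnqueue) { previousOnEnqueue(item); }
  flush();
};
```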
And finally, we return our public interface. We provide direct access to the flush call, as well as a few hooks:
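Something like the following (hook names are assumptions):

```javascript
function queueManager(queue, callback) {
  function flush() { /* drain the queue, as described above */ }

  // Direct access to flush, plus hooks the consumer can assign.
  return {
    flush: flush,
    onFailure: null,   // hook: called when an item fails
    onFlushEnd: null   // hook: called when the queue is drained
  };
}
```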
Now that we’ve walked through the code piecemeal, it should be relatively easy to see how to consume it. We simply pass in our queue, and call the ‘next’ or ‘fail’ functions within our callback to advance the queue forward:
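For example (compact copies of the queue and manager are inlined so the snippet is self-contained, and ‘work’ is a hypothetical unit of work):

```javascript
// Compact queue and manager, repeated here so this example runs alone.
function createQueue() {
  var items = [];
  return {
    enqueue: function (i) { items.push(i); if (this.onEnqueue) { this.onEnqueue(i); } },
    dequeue: function () { return items.shift(); },
    peek: function () { return items[0]; },
    length: function () { return items.length; },
    onEnqueue: null
  };
}

function queueManager(queue, callback) {
  var flushing = false;
  function flush() { if (!flushing) { flushing = true; process(); } }
  function process() {
    if (queue.length() > 0) { callback(queue.peek(), next, fail); }
    else { flushing = false; }
  }
  function next() { queue.dequeue(); process(); }
  function fail() { process(); }
  var prev = queue.onEnqueue;
  queue.onEnqueue = function (i) { if (prev) { prev(i); } flush(); };
  return { flush: flush };
}

// The consuming side: call next() on success, fail() to retry.
var attempts = { flaky: 0 };
function work(job) {                 // hypothetical unit of work
  if (job === 'flaky') {
    attempts.flaky += 1;
    if (attempts.flaky < 2) { throw new Error('transient error'); }
  }
}

var jobs = createQueue();
queueManager(jobs, function (job, next, fail) {
  try {
    work(job);
    next();    // finished: advance to the following item
  } catch (e) {
    fail();    // problem: leave the item in place and retry
  }
});

jobs.enqueue('steady');
jobs.enqueue('flaky');   // fails once, then succeeds on retry
```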
Using the Queue to send AJAX Requests
Now let’s get back to our analytics logging. Let’s say that our analytics code just sends an ajax request to an endpoint:
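In outline, something like the following, where ‘sendRequest’ stands in for whatever ajax helper is in use (jQuery’s $.ajax, for example) and the endpoint URL is hypothetical:

```javascript
// Stubbed transport so the sketch runs anywhere; in the browser this
// would delegate to something like $.ajax(settings).
function sendRequest(settings) {
  sendRequest.last = settings;
}

function recordEvent(eventData) {
  sendRequest({
    url: '/analytics/events',   // hypothetical endpoint
    type: 'POST',
    data: eventData
  });
}
```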
With our current queue manager, we can create an analyticsEvents queue to hold all events that we want to record. Our manager will be able to send the ajax for us, and will also guarantee that each ajax call finishes successfully. In order to do this, all we have to do is pass the ‘next’ and ‘fail’ functions directly to the ajax handlers:
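The wiring might look like this. The transport is stubbed and compact copies of the queue and manager are inlined so the snippet runs on its own; the key lines are handing ‘next’ and ‘fail’ straight to the success and error handlers:

```javascript
// Compact queue and manager, inlined so this example is self-contained.
function createQueue() {
  var items = [];
  return {
    enqueue: function (i) { items.push(i); if (this.onEnqueue) { this.onEnqueue(i); } },
    dequeue: function () { return items.shift(); },
    peek: function () { return items[0]; },
    length: function () { return items.length; },
    onEnqueue: null
  };
}

function queueManager(queue, callback) {
  var flushing = false;
  function flush() { if (!flushing) { flushing = true; process(); } }
  function process() {
    if (queue.length() > 0) { callback(queue.peek(), next, fail); }
    else { flushing = false; }
  }
  function next() { queue.dequeue(); process(); }
  function fail() { process(); }
  var prev = queue.onEnqueue;
  queue.onEnqueue = function (i) { if (prev) { prev(i); } flush(); };
  return { flush: flush };
}

// Stubbed ajax that always succeeds; in the browser this would be $.ajax.
var sent = [];
function ajax(settings) {
  sent.push(settings.data);
  settings.success();
}

var analyticsEvents = createQueue();
queueManager(analyticsEvents, function (eventData, next, fail) {
  ajax({
    url: '/analytics/events',   // hypothetical endpoint
    type: 'POST',
    data: eventData,
    success: next,   // recorded: advance to the next event
    error: fail      // failed or canceled: try this event again
  });
});

function logEvent(eventData) {
  analyticsEvents.enqueue(eventData);
}

logEvent({ name: 'clicked-link' });
logEvent({ name: 'opened-menu' });
```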
Now our logEvents function simply pushes items into the queue. If the ajax call fails for whatever reason, we’ll just try again, and once we’ve completed logging an event we’ll move on to the next one.
Storing the Queue in LocalStorage
So what happens if the page is refreshed? As the code currently stands, we would lose our queue, as well as any data points that we had hoped to capture.
Thanks to LocalStorage, this is very easy to deal with. We can use the hooks on our queue to save a copy of the list every time it is modified. The queue will be loaded from LocalStorage when the page is ready:
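A sketch of that persistence layer. The function and key names are invented, and the storage object is injectable so the same code can be exercised outside the browser, where it would default to window.localStorage:

```javascript
function createPersistentQueue(key, storage) {
  // In the browser, default to localStorage.
  storage = storage || (typeof localStorage !== 'undefined' ? localStorage : null);

  // Load whatever survived the last page view.
  var saved = storage && storage.getItem(key);
  var items = saved ? JSON.parse(saved) : [];

  function save() {
    if (storage) { storage.setItem(key, JSON.stringify(items)); }
  }

  return {
    enqueue: function (item) {
      items.push(item);
      save();                                  // persist on every change
      if (this.onEnqueue) { this.onEnqueue(item); }
    },
    dequeue: function () {
      var item = items.shift();
      save();
      return item;
    },
    peek: function () { return items[0]; },
    length: function () { return items.length; },
    onEnqueue: null
  };
}
```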
So now we can pass this persistent version of our queue into the existing queueManager object:
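In sketch form, with compact stand-ins for the persistent queue and manager described above inlined so the wiring runs on its own. Note the explicit flush on page load: items restored from storage never fire onEnqueue, so we kick the manager once to resume any leftover work:

```javascript
// Compact stand-ins for the persistent queue and manager.
function createPersistentQueue(key, storage) {
  var saved = storage.getItem(key);
  var items = saved ? JSON.parse(saved) : [];
  function save() { storage.setItem(key, JSON.stringify(items)); }
  return {
    enqueue: function (i) { items.push(i); save(); if (this.onEnqueue) { this.onEnqueue(i); } },
    dequeue: function () { var i = items.shift(); save(); return i; },
    peek: function () { return items[0]; },
    length: function () { return items.length; },
    onEnqueue: null
  };
}

function queueManager(queue, callback) {
  var flushing = false;
  function flush() { if (!flushing) { flushing = true; process(); } }
  function process() {
    if (queue.length() > 0) { callback(queue.peek(), next, fail); }
    else { flushing = false; }
  }
  function next() { queue.dequeue(); process(); }
  function fail() { process(); }
  var prev = queue.onEnqueue;
  queue.onEnqueue = function (i) { if (prev) { prev(i); } flush(); };
  return { flush: flush };
}

// A fake localStorage seeded with a leftover event from a previous
// page view; in the browser, window.localStorage would be used.
var data = { 'analytics-events': JSON.stringify([{ name: 'leftover' }]) };
var storage = {
  getItem: function (k) { return k in data ? data[k] : null; },
  setItem: function (k, v) { data[k] = v; }
};

var sent = [];
function ajax(settings) { sent.push(settings.data); settings.success(); }

var analyticsEvents = createPersistentQueue('analytics-events', storage);
var manager = queueManager(analyticsEvents, function (eventData, next, fail) {
  ajax({ url: '/analytics/events', data: eventData, success: next, error: fail });
});

// Restored items never fired onEnqueue, so flush once on page load.
manager.flush();
```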
If the ajax call fails or gets canceled, then it will not be removed from the list. If the user refreshes the page, our code will retrieve the stored queue from localStorage and try again.
A note about using LocalStorage directly like this: it will fail in some browsers. We use a very simple polyfill on our site to improve compatibility.
Where to from Here?
There are a few additions that could clean up our queue manager even further. In my production version of this code, I set up a timeout that calls ‘fail()’ if an item takes too long to process. We also set up a retry limit, so that after X number of failures we continue on to the next item. Finally, we catch exceptions from processing the queue directly in our manager, so that a single bad item won’t sabotage our entire list of commands.
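One of those refinements, the retry limit, can be sketched in isolation (the limit and surrounding stubs are illustrative):

```javascript
var RETRY_LIMIT = 3;
var failures = 0;
var outcomes = [];
function next() { failures = 0; outcomes.push('skipped'); }  // stand-ins
function process() { outcomes.push('retry'); }

function fail() {
  failures += 1;
  if (failures >= RETRY_LIMIT) {
    next();      // too many failures: give up on this item and advance
  } else {
    process();   // otherwise, try the same item again
  }
}
```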
The beauty of the queue is that these relatively complex interactions are split off into the queue manager. This makes them easy to test and reason about directly, without becoming tangled up with our business logic.