The deferred Promise antipattern

What it is and why you should avoid it

Have you ever heard of deferred promises? If you are a seasoned JavaScript developer like me, I bet you have. If not, then you're lucky!

Back when I had to migrate a huge codebase from Q promises to ES6 promises, I found that it made extensive use of Q.defer. If I had to give you a definition of a deferred promise, I would say:

A deferred promise is a promise that exposes its resolve and reject methods externally.

Let's see this definition in practice, step by step.

The rabbit hole to deferred promises

A typical use of an ES6 Promise is the following:

const rollCritsOnly = () => new Promise((res, rej) => {
    const d20Roll = Math.floor(Math.random() * 20) + 1;
    return d20Roll === 20 
        ? res("Critical hit!")
        : rej("Dungeon Master is laughing at you!");
});

const roll1 = await rollCritsOnly();
const roll2 = await rollCritsOnly();

Each roll creates a new Promise and awaits it. Once the await is over, the Promise object can be garbage collected, as there are no longer any references to it. But what happens when I use a constant value instead of an arrow function?

// that is a single promise object!
const rollCritsOnly = new Promise((res, rej) => {
    const d20Roll = Math.floor(Math.random() * 20) + 1;
    return d20Roll === 20 
        ? res("Critical hit!")
        : rej("Dungeon Master is laughing at you!");
});

const roll1 = await rollCritsOnly;
const roll2 = await rollCritsOnly;

That's easy: the executor runs only once, when the Promise is created, so the die is rolled a single time. What is worth noticing is that the promise keeps returning the same settlement value on every await! This is an interesting property and it is by design, as stated in this excerpt from this MDN page:

These [ .then ] callbacks will be invoked even if they were added after the success or failure of the asynchronous operation that the promise represents.
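A minimal sketch of that retention, with names of my own making: a callback attached a full second after the promise has settled still receives the original value.

const settled = Promise.resolve("Critical hit!");

// attached well after settlement, yet it still gets the value
setTimeout(() => {
    settled.then((value) => console.log(value)); // "Critical hit!"
}, 1000);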

Now, let's take things one step further:

// file: diceRoller.js
function newDeferredPromise() {
  let resolve, reject;
  const promise = new Promise((res, rej) => {
    [resolve, reject] = [res, rej];
  });
  return {promise, reject, resolve};
}

export const deferred = newDeferredPromise();

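// note: the roll below happens exactly once, when this module is first imported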
const d20Roll = Math.floor(Math.random() * 20) + 1;
if (d20Roll === 20) {
    deferred.resolve("Critical hit!")
} else {
    deferred.reject("Dungeon Master is laughing at you!")
}

// file: client.js
const rollResult = await deferred.promise;

Because of the property we just described, this code works as before. Every other client awaiting deferred.promise will settle with the one and only roll result ever made. We have simply externalized the resolve and reject functions.

But...is it good or bad?

Even though this is technically possible, we should evaluate how much worse our code got. First, it became more complicated: it is harder to shape the flow in our minds. Try it yourself.

Second, newcomers may not be aware that a promise retains its settlement value, and may reach for the deferred antipattern without paying much attention, expecting a new roll on every await. It is like treating promises as settlement emitters or, at best, as settlement value lockers, while they are just...promises. I give a task to a friend and he promises me he will settle it, either with a success or a failure. Once settled, no one will ever talk about it again. If I need another task done, I will ask this friend for another promise.
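A quick, hypothetical sketch of that trap, reusing the diceRoller module above and assuming the single roll happened to be a crit:

import { deferred } from "./diceRoller.js";

const first = await deferred.promise;  // "Critical hit!"
const second = await deferred.promise; // same "Critical hit!" again, no reroll
// (had the single roll failed, both awaits would reject with the same reason)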

And...how can we leverage deferred?

Please don't leverage them. ES6 promises freed us from the cursed deferred antipattern promoted by Q. But it is important to understand the problems it was supposed to solve.

The main reason I saw it used was to adapt a pub-sub API to the promise programming model. Suppose you have an event emitter that invokes a callback when data is downloaded. For now, let's assume data can be downloaded only once, say on boot.

// file: low-level.js
const deferred = newDeferredPromise(); // or Q.defer()
function boot() {
    dataEmitter.onDataDownloaded((data) => {
        // suppose data is downloaded only once
        deferred.resolve(data)
    });
}

You may have some async code that waits until data is downloaded:

// file: low-level.js
async function getDownloadedData() {
    const myData = await deferred.promise;
    // could be a validation or whatever
    const manipulatedData = manipulate(myData);
    return manipulatedData;
}

// file: client.js
boot();
const data = await getDownloadedData()

Here it is, pub-sub turned into the promise model! Nice! No, not really. If we drop our assumption and let onDataDownloaded fire more than once, the subsequent events won't trigger any new promise resolution, and probably no one would be awaiting one anyway. This is even more painful if onDataDownloaded is usually called once, but in case of a server restart or some other drastic event the download can be triggered again. Everything works just fine, except when it doesn't. Welcome to hell, cowboy!
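The root cause is that a promise can settle only once: every later call to resolve or reject is silently ignored. A tiny sketch, using the newDeferredPromise helper from before:

const d = newDeferredPromise();
d.resolve("first download");
d.resolve("second download"); // ignored, the promise is already settled

console.log(await d.promise); // "first download"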

A client disaster recovery could look like this:

// file: client.js
boot();
const data = await getDownloadedData();
initSystemData(data);
// fired when server is online again
service.onServerOnline(async () => {
    // ⛔ this will resolve with old data
    const data = await getDownloadedData();
    initSystemData(data);
});

We could also think of something fancier and more complex, for example restarting the whole low-level machinery. But that is not always possible, and honestly it would just be an attempt to work around the real problem: the deferred.

The solution I came up with was to use standard ES6 promises as they are meant to be used, that is, just like they were...promises, and to lean on an event emitter to help with disaster recovery.

// file: low-level.js
import { EventEmitter } from "node:events"; // assuming Node's built-in emitter

const helperEmitter = new EventEmitter();
function boot() {
    dataEmitter.onDataDownloaded((data) => {
        // re-emit every download, not only the first one
        helperEmitter.emit("downloaded", data);
    });
}

function whenDownloaded() {
    // a fresh promise per call, settled by the next "downloaded" event
    return new Promise((res) => {
        helperEmitter.once("downloaded", res);
    });
}

// file: client.js
boot();
const data = await whenDownloaded();
initSystemData(data);
// fired when server is online again
service.onServerOnline(async () => {
    // 👌 this will return fresh data every time a download is triggered
    const data = await whenDownloaded();
    initSystemData(data);
});
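As a side note, if helperEmitter really is a Node.js EventEmitter, the built-in once helper from node:events gives you the same one-shot promise without hand-rolling whenDownloaded. A sketch under that assumption:

import { once } from "node:events";

// drop-in alternative to the hand-rolled whenDownloaded above
async function whenDownloaded() {
    // once() resolves with the array of arguments passed to emit
    const [data] = await once(helperEmitter, "downloaded");
    return data;
}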

Conclusions

I remember it was quite an exciting journey, and in the end the whole migration made our system more reliable. Fortunately, I don't see deferreds anymore, and hopefully this damn thing got lost in the sands of time. Nonetheless, it is still technically possible, so it is worth knowing what it is and how much it can hurt.

So, stay true to ES6 promises and don't try to use them in a fancy way, or they will bite back.
