
Typescript decorators with inheritance


I am playing around with TypeScript decorators and they seem to behave quite differently from what I'd expect when used alongside class inheritance.

Suppose I have the following code:

class A {
    @f()
    propA;
}

class B extends A {
    @f()
    propB;
}

class C extends A {
    @f()
    propC;
}

function f() {
    return (target, key) => {
        if (!target.test) target.test = [];
        target.test.push(key);
    };
}

let b = new B();
let c = new C();

console.log(b['test'], c['test']);

Which outputs:

[ 'propA', 'propB', 'propC' ] [ 'propA', 'propB', 'propC' ]

Though I'd expect this:

[ 'propA', 'propB' ] [ 'propA', 'propC' ]

So, it seems that target.test is shared between A, B and C. My understanding of what is going on here is as follows:

  1. Since B extends A, new B() triggers the instantiation of A first, which triggers the evaluation of f for A. Since target.test is undefined, it is initialized.
  2. f is then evaluated for B, and since it extends A, A is instantiated first. So, at that time, target.test (target being B) references test defined for A. So, we push propB in it. At this point, things go as expected.
  3. Same as step 2, but for C. This time, when C evaluates the decorator, I would expect it to have a new object for test, different than that defined for B. But the log proves me wrong.

Can anyone explain why this is happening, and how I would implement f so that A and B have separate test properties?

I guess you'd call that an "instance specific" decorator?


Solution

  • Alright, so after spending a few hours playing around and searching the web, I got a working version. I don't fully understand why it works, so please forgive the lack of explanation.

    The key is to use Object.getOwnPropertyDescriptor(target, 'test') == null instead of !target.test to check for the presence of the test property.

    If you use:

    function f() {
        return (target, key) => {
            if (Object.getOwnPropertyDescriptor(target, 'test') == null) target.test = [];
            target.test.push(key);
        };
    }
    

    the console will show:

    [ 'propB' ] [ 'propC' ]
    

    Which is almost what I want. Now the array is specific to each class's prototype. But this means that 'propA' is missing from the array, since it is defined in A. Hence we need to access the parent target and get the property from there. That took me a while to figure out, but you can get it with Object.getPrototypeOf(target).
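    The difference between the two checks can be demonstrated on a bare prototype chain, independent of decorators (a minimal sketch):

```typescript
// `!obj.prop` walks the prototype chain; getOwnPropertyDescriptor does not.
const parent: any = { test: ['fromParent'] };
const child: any = Object.create(parent);

console.log(!child.test);                                    // false: inherited
console.log(Object.getOwnPropertyDescriptor(child, 'test')); // undefined: not an own property

// The own-property check therefore creates a fresh array on `child`,
// shadowing the parent's array instead of pushing into it:
if (Object.getOwnPropertyDescriptor(child, 'test') == null) child.test = [];
child.test.push('fromChild');

console.log(child.test, parent.test); // [ 'fromChild' ] [ 'fromParent' ]
```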

    The final solution is:

    function f() {
        return (target, key) => {
            if (Object.getOwnPropertyDescriptor(target, 'test') == null) target.test = [];
            target.test.push(key);
    
            /*
             * test is now specific to this prototype, so append the
             * properties defined in the parent prototype as well.
             */
            let parentTarget = Object.getPrototypeOf(target);
            let parentData = parentTarget.test;
            if (parentData) {
                parentData.forEach(val => {
                    if (target.test.find(v => v == val) == null) target.test.push(val);
                });
            }
        };
    }
    

    Which outputs

    [ 'propB', 'propA' ] [ 'propC', 'propA' ]
    

    If anyone understands why this works while the first version doesn't, please enlighten me.