Tags: typescript, typescript-generics

How do I define an object with a field that references its own keys?


Is it possible to have a type that references keys of its own object in TypeScript?

In short, I'm trying to make a type for this object:

const items = {
  one: {
    dependencies: [],
  },
  two: {
    dependencies: ['one'],
  },
  three: {
    dependencies: ['one', 'two'],
  },
  four: {
    dependencies: ['one', 'shouldfail'],
  },
};

Where the typechecker only fails on element four. Is this possible?


Solution

  • There is no specific type in TypeScript that works this way. For specific types, you'd either have to write out the allowed keys manually, in which case you can't support arbitrary keys, or you'd have to allow any string key whatsoever, in which case you can't detect which keys were actually used in order to constrain anything.

    Instead, you'd need to define a type that's generic in the union of its keys K. It could look like this:

    type OwnDependency<K extends string> =
        { [P in K]: { dependencies: K[] } }
    

    That's a mapped type whose keys are of type K and whose property values are of type { dependencies: K[] }. This represents the relationship between the keys and the dependencies elements. For example, OwnDependency<"a" | "b"> evaluates to {a: {dependencies: Array<"a" | "b">}, b: {dependencies: Array<"a" | "b">}}.
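    Spelled out concretely, here's a quick sketch checking that evaluation (the alias `AB` and the sample value are names invented here for illustration):

```typescript
type OwnDependency<K extends string> =
    { [P in K]: { dependencies: K[] } };

type AB = OwnDependency<"a" | "b">;
// evaluates to:
// {
//     a: { dependencies: ("a" | "b")[] };
//     b: { dependencies: ("a" | "b")[] };
// }

// A conforming value: each dependencies array may only contain "a" or "b"
const ab: AB = {
    a: { dependencies: ["b"] },
    b: { dependencies: [] },
};
```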


    Unfortunately there is no inference for generic type arguments in generic types, so if you wanted to annotate a variable of type OwnDependency<K> you'd have to write out K manually, which is redundant and annoying:

    const i: OwnDependency<"a" | "b"> = {
        a: { dependencies: [] },
        b: { dependencies: ["a", "x"] } // error
        // --------------------> ~~~
    }; 
    

    That works, but you had to write "a" | "b".

    The only way to get generic type argument inference in TypeScript is with a generic function (not a generic type). So we can make things better by writing a helper generic identity function. All it does is return its input, but we can use it to infer K without writing it out.

    The idea is that you'd write

    const i = ownDependency({
        a: { dependencies: [] },
        b: { dependencies: ["a", "x"] } // error
        // --------------------> ~~~
    });
    // const i: OwnDependency<"a" | "b">
    

    and it would be just like the code above except that K is inferred for you. It's just about the same amount of writing for the user: const vbl: Type = value versus const vbl = type(value).

    So now we just need to implement ownDependency().


    As a first approach you could try this:

    const ownDependency = <K extends string>(
        o: OwnDependency<K>
    ): OwnDependency<K> => o;
    
    const i = ownDependency({
        a: { dependencies: [] },
        b: { dependencies: ["a", "x"] } // error
        //~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    });
    // const i: OwnDependency<"a" | "x">
    

    Oops! There's an error, which we want, but it's the wrong error. The compiler inferred K from the contents of dependencies instead of from the keys, so it's upset that there's a b property at all.

    To fix that we want to block inference from the array elements. There's a longstanding open feature request at microsoft/TypeScript#14829 for some way to say "please don't infer the type argument from this particular place". There was no native support for that at the time of writing, but there are ways to get similar behavior for some use cases.
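    (An aside: TypeScript 5.4 later shipped a built-in `NoInfer<T>` utility type for exactly this purpose. If your compiler is new enough, a sketch using it instead of a second type parameter:)

```typescript
type OwnDependency<K extends string> =
    { [P in K]: { dependencies: K[] } };

// NoInfer<K> (built into TypeScript 5.4+) keeps the array elements
// from contributing to the inference of K, so K comes from the keys alone.
const ownDependency = <K extends string>(
    o: { [P in K]: { dependencies: NoInfer<K>[] } }
): OwnDependency<K> => o;

const i = ownDependency({
    a: { dependencies: [] },
    b: { dependencies: ["a"] }, // adding "x" here would now error on "x", not on "b"
});
```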

    One way to do it is to add a second type parameter constrained to the first, as shown here:

    const ownDependency = <K extends string, L extends K>(
        o: { [P in K]: { dependencies: L[] } }
    ): OwnDependency<K> => o;
    

    Here the compiler will infer K from the keys of the input, and L from the elements of the dependencies array. So L doesn't affect the inference of K. But since L is constrained to K, the compiler will check it, and complain if there's a mismatch.


    Okay, let's test it:

    const i = ownDependency({
        a: { dependencies: [] },
        b: { dependencies: ["a", "x"] } // error
        // --------------------> ~~~
    });
    // const i: OwnDependency<"a" | "b">
    

    Perfect. The compiler inferred that i is of type OwnDependency<"a" | "b"> without us needing to write out "a" | "b" ourselves. And we get the same error as before: the "x" array element is called out as a mistake.
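    Applied back to the items object from the original question, the whole thing looks like this (a sketch; the "shouldfail" entry is left out so the rest compiles, with a comment marking where it would error):

```typescript
type OwnDependency<K extends string> =
    { [P in K]: { dependencies: K[] } };

// K is inferred from the keys; L is inferred from the array elements
// and merely checked against K.
const ownDependency = <K extends string, L extends K>(
    o: { [P in K]: { dependencies: L[] } }
): OwnDependency<K> => o;

const items = ownDependency({
    one: { dependencies: [] },
    two: { dependencies: ["one"] },
    three: { dependencies: ["one", "two"] },
    four: { dependencies: ["one"] }, // adding "shouldfail" here errors, as desired
});
```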


    Playground link to code