Tags: typescript, types, intersection, extends

Why is the generic type parameter inferred differently for an extended interface and for a type alias of an intersection of interfaces?


In the following toy experiment (simplified from a real-world example), why is the generic type parameter inferred differently depending on whether the generic function is called with an extended type or with an intersected type?

interface Base { b: number }
interface Extra { a: string }
interface Ext1 extends Extra { b: number }
type Ext2  = Base & Extra

// f returns a function that takes a T as input
const f = <T extends Base>(inp: T & Extra): ((arg: T) => void) => {
    return (arg: T) => console.log(inp.a + arg.b) 
}

const x1: Ext1 = { a: "x1", b: 1 }
const x2: Ext2 = { a: "y1", b: 2 } 

const f1 = f(x1) // T inferred to Ext1
const f2 = f(x2) // T inferred to Base, NOT Ext2 (why?)

const inp = { b: 3 }

// error Argument of type '{ b: number; }' is not assignable to parameter of type 'Ext1'. Property 'a' is missing in type '{ b: number; }' but required in type 'Ext1'.
const out1 = f1(inp) 

// ok since inp is of type Base
const out2 = f2(inp)

Playground Link


Solution

  • What you encountered here is not an issue with inference but rather a side effect of the elimination of redundant intersection members. Notice the & Extra in the type of inp. When f is passed a variable of type Ext2 at the call site, the type of inp essentially becomes Base & Extra & Extra.
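
    You can watch this reduction in isolation (a minimal sketch; the Reduced alias is hypothetical, named only so there is something to hover over in an editor):

    interface Base { b: number }
    interface Extra { a: string }

    // The duplicate constituent is dropped when the intersection type is built:
    // hovering over Reduced shows 'Base & Extra', not 'Base & Extra & Extra'.
    type Reduced = Base & Extra & Extra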

    Since identical types are eliminated from intersections, the type of inp actually becomes Base & Extra, and the type parameter T is then inferred as Base, which satisfies the extends Base constraint. And indeed, if you remove the intersection with Extra from the parameter type (widening the constraint so that inp.a still type-checks), you will observe the expected inference:

    interface Base { b: number }
    interface Extra { a: string }
    interface Ext1 extends Extra { b: number }
    type Ext2  = Base & Extra
    
    // f returns a function that takes a T as input.
    // The constraint is widened to Base & Extra so that inp.a still type-checks.
    const f = <T extends Base & Extra>(inp: T): ((arg: T) => void) => {
        return (arg: T) => console.log(inp.a + arg.b)
    }
    
    const x1: Ext1 = { a: "x1", b: 1 }
    const x2: Ext2 = { a: "y1", b: 2 } 
    
    const f1 = f(x1) // T inferred to Ext1
    const f2 = f(x2) // T inferred to Ext2
    
    const inp = { b: 3 }
    
    const out1 = f1(inp) // error: property 'a' is missing (f1 expects Ext1)
    const out2 = f2(inp) // error: property 'a' is missing (f2 now expects Ext2)
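
    For contrast, here is the original T & Extra signature boiled down to a declaration-only sketch (reusing the declarations above; the probe name is hypothetical), so both inference outcomes can be seen side by side:

    declare function probe<T extends Base>(inp: T & Extra): T

    declare const viaInterface: Ext1    // an interface extending Extra
    declare const viaIntersection: Ext2 // the intersection Base & Extra

    const r1 = probe(viaInterface)    // const r1: Ext1
    const r2 = probe(viaIntersection) // const r2: Base (Extra was matched away)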
    

    With that out of the way, let's clear up a small misconception. Extending an interface works similarly to intersecting two interfaces, but they are not the same. extends means that the left-hand side type (interface) is a subtype (in other terms, is narrower), while the right-hand side type (interface) is a supertype (is wider).
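
    The direction of that relationship is easy to check with assignability (a minimal sketch; the Wide and Narrow names are hypothetical):

    interface Wide { a: string }
    interface Narrow extends Wide { b: number }

    declare const narrow: Narrow
    declare const wide: Wide

    const w: Wide = narrow  // ok: a subtype is assignable to its supertype
    const n: Narrow = wide  // error: property 'b' is missing in type 'Wide'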

    By contrast, an intersection creates a combination of types. Take a look at the example below to see the crucial difference between extends and & in action:

    interface A { a: string, b: boolean }
    interface B { a: number, b: boolean }
    interface C extends A, B {} // error: property 'a' of A and B is not identical
    
    type a = { a: string, b: boolean }
    type b = { a: number, b: boolean }
    type c = a & b; // no error, but property 'a' becomes never (string & number)
    

    That is precisely why nothing happens when you intersect Ext1 with Extra: there are no identical types to eliminate, only Ext1 (a subtype of Extra) and Extra (its supertype). Since nothing is matched away, inference sees the whole source type, and T is inferred as Ext1.
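
    A minimal sketch of that last point (the Kept and Dropped aliases are hypothetical):

    // Ext1 and Extra are related but not identical, so both members survive:
    // hovering over Kept shows 'Ext1 & Extra'.
    type Kept = Ext1 & Extra

    // Only exact duplicates are eliminated; hovering over Dropped shows just 'Extra'.
    type Dropped = Extra & Extra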

    Playground