Tags: c++, templates, lambda, type-conversion

Use type_identity_t in a lambda argument


I'm trying to reimplement Logan Smith's implicit_cast from his Cursed C++ Casts video so that it can be used within an expression. My approach is to combine an anonymous function with type_identity_t:

#include <iostream>
using namespace std;
int main() {
  int i = 1;
  long j = 4;
  cout << max(j, []<class T>(type_identity_t<T> x) -> T { return x; }(i));
}

But it doesn't compile. The compiler (gcc 13.2) complains:

<source>: In function 'int main()':
<source>:6:70: error: no match for call to '(main()::<lambda(std::type_identity_t<T>)>) (int&)'
    6 |   cout << max(j, []<class T>(type_identity_t<T> x) -> T { return x; }(i));
      |                  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~
<source>:6:18: note: candidate: 'template<class T> main()::<lambda(std::type_identity_t<T>)>'
    6 |   cout << max(j, []<class T>(type_identity_t<T> x) -> T { return x; }(i));
      |                  ^
<source>:6:18: note:   template argument deduction/substitution failed:
<source>:6:70: note:   couldn't deduce template parameter 'T'
    6 |   cout << max(j, []<class T>(type_identity_t<T> x) -> T { return x; }(i));
      |                  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~

Is it salvageable? Or should I switch to some other approach?


Solution

  • As the video explains, implicit_cast must be called with an explicit template argument specifying the target type, and your lambda call lacks one. The whole point of type_identity_t<T> is to put T in a non-deduced context: the compiler won't deduce T from the argument, which forces the caller to specify the template argument explicitly. It has no other functional effect.

    To call the lambda with an explicit template argument you need to write

    []<class T>(type_identity_t<T> x) -> T { return x; }.operator()</*target type*/>(i)
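
    Putting it together, here is a complete version of the original snippet with the explicit template argument supplied (a minimal sketch; long is chosen as the target type to match j), shown next to the named implicit_cast function template from the video for comparison:

    ```cpp
    #include <algorithm>
    #include <iostream>
    #include <type_traits>

    // The named form from the video: T sits in a non-deduced
    // context, so the caller must spell it out explicitly.
    template <class T>
    constexpr T implicit_cast(std::type_identity_t<T> x) { return x; }

    int main() {
      int i = 1;
      long j = 4;

      // Lambda form: supply T by invoking operator() directly.
      std::cout << std::max(j,
                            []<class T>(std::type_identity_t<T> x) -> T {
                              return x;
                            }.operator()<long>(i))
                << '\n';  // prints 4

      // Named form reads more naturally at the call site.
      std::cout << std::max(j, implicit_cast<long>(i)) << '\n';  // prints 4
    }
    ```

    The awkward .operator()<long>(i) spelling is unavoidable for a lambda, since there is no syntax to pass explicit template arguments through the ordinary call form; if the cast is used more than once, the named function template is the more readable choice.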