Why does the implicit type conversion not work in template deduction?


In the following code, I want to call a template function by implicitly converting an int to a Scalar<int> object.

#include <iostream>
using namespace std;

template<typename Dtype>
class Scalar{
public:
  Scalar(Dtype v) : value_(v){}
private:
  Dtype value_;
};

template<typename Dtype>
void func(int a, Scalar<Dtype> b){
   cout << "ok" << endl;
}

int main(){
  int a = 1;
  func(a, 2);
  //int b = 2;
  //func(a, b);
  return 0;
}

Why does template argument deduction/substitution fail here? The commented-out code fails in the same way.

test.cpp: In function ‘int main()’:
test.cpp:19:12: error: no matching function for call to ‘func(int&, int)’
   func(a, 2);
            ^
test.cpp:19:12: note: candidate is:
test.cpp:13:6: note: template<class Dtype> void func(int, Scalar<Dtype>)
 void func(int a, Scalar<Dtype> b){
      ^
test.cpp:13:6: note:   template argument deduction/substitution failed:
test.cpp:19:12: note:   mismatched types ‘Scalar<Dtype>’ and ‘int’
   func(a, 2);


Because template argument deduction is not that smart: by design, it does not consider user-defined conversions, and int -> Scalar<int> is a user-defined conversion (via the converting constructor Scalar(Dtype)). Deduction requires the argument type to match the parameter pattern Scalar<Dtype> directly, and int does not.

If you want to rely on template argument deduction, you need to convert the argument at the call site:

func(a, Scalar<int>{2});  

or rely on a deduction guide¹ for Scalar and call func:

func(a, Scalar{2}); // C++17 only 

Alternatively, you can explicitly specify the template argument for func, so that no deduction takes place and the implicit conversion is allowed:

func<int>(a, 2);  

¹ The implicitly generated deduction guide is sufficient; no user-written guide is needed.

