
Emulate aggregate initialization for recursively-defined array-like class


Consider:

#include <cstddef>

template <std::size_t r,std::size_t d>
struct Tensor
{
  Tensor<r-1u,d> Data[d];
};

template <std::size_t d>
struct Tensor<0u,d>
{
  double Data;
};

We can use copy-list-initialization to initialize such a Tensor this way:

Tensor<2u,3u> t= { 1.0, 2.0, 3.0,
                   4.0, 5.0, 6.0,
                   7.0, 8.0, 9.0 };

Note the brace elision.
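With every brace written out explicitly against the definitions above, the same initialization reads as follows (a self-contained sketch for illustration):

```cpp
#include <cstddef>

template <std::size_t r, std::size_t d>
struct Tensor
{
  Tensor<r-1u,d> Data[d];
};

template <std::size_t d>
struct Tensor<0u,d>
{
  double Data;
};

// Fully-braced equivalent of the elided form: one pair for
// Tensor<2,3>, one for its Data array, one per Tensor<1,3> row,
// one for that row's array, and one per Tensor<0,3> element.
Tensor<2u,3u> t = { { { { {1.0}, {2.0}, {3.0} } },
                     { { {4.0}, {5.0}, {6.0} } },
                     { { {7.0}, {8.0}, {9.0} } } } };
```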

This also works in a generic programming context, e.g.:

template <typename... T,
  typename= std::enable_if_t<(std::is_same_v<T,double> && ...) && (sizeof...(T)==9u)>>
Tensor<2u,3u> MakeTensor(T... x) { return {x...}; }
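For reference, a complete compilable version of that factory might look like this (C++17; the commented-out calls illustrate what the SFINAE guard rejects):

```cpp
#include <cstddef>
#include <type_traits>

template <std::size_t r, std::size_t d>
struct Tensor
{
  Tensor<r-1u,d> Data[d];
};

template <std::size_t d>
struct Tensor<0u,d>
{
  double Data;
};

// Accepts exactly nine double arguments; anything else is removed
// from overload resolution by the enable_if_t guard.
template <typename... T,
  typename = std::enable_if_t<(std::is_same_v<T,double> && ...) && (sizeof...(T)==9u)>>
Tensor<2u,3u> MakeTensor(T... x) { return {x...}; }

Tensor<2u,3u> t = MakeTensor(1.0, 2.0, 3.0,
                             4.0, 5.0, 6.0,
                             7.0, 8.0, 9.0);
// MakeTensor(1.0, 2.0);     // ill-formed: wrong arity
// MakeTensor(1, 2, 3, 4, 5, 6, 7, 8, 9); // ill-formed: int, not double
```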

However, if Data were private, Tensor would no longer be an aggregate, and the above syntax would no longer be valid.

In that case, could this behaviour be recovered programmatically?


Solution

  • Aggregate initialization will only work for aggregates, so no, strictly speaking, that's not possible.

    You could emulate it either by providing an initializer_list or variadic-template constructor, or via a constructor taking an aggregate version of your tensor data structure that exists only for initialization purposes, something like:

    #include <array>
    #include <cstddef>
    #include <type_traits>
    
    template <std::size_t r,std::size_t d>
    struct RawTensor
    {
      std::array<RawTensor<r-1u,d>,d> data;
    };
    
    template <std::size_t d>
    struct RawTensor<0u,d>
    {
      double data;
    };
    
    template <std::size_t r,std::size_t d>
    struct Tensor
    {
      Tensor( RawTensor<r,d> const& data ): data_{data}{}
    
    private:
      RawTensor<r,d> data_;
    };
    
    template <typename... T,
    typename= std::enable_if_t<(std::is_same_v<T,double> && ...) && (sizeof...(T)==9u)>>
    RawTensor<2u,3u> MakeTensor(T... x) { return {x...}; }
    
    Tensor<2u,3u> t1 = RawTensor<2u,3u>{ 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0 };
    Tensor<2u,3u> t2 = MakeTensor( 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0 );
    

    and letting the compiler optimize it. If your original intent was to exploit chaining of operator[] (and hence you need the nested data to be of Tensor type), that is still possible, but it requires less trivial initialization logic.
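One way to sketch that route (hypothetical names; note the deviation from the code above: the rank-1 specialization stores doubles directly so that chained indexing can bottom out at double&, and operator[] returns RawTensor references rather than nested Tensors):

```cpp
#include <array>
#include <cstddef>

template <std::size_t r, std::size_t d>
struct RawTensor
{
  std::array<RawTensor<r-1u,d>,d> data;
  // Member functions do not affect aggregate status.
  RawTensor<r-1u,d>&       operator[](std::size_t i)       { return data[i]; }
  RawTensor<r-1u,d> const& operator[](std::size_t i) const { return data[i]; }
};

// Rank 1 bottoms out at plain doubles so t[i][j] yields double&.
template <std::size_t d>
struct RawTensor<1u,d>
{
  std::array<double,d> data;
  double&       operator[](std::size_t i)       { return data[i]; }
  double const& operator[](std::size_t i) const { return data[i]; }
};

template <std::size_t r, std::size_t d>
class Tensor
{
public:
  Tensor( RawTensor<r,d> const& data ): data_{data} {}

  // Chained indexing forwards into the private raw storage.
  decltype(auto) operator[](std::size_t i)       { return data_[i]; }
  decltype(auto) operator[](std::size_t i) const { return data_[i]; }

private:
  RawTensor<r,d> data_;
};

Tensor<2u,3u> t{ RawTensor<2u,3u>{ 1.0, 2.0, 3.0,
                                   4.0, 5.0, 6.0,
                                   7.0, 8.0, 9.0 } };
```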