
Data transfer between LibTorch C++ and Eigen (Questions and Help)

Hello all, I'm developing Data Transfer Tools for C++ Linear Algebra Libraries, as you can see here: https://github.com/andrewssobral/dtt (considering two-dimensional arrays or matrices), and I'm wondering if you can help me with the following code for data transfer between LibTorch and Eigen:

std::cout << "Testing LibTorch to Eigen:" << std::endl;
// LibTorch
torch::Device device(torch::cuda::is_available() ? torch::kCUDA : torch::kCPU);
torch::Tensor T = torch::rand({3, 3});
std::cout << "LibTorch:" << std::endl;
std::cout << T << std::endl;
// Eigen
float* data = T.data_ptr<float>();
Eigen::Map<Eigen::MatrixXf> E(data, T.size(0), T.size(1));
std::cout << "EigenMat:\n" << E << std::endl;
// re-check after changes
E(0,0) = 0;
std::cout << "EigenMat:\n" << E << std::endl;
std::cout << "LibTorch:" << std::endl;
std::cout << T << std::endl;

This is the output of the code:

--------------------------------------------------
Testing LibTorch to Eigen:

LibTorch:
 0.6232  0.5574  0.6925
 0.7996  0.9860  0.1471
 0.4431  0.5914  0.8361
[ Variable[CPUFloatType]{3,3} ]

EigenMat (after data transfer):
0.6232 0.7996 0.4431
0.5574  0.986 0.5914
0.6925 0.1471 0.8361

# Modifying EigenMat, set element at (0,0) = 0
EigenMat:
     0 0.7996 0.4431
0.5574  0.986 0.5914
0.6925 0.1471 0.8361

# Now, the LibTorch matrix was also modified (OK), but the rows and columns were switched.
LibTorch:
 0.0000  0.5574  0.6925
 0.7996  0.9860  0.1471
 0.4431  0.5914  0.8361
[ Variable[CPUFloatType]{3,3} ]

Does anyone know what's happening? Is there a better way to do that?

I also need to do the same for Armadillo, ArrayFire, and OpenCV (cv::Mat). Thanks in advance!


Solution

  • The reason for the switched rows and columns is that LibTorch (apparently) uses row-major storage, while Eigen uses column-major storage by default. I don't know if you can change the behavior of LibTorch, but with Eigen you can also use row-major storage, like so (a complete sketch putting the pieces together follows after the code below):

    typedef Eigen::Matrix<float, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor> MatrixXf_rm; // same as MatrixXf, but with row-major memory layout
    

    and then use it like this:

    Eigen::Map<MatrixXf_rm> E(data, T.size(0), T.size(1));
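
    For reference, here is a minimal, self-contained sketch that puts the typedef and the map together. The headers, the main() wrapper, and the comments are my additions; the tensor/map logic is the same as in the snippets above.

    #include <torch/torch.h>
    #include <Eigen/Dense>
    #include <iostream>

    // Row-major Eigen matrix type so its memory layout matches LibTorch's
    // default (C-contiguous) storage.
    typedef Eigen::Matrix<float, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor> MatrixXf_rm;

    int main() {
      // Contiguous CPU float tensor (Eigen::Map needs a raw, contiguous buffer).
      torch::Tensor T = torch::rand({3, 3});

      // Wrap the tensor's buffer without copying; E and T share the same memory.
      Eigen::Map<MatrixXf_rm> E(T.data_ptr<float>(), T.size(0), T.size(1));

      std::cout << "LibTorch:\n" << T << std::endl;
      std::cout << "Eigen:\n" << E << std::endl;

      // Writing through the map is visible in the tensor, and the element
      // positions now agree between the two views.
      E(0, 0) = 0;
      std::cout << "LibTorch after E(0,0) = 0:\n" << T << std::endl;
      return 0;
    }

    Note that this only works while the tensor is contiguous and on the CPU; if that is not guaranteed, calling something like T = T.to(torch::kCPU).contiguous(); before taking data_ptr<float>() is the usual precaution. For the opposite direction (Eigen to LibTorch), torch::from_blob can wrap an existing row-major buffer as a tensor without copying, keeping in mind that it does not take ownership of the memory.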