Tags: python, matrix, cuda, pycuda, cublas

cuBLAS Dgemm product with python


I have two simple matrices A and B and I'm calculating their matrix product. The arrays look like this (using numpy as a mockup):

A=np.array(([1,2,3],[4,5,6])).astype(np.float64)
B=np.array(([7,8],[9,10],[11,12])).astype(np.float64)

Here are the shapes of the matrices:

A: (2, 3)

B: (3, 2)

Now, I am trying to do this using cublasDgemmBatched to get the product.

I am confused about what my m, n, and k values should be when calling cublasDgemmBatched. I'm also not sure what the leading dimensions (lda, ldb, ldc) of the arrays should be.

There is a nice 3D example here, but I can't seem to get this function to work on 2D matrices.

Ideally, I would like to get the same results as np.dot.
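For reference, here is the result I'm trying to reproduce, using the A and B defined above:

np.dot(A, B)
# -> [[ 58.,  64.],
#     [139., 154.]]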


Solution

  • I don't have skcuda.cublas available to confirm this, but a more complete example might look like:

    import numpy as np
    import pycuda.autoinit          # initializes the CUDA context
    import pycuda.gpuarray as gpuarray
    import skcuda.cublas as cublas
    
    A = np.array(([1, 2, 3], [4, 5, 6])).astype(np.float64)
    B = np.array(([7, 8], [9, 10], [11, 12])).astype(np.float64)
    
    m, k = A.shape   # A is (m x k)
    k, n = B.shape   # B is (k x n)
    
    a_gpu = gpuarray.to_gpu(A)
    b_gpu = gpuarray.to_gpu(B)
    c_gpu = gpuarray.empty((m, n), np.float64)
    
    alpha = np.float64(1.0)
    beta = np.float64(0.0)
    
    cublas_handle = cublas.cublasCreate()
    
    # cuBLAS expects column-major (Fortran-order) storage, while numpy arrays
    # are row-major. Computing C^T = B^T * A^T by swapping the two operands
    # gives the row-major product directly, so m and n are swapped in the call
    # and the leading dimensions are the row widths of the numpy arrays:
    # lda = n for B, ldb = k for A, ldc = n for C.
    cublas.cublasDgemm(cublas_handle, 'n', 'n',
                       n, m, k, alpha,
                       b_gpu.gpudata, n,
                       a_gpu.gpudata, k,
                       beta, c_gpu.gpudata, n)
    
    cublas.cublasDestroy(cublas_handle)
    
    print(c_gpu.get())   # should match np.dot(A, B)
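
  • Since the question asks about cublasDgemmBatched specifically, here is an untested sketch of the batched variant for the same two matrices, treated as a batch of size 1. It assumes scikit-cuda's cublasDgemmBatched wrapper and a bptrs helper (along the lines of the one in the linked 3D example) that builds the array of per-batch device pointers; the same operand-swap trick applies to each batch entry.

    import ctypes
    
    import numpy as np
    import pycuda.autoinit          # initializes the CUDA context
    import pycuda.gpuarray as gpuarray
    import skcuda.cublas as cublas
    
    def bptrs(a):
        # Array of device pointers, one per batch entry (assumes each entry
        # occupies a contiguous block along the first axis).
        return gpuarray.arange(a.ptr, a.ptr + a.shape[0] * a.strides[0],
                               a.strides[0], dtype=ctypes.c_void_p)
    
    A = np.array(([1, 2, 3], [4, 5, 6])).astype(np.float64)
    B = np.array(([7, 8], [9, 10], [11, 12])).astype(np.float64)
    
    m, k = A.shape
    k, n = B.shape
    
    # Add a leading batch dimension of size 1.
    a_gpu = gpuarray.to_gpu(A[None, :, :].copy())   # (1, m, k)
    b_gpu = gpuarray.to_gpu(B[None, :, :].copy())   # (1, k, n)
    c_gpu = gpuarray.empty((1, m, n), np.float64)
    
    a_arr = bptrs(a_gpu)
    b_arr = bptrs(b_gpu)
    c_arr = bptrs(c_gpu)
    
    alpha = np.float64(1.0)
    beta = np.float64(0.0)
    
    handle = cublas.cublasCreate()
    
    # Same swap as above (C^T = B^T * A^T per batch entry), with batchCount = 1.
    cublas.cublasDgemmBatched(handle, 'n', 'n',
                              n, m, k, alpha,
                              b_arr.gpudata, n,
                              a_arr.gpudata, k,
                              beta, c_arr.gpudata, n,
                              1)
    
    cublas.cublasDestroy(handle)
    
    print(c_gpu.get()[0])   # should again match np.dot(A, B)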