
batch processing of multiplication of a column vector and a row vector in mini-batch learning


In MATLAB, I want to multiply a column vector A with a row vector B; the result is a matrix. Now suppose I want to multiply A1, A2, ..., An by B1, B2, ..., Bn in a batch. That is, I want to take advantage of MATLAB's parallel processing capability on matrices by somehow turning this into a matrix multiplication problem. By the way, parfor does not work in my case.

The reason I want to do this is that I want to implement mini-batch learning in MATLAB. Suppose there are n training examples in a mini-batch. When I backpropagate the error, the errors for all cases in the mini-batch are n row vectors B1, B2, ..., Bn, and the corresponding gradients are n column vectors A1, A2, ..., An. I want to multiply them to obtain the weight increments for all n cases.
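For concreteness, the straightforward loop version of this computation might look like the following sketch (the sizes and the matrices `A` and `B` collecting the vectors column-wise are assumptions for illustration):

```matlab
% Hypothetical loop version: n outer products, one per mini-batch example.
% A is m-by-n (columns are the gradient column vectors Ai).
% B is p-by-n (column i, transposed, is the error row vector Bi).
m = 4; p = 3; n = 5;
A = rand(m, n);
B = rand(p, n);

Out = zeros(m, p, n);
for i = 1:n
    % Outer product of column vector Ai with row vector Bi
    Out(:,:,i) = A(:,i) * B(:,i).';
end
```

Each slice `Out(:,:,i)` is then the incremental weight matrix for training example i; the question is how to compute all n slices without the explicit loop.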


Solution

  • Say you have a matrix A (resp. B) whose columns are your n vectors A1, A2, ..., An (resp. B1, B2, ..., Bn).

    Your program should output n matrices. To vectorize this, increase the number of dimensions of your arrays by one (here, from 2-D matrices to 3-D arrays), so that the i-th slice along the third dimension holds the vectors Ai and Bi respectively. Then you can use bsxfun with the @times function handle:

    n=size(A,2);
    rA=reshape(A,[],1,n); % m-by-1-by-n: column Ai becomes slice rA(:,:,i)
    rB=reshape(B,1,[],n); % 1-by-p-by-n: column Bi becomes the row slice rB(:,:,i)
    
    % use bsxfun to compute the n outer products in one call
    Out=bsxfun(@times, rA,rB);
    
    % Now Out is a 3-D array whose slices at constant z
    % are the output matrices you want
    
    % The trick here is that matrix multiplication
    % of a column vector Ai with a row vector Bi is equal to elementwise
    % multiplication of the matrix [Ai Ai ... Ai] with the matrix [Bi;Bi;...;Bi],
    % and that's what the call to bsxfun does
    % (see the documentation on "singleton expansion")
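As a quick sanity check, the bsxfun result can be compared slice by slice against explicit outer products. Note also that on MATLAB R2016b and later the same singleton expansion happens implicitly with the `.*` operator, so the bsxfun call can be replaced by a plain elementwise product (the sizes below are made up for illustration):

```matlab
% Verify the bsxfun approach against explicit outer products.
m = 4; p = 3; n = 5;
A = rand(m, n);   % columns A1..An
B = rand(p, n);   % columns B1..Bn (each used as a row vector)

rA = reshape(A, [], 1, n);   % m-by-1-by-n
rB = reshape(B, 1, [], n);   % 1-by-p-by-n
Out = bsxfun(@times, rA, rB);

% Implicit expansion (R2016b+) gives the same result without bsxfun:
Out2 = rA .* rB;

for i = 1:n
    % Each slice equals the outer product Ai * Bi
    assert(norm(Out(:,:,i) - A(:,i)*B(:,i).') == 0);
end
assert(isequal(Out, Out2));
```

Summing `Out` along the third dimension (`sum(Out, 3)`) then gives the accumulated weight update over the whole mini-batch, if that is the quantity ultimately needed.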