
Creating a Binary Matrix from an Array of Indices


Definition of Problem

I have two arrays, weights and indices:

 weights = [1, 3, 2]; 
 indices = [1, 1, 2, 3, 2, 4]; 

 m = 4; % Number of Rows in Matrix
 n = 3; % Number of Columns in Matrix
 M = zeros(m, n);

The array indices stores, column by column, the row positions where a 1 needs to be placed; the array weights tells how many of those positions belong to each column.

For instance, for the first column I need to store a 1 at row 1, which is given by indices(1); the number of entries for this column is given by weights(1), which equals 1.

  M(indices(1), 1) = 1;

For column 2, I need to store a 1 at rows 1 to 3 (indices(2:4)). The number of entries for column 2 is again given by weights(2), which equals 3.

  M(indices(2:4),2) = 1;

Similarly, for column 3, I need to store a 1 at rows 2 and 4 (indices(5:6)). The number of entries for column 3 is given by weights(3), which equals 2.

  M(indices(5:6),3) = 1; 

Expected Binary Matrix

The expected and resulting binary matrix is:

 1 1 0
 0 1 1
 0 1 0
 0 0 1
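Putting the three hard-coded assignments above together gives a quick check that reproduces this matrix (using only the statements already shown):

 weights = [1, 3, 2];
 indices = [1, 1, 2, 3, 2, 4];

 m = 4; n = 3;
 M = zeros(m, n);

 M(indices(1), 1)   = 1;  % column 1: row 1
 M(indices(2:4), 2) = 1;  % column 2: rows 1, 2, 3
 M(indices(5:6), 3) = 1;  % column 3: rows 2, 4

 disp(M);                 % matches the expected binary matrix above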

Question

Is there a way to do this in a generalized manner, using both the weights and indices arrays, rather than hard-coding the assignments to create the binary matrix M?


Solution

  • You just have an unusual way of describing your indices, so all you need to do is convert them to something standard:

    column_idx = repelem(1:n, weights);          % repeat each column number according to its weight
    row_idx = indices;                           % row indices, renamed for clarity
    M(sub2ind([m, n], row_idx, column_idx)) = 1; % convert (row, column) pairs to linear indices
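For comparison, the same result can be sketched with an explicit loop, using cumsum on weights to find where each column's block of indices starts and ends (the variable names starts and ends here are illustrative, not from the original):

    ends   = cumsum(weights);        % last position of each column's block in indices
    starts = ends - weights + 1;     % first position of each column's block
    M = zeros(m, n);
    for col = 1:n
        M(indices(starts(col):ends(col)), col) = 1;
    end

For weights = [1, 3, 2] this gives starts = [1, 2, 5] and ends = [1, 4, 6], i.e. exactly the ranges indices(1), indices(2:4), and indices(5:6) used in the hard-coded version. The vectorized repelem/sub2ind answer does the same partitioning without the loop.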