I am trying to get rid of the for loop and instead do an array-matrix multiplication to decrease the processing time when the weights
array is very large:
import numpy as np
sequence = [np.random.random(10), np.random.random(10), np.random.random(10)]
weights = np.array([[0.1,0.3,0.6],[0.5,0.2,0.3],[0.1,0.8,0.1]])
Cov_matrix = np.matrix(np.cov(sequence))
results = []
for w in weights:
    result = np.matrix(w) * Cov_matrix * np.matrix(w).T
    results.append(result.A)
Where: Cov_matrix is a 3x3 matrix, and weights is an array of length n containing n 1x3 matrices.
Is there a way to multiply/map weights to Cov_matrix and bypass the for loop? I am not very familiar with all the numpy functions.
I'd like to reiterate what's already been said in another answer: the np.matrix class has far more disadvantages than advantages these days, and I suggest moving to the np.array class alone. Matrix multiplication of arrays can be written with the @ operator, so the notation is in most cases as elegant as for the matrix class (and arrays don't have several restrictions that matrices do).
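To make that concrete, here is a minimal sketch of the question's loop rewritten with plain arrays and @ (the seeded random generator is my addition, purely for reproducibility; for a 1d array w, transposing is a no-op, so w @ cov @ w is the whole quadratic form):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded so the run is reproducible
sequence = [rng.random(10) for _ in range(3)]
weights = np.array([[0.1, 0.3, 0.6], [0.5, 0.2, 0.3], [0.1, 0.8, 0.1]])

cov = np.cov(sequence)  # plain 3x3 ndarray, no np.matrix wrapper needed

# The original loop with @ instead of np.matrix multiplication;
# each w is 1d, so w @ cov @ w yields a plain scalar, not a (1,1) matrix.
results = [w @ cov @ w for w in weights]
```

Each entry of results is now a scalar rather than a (1,1) array, which is usually what you want downstream.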
With that out of the way, what you need can be done with a single call to np.einsum. We need to contract certain indices of three matrices while keeping one index alone in two of them. That is, we want to perform w_{ij} * Cov_{jk} * w.T_{ki} with a summation over j and k, giving us an array indexed by i. The following call to einsum will do:
res = np.einsum('ij,jk,ik->i', weights, Cov_matrix, weights)
Note that the above gives you a single 1d array, whereas you originally had a list of arrays, each with shape (1,1). I suspect the 1d result makes even more sense for your use case. Also note that I omitted the transpose in the second weights argument, which is why the corresponding summation indices appear as ik rather than ki; this should be marginally faster.
To prove that the above gives the same result:
In [8]: results # original
Out[8]: [array([[0.02803215]]), array([[0.02280609]]), array([[0.0318784]])]
In [9]: res # einsum
Out[9]: array([0.02803215, 0.02280609, 0.0318784 ])
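Since the original motivation was speed for a very large weights array, here is a sketch of how you might check both the equivalence and the timing yourself; the sizes and repetition count are arbitrary choices on my part, and actual speedups will depend on your hardware and array shapes:

```python
import numpy as np
from timeit import timeit

rng = np.random.default_rng(0)
n = 2_000                         # many weight rows, mimicking "very large"
weights = rng.random((n, 3))
cov = np.cov(rng.random((3, 100)))  # a 3x3 covariance matrix

def loop_version():
    # Python-level loop, one small matmul per row
    return [w @ cov @ w for w in weights]

def einsum_version():
    # single vectorized contraction over all rows at once
    return np.einsum('ij,jk,ik->i', weights, cov, weights)

# The two approaches agree numerically...
assert np.allclose(loop_version(), einsum_version())

# ...and you can time them to see the benefit of avoiding the loop:
print('loop  :', timeit(loop_version, number=10))
print('einsum:', timeit(einsum_version, number=10))
```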