
Returning 0 for a value that doesn't exist


I'm trying to build a sparse matrix class, and let's say I initialize it like this:

m = Sparse_Matrix(3,3, (0,0,0),(0,1,10),(0,2,11),(1,0,9),(1,2,8),(2,0,7),(2,1,8))

If I call m.row(0), it should return all the values in that row:

m.row(0) -> (0,10,11)

However, the sparse matrix doesn't store any value for (1,1), and I'm supposed to return 0 for that index. How would I go about this? For example, if I call:

m.row(1) -> (9,0,8)

Here is my code:

def row(self, value:int):
    row_list = []
    for key in self.list_of_tuples:
        print(key)
        if value == key[0]:
            row_list.append(self.matrix.get((key[0], key[1]),0))
            print(row_list)
    return tuple(row_list)

In my code, self.matrix is a dictionary whose keys are (row, column) index tuples and whose values are the entries stored at those indices. So for example, I would have

{(0,1):10, (0,2):11, (1,0):9, (1,2):8, (2,0):7, (2,1):8}

If the value for a particular index is 0, then I don't add it to the dictionary.
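
For illustration, looking up a missing index with dict.get and a default of 0 already produces the right value on the dictionary alone (d here is just a stand-in for self.matrix):

d = {(0,1):10, (0,2):11, (1,0):9, (1,2):8, (2,0):7, (2,1):8}
print(d.get((1,1), 0))  # 0, since (1,1) was never stored
print(d.get((0,1), 0))  # 10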

EDIT: I am not allowed to use scipy.sparse for this.


Solution

  • Since you didn't post the rest of your class, I don't know what the internal implementation details look like, but how about this:

    class SparseMatrix:
        def __init__(self, rows: int, cols: int, *entries):
            self.rows = rows
            self.cols = cols
            self.matrix = dict()
            # Key each entry by its (row, col) pair. Zero-valued entries
            # could be skipped here to keep the dictionary sparse, but
            # storing them does no harm since row() falls back to 0 anyway.
            for entry in entries:
                self.matrix[(entry[0], entry[1])] = entry[2]

        def row(self, row_num: int):
            # Walk every column index; dict.get supplies 0 for any
            # (row, col) key that was never stored.
            return [self.matrix.get((row_num, i), 0) for i in range(self.cols)]
    

    Then:

    >>> m = SparseMatrix(3, 3, (0,0,0), (0,1,10), (0,2,11), (1,0,9), (1,2,8), (2,0,7), (2,1,8))
    >>> m.row(0)
    [0, 10, 11]
    >>> m.row(1)
    [9, 0, 8]