I am implementing a GUI for a Python program using Qt5 and PySide2. I have no problem understanding the C++ side of Qt, so feel free to point me to Qt references not related to Python.
I have some data that I display in a QTableView using a subclass of QAbstractTableModel. Because the underlying dataset is very large, I also use a subclass of QSortFilterProxyModel to filter the table: the user can choose to display only the subset of rows matching some criteria. This all works really well.
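For context, a minimal sketch of that kind of proxy setup (the class name, the criteria attribute, and the matching rule are hypothetical, not from the original code):

from PySide2.QtCore import QSortFilterProxyModel

class CandidateFilterProxy(QSortFilterProxyModel):
    """Hides source rows that do not match the user-chosen criteria."""
    def __init__(self, parent=None):
        super().__init__(parent)
        self._criteria = None  # hypothetical filter state

    def setCriteria(self, criteria):
        self._criteria = criteria
        self.invalidateFilter()  # re-evaluates filterAcceptsRow for all rows

    def filterAcceptsRow(self, source_row, source_parent):
        if self._criteria is None:
            return True  # no criteria set: show everything
        # Placeholder test against column 0 of the source model
        index = self.sourceModel().index(source_row, 0, source_parent)
        return self._criteria in str(self.sourceModel().data(index))

The proxy is then inserted between the model and the view, e.g. proxy.setSourceModel(tableModel) followed by self.ui.candidatesTable.setModel(proxy), so the view only ever sees the filtered rows.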
Then I have configured the QTableView so that the user can only select complete rows:

self.ui.candidatesTable.setSelectionBehavior(QTableView.SelectRows)
And in the object handling the UI I have implemented a slot that is called when the selection in the table changes:
@Slot(QItemSelection)
def handleSelectionChanged(self, item):
    hasSelection = self.ui.candidatesTable.selectionModel().hasSelection()
    if hasSelection:
        selectedRows = self.ui.candidatesTable.selectionModel().selectedRows()
        for row in selectedRows:
            print(row.row())
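For completeness, a slot like this is presumably connected to the view's selection model along these lines (the selectionChanged signal also carries a second QItemSelection with the deselected items, which Qt allows a slot to omit):

self.ui.candidatesTable.selectionModel().selectionChanged.connect(
    self.handleSelectionChanged)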
My problem is that the value printed by print(row.row()) is the row index within the currently displayed rows. If the user's filtering criteria leave only 5 rows visible out of several thousand and the user then selects the first row, print(row.row()) returns 0, not the row's original index in the underlying QAbstractTableModel.
My question is therefore the following: how can I access the original index in this situation?
You have to map each QModelIndex from the proxy model back to the source model, using the proxy's mapToSource() method:
@Slot(QItemSelection)
def handleSelectionChanged(self, item):
    indexes = self.ui.candidatesTable.selectedIndexes()
    # The view's model is the proxy, so these indexes are proxy indexes
    proxy_model = self.ui.candidatesTable.model()
    rows = set()
    for index in indexes:
        # Translate each proxy index into the corresponding source index
        source_index = proxy_model.mapToSource(index)
        rows.add(source_index.row())
    for row in rows:
        print(row)
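If you prefer to work with whole selections rather than individual indexes, QSortFilterProxyModel also inherits mapSelectionToSource() from QAbstractProxyModel, which maps an entire QItemSelection in one call. A sketch under the same hypothetical UI names as above:

from PySide2.QtCore import Slot, QItemSelection

@Slot(QItemSelection)
def handleSelectionChanged(self, item):
    proxy_model = self.ui.candidatesTable.model()
    # Fetch the view's full current selection and map it to the source model
    selection = self.ui.candidatesTable.selectionModel().selection()
    source_selection = proxy_model.mapSelectionToSource(selection)
    rows = {index.row() for index in source_selection.indexes()}
    for row in sorted(rows):
        print(row)

Note that the item argument delivered by selectionChanged contains only the items whose selection state just changed, which is why this sketch fetches the full selection from the selection model instead of mapping the argument directly.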