I have used abc in the past, and I'd like to use it again to enforce pure-virtual-like methods with @abstractmethod. This is in the context of a Python front end to an API which users will extend frequently.
It's a bit too complicated for me to build a reliably comprehensive scale test, and I've always used abc as a closed box of black magic, so I don't know where the cost of the abstraction and of the abstract-method checks lies, when it's likely to be incurred, how large it would actually be, or what it would scale with.
I couldn't find satisfactorily complete information about the underlying mechanics anywhere, so any pointers to when and where the magic happens, and at what cost, would be immensely appreciated. (Import? Instantiation? Is the cost paid twice if the instance is extended?)
Some further info about the use case: unlike in my previous use cases, where there was a very limited number of instances of each base object and abc added no perceivable overhead, this time around it would be for something (nodes in a DAG with a tree view) that can be instantiated and then extended in place hundreds of times, and the number of virtual methods is likely to go up to somewhere around a dozen per class.
Inheritance is never multiple, and it's generally quite shallow: at most two or three levels deep, the majority of the time just one.
Python 2.7, due to third-party platform constraints.
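For concreteness, the pattern in question looks roughly like the sketch below (class and method names are invented for illustration). It uses the cross-version spelling of the ABCMeta metaclass so it runs unchanged on 2.7 and 3.x; on 2.7 alone you could instead write `__metaclass__ = ABCMeta` in the class body.

```python
from abc import ABCMeta, abstractmethod

# Cross-version way to give a class ABCMeta as its metaclass.
NodeBase = ABCMeta('NodeBase', (object,), {})

class Node(NodeBase):
    """Hypothetical DAG-node base with pure-virtual-like methods."""

    @abstractmethod
    def evaluate(self):
        """Compute this node's value."""

    @abstractmethod
    def children(self):
        """Return this node's child nodes."""

class ConstNode(Node):
    """Concrete subclass implementing every abstract method."""

    def __init__(self, value):
        self.value = value

    def evaluate(self):
        return self.value

    def children(self):
        return []

ConstNode(42).evaluate()   # fine: all abstract methods implemented
try:
    Node()                 # TypeError: can't instantiate abstract class
except TypeError:
    pass
```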
Prior to Python 2.6, using ABCs came with some significant overhead. Issue 1762 reported this as a bug, and it was fixed for Python 2.6 by moving some of the ABC machinery into the C implementation of object.
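To make the split concrete, here is a small sketch (class names invented) showing the two places the machinery acts: ABCMeta collects the unimplemented abstract methods into the class's __abstractmethods__ frozenset once, at class-definition time, and object's C-level instantiation code consults that frozenset every time the class is called.

```python
from abc import ABCMeta, abstractmethod

# Cross-version spelling of "metaclass=ABCMeta".
Base = ABCMeta('Base', (object,), {})

class Shape(Base):
    @abstractmethod
    def area(self):
        """Pure-virtual-like method."""

# Paid once, at class creation: ABCMeta.__new__ records which
# abstract methods remain unimplemented.
print(Shape.__abstractmethods__)    # frozenset containing 'area'

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side * self.side

print(Square.__abstractmethods__)   # empty frozenset: nothing left abstract

# Paid at every instantiation, in C: a cheap check that
# cls.__abstractmethods__ is empty before the instance is created.
Square(2)
try:
    Shape()                         # fails the check, raises TypeError
except TypeError:
    pass
```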
In newer versions of Python, there should be very little difference in performance between classes that use ABCs and classes that don't (the issue mentions a very small remaining difference in the speed of isinstance checks, but essentially zero difference for other operations).
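A quick way to verify this on your own interpreter is a timeit comparison of instantiation cost (a sketch; the class names are made up and absolute numbers will vary by machine and Python build):

```python
import timeit

setup = """
from abc import ABCMeta, abstractmethod

Base = ABCMeta('Base', (object,), {})

class Virtual(Base):
    @abstractmethod
    def f(self):
        pass

class Impl(Virtual):
    def f(self):
        return 1

class Plain(object):
    def f(self):
        return 1
"""

times = {}
for name in ('Impl', 'Plain'):
    # Measure the cost of creating instances; since the per-instantiation
    # ABC check is just an emptiness test on a frozenset done in C, the
    # two timings should come out very close.
    times[name] = timeit.timeit('%s()' % name, setup=setup, number=100000)
    print('%s: %.4f s' % (name, times[name]))
```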