Tags: python, shutdown, gil

PyGILState_Ensure after Py_Finalize?


I am seeing a crash at Python shutdown from a complex C++-based extension module (https://github.com/woodem/woo). There are some held shared_ptr objects which are being destroyed without the GIL being taken automatically by the deleter (a known bug in boost::python). I added PyGILState_Ensure / PyGILState_Release (via a GilLock RAII wrapper, sketched below) around those calls so that these objects are destroyed properly. However, I am now seeing this:

    #0  0x00007ffff7bca1d4 in sem_wait@@GLIBC_2.2.5 () from /lib/x86_64-linux-gnu/libpthread.so.0
    #1  0x00000000005474c5 in PyThread_acquire_lock ()
    #2  0x0000000000539fb4 in ?? ()
    #3  0x000000000047b5ef in ?? ()
    #4  0x00007ffff18b8475 in GilLock::GilLock (this=0x7fffffffde10) at /home/eudoxos/build/woo/build-mt/dbg/include/woo/lib/pyutil/gil.hpp:9
    #5  0x00007ffff18b8869 in woo::AttrTraitBase::~AttrTraitBase (this=0x7ffff50b09c0 <GlFieldDispatcher::GlFieldDispatcher_getTrait_functors()::_tmp>, __in_chrg=<optimized out>) at /home/eudoxos/build/woo/build-mt/dbg/include/woo/lib/object/AttrTrait.hpp:43
    [...]

and I wonder whether the reason could be that I am trying to acquire the GIL at a point where Py_Finalize has already been called.

Is there a way to test whether Py_Finalize has already taken effect? I could not find anything in the API.
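For reference, the GilLock wrapper mentioned above is just a thin RAII guard around PyGILState_Ensure / PyGILState_Release, roughly like this sketch (illustrative only, not the exact code from gil.hpp):

    #include <Python.h>

    // RAII guard: take the GIL on construction, release it when the scope ends.
    // (Member name is illustrative; the real wrapper lives in woo/lib/pyutil/gil.hpp.)
    class GilLock{
        PyGILState_STATE state;
    public:
        GilLock(): state(PyGILState_Ensure()) {}
        ~GilLock(){ PyGILState_Release(state); }
    };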

EDIT: for completeness, this is the backtrace without holding the GIL first (see https://github.com/boostorg/python/pull/11 for details):

    #0  0x000000000059814b in ?? ()
    #1  0x00007ffff186e143 in boost::python::api::object_base::~object_base (this=0xe1e460, __in_chrg=<optimized out>) at /usr/include/boost/python/object_core.hpp:526
    #2  0x00007ffff186e074 in boost::python::api::object::~object (this=0xe1e460, __in_chrg=<optimized out>) at /usr/include/boost/python/object_core.hpp:318
    #3  0x00007ffff18b6e42 in boost::python::detail::tuple_base::~tuple_base (this=0xe1e460, __in_chrg=<optimized out>) at /usr/include/boost/python/tuple.hpp:19
    #4  0x00007ffff18b6e9c in boost::python::tuple::~tuple (this=0xe1e460, __in_chrg=<optimized out>) at /usr/include/boost/python/tuple.hpp:32
    #5  0x00007ffff2f09560 in woo::Plot::~Plot (this=0xe1e410, __in_chrg=<optimized out>) at /home/eudoxos/woo/core/Plot.cpp:9
    #6  0x00007ffff30f146c in boost::detail::sp_ms_deleter<woo::Plot>::destroy (this=0xe1e408) at /usr/include/boost/smart_ptr/make_shared_object.hpp:57
    #7  0x00007ffff3291ee6 in boost::detail::sp_ms_deleter<woo::Plot>::operator() (this=0xe1e408) at /usr/include/boost/smart_ptr/make_shared_object.hpp:87
    #8  0x00007ffff3238385 in boost::detail::sp_counted_impl_pd<woo::Plot*, boost::detail::sp_ms_deleter<woo::Plot> >::dispose (this=0xe1e3f0) at /usr/include/boost/smart_ptr/detail/sp_counted_impl.hpp:153
    #9  0x00007ffff186de5c in boost::detail::sp_counted_base::release (this=0xe1e3f0) at /usr/include/boost/smart_ptr/detail/sp_counted_base_gcc_x86.hpp:146
    #10 0x00007ffff186df1f in boost::detail::shared_count::~shared_count (this=0xe1dfb8, __in_chrg=<optimized out>) at /usr/include/boost/smart_ptr/detail/shared_count.hpp:371
    #11 0x00007ffff2fb494c in boost::shared_ptr<woo::Plot>::~shared_ptr (this=0xe1dfb0, __in_chrg=<optimized out>) at /usr/include/boost/smart_ptr/shared_ptr.hpp:328
    #12 0x00007ffff2fd4e8c in woo::AttrTrait<4>& woo::AttrTrait<4>::ini<boost::shared_ptr<woo::Plot> >(boost::shared_ptr<woo::Plot>)::{lambda()#1}::~ini() (this=0xe1dfb0, __in_chrg=<optimized out>) at /home/eudoxos/build/woo/build-mt/dbg/include/woo/lib/object/AttrTrait.hpp:264
    #13 0x00007ffff309a7b0 in std::_Function_base::_Base_manager<woo::AttrTrait<4>& woo::AttrTrait<4>::ini<boost::shared_ptr<woo::Plot> >(boost::shared_ptr<woo::Plot>)::{lambda()#1}>::_M_destroy(std::_Any_data&, std::integral_constant<bool, false>) (__victim=...) at /usr/include/c++/4.9/functional:1894
    #14 0x00007ffff3061bde in std::_Function_base::_Base_manager<woo::AttrTrait<4>& woo::AttrTrait<4>::ini<boost::shared_ptr<woo::Plot> >(boost::shared_ptr<woo::Plot>)::{lambda()#1}>::_M_manager(std::_Any_data&, std::_Function_base::_Base_manager<woo::AttrTrait<4>& woo::AttrTrait<4>::ini<boost::shared_ptr<woo::Plot> >(boost::shared_ptr<woo::Plot>)::{lambda()#1}> const&, std::_Manager_operation) (__dest=..., __source=..., __op=std::__destroy_functor) at /usr/include/c++/4.9/functional:1918
    #15 0x00007ffff18b5877 in std::_Function_base::~_Function_base (this=0x7fffffffdd90, __in_chrg=<optimized out>) at /usr/include/c++/4.9/functional:1998
    #16 0x00007ffff18b85bc in std::function<boost::python::api::object ()>::~function() (this=0x7fffffffdd90, __in_chrg=<optimized out>) at /usr/include/c++/4.9/functional:2142
    #17 0x00007ffff18bfc28 in std::function<boost::python::api::object ()>::operator=(std::function<boost::python::api::object ()> const&) (this=0x7ffff50da900 <Scene::Scene_getTrait_plot()::_tmp+256>, __x=...) at /usr/include/c++/4.9/functional:2243
    #18 0x00007ffff18b8559 in woo::AttrTraitBase::_resetInternalCallables (this=0x7ffff50da800 <Scene::Scene_getTrait_plot()::_tmp>) at /home/eudoxos/build/woo/build-mt/dbg/include/woo/lib/object/AttrTrait.hpp:36
    #19 0x00007ffff18b8854 in woo::AttrTraitBase::~AttrTraitBase (this=0x7ffff50da800 <Scene::Scene_getTrait_plot()::_tmp>, __in_chrg=<optimized out>) at /home/eudoxos/build/woo/build-mt/dbg/include/woo/lib/object/AttrTrait.hpp:45
    #20 0x00007ffff19a38e2 in woo::AttrTrait<4>::~AttrTrait (this=0x7ffff50da800 <Scene::Scene_getTrait_plot()::_tmp>, __in_chrg=<optimized out>) at /home/eudoxos/build/woo/build-mt/dbg/include/woo/lib/object/AttrTrait.hpp:186

Solution

  • The API for interpreter initialization/finalization includes Py_IsInitialized: it returns non-zero between Py_Initialize and Py_Finalize, and zero before Py_Initialize and after Py_Finalize.

    You'd have to test whether race conditions could mess you up here; it's entirely possible that you attempt to acquire the GIL, another thread calls Py_Finalize, and Py_Finalize blows away the lock from under you. There are some notes about the PyGILState_* APIs not handling multiple interpreters properly; they may or may not apply to your scenario (or hint at a similar issue that could lead to the speculative race I mentioned).
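    As a minimal sketch (assuming the GilLock wrapper from the question and a hypothetical cleanup function, not code from woo itself), the check could look like this:

        #include <Python.h>

        // Hypothetical destructor/deleter path: only touch Python state while the
        // interpreter is still alive; after Py_Finalize, deliberately leak instead
        // of trying to take the GIL (which is what hangs in the backtrace above).
        void destroyPythonMembers(){
            if(!Py_IsInitialized()) return; // interpreter already finalized
            GilLock lock;                   // PyGILState_Ensure / PyGILState_Release (RAII)
            // ... destroy / decref the held boost::python objects here ...
        }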