Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 3.8.1
- Fix Version/s: None
- Component/s: None
- Environment: windows
Description
The deadlock occurs while destroying an AMQ connection, after a broker connection has been established.
I added a breakpoint inside decaf::internal::util::concurrent::PlatformThread::interruptibleWaitOnCondition before termination, and before the deadlock occurs I can see the following threads:
  4396 Main Thread   PorterSvc      decaf::internal::util::concurrent::Threading::join Normal 0
  8252 Worker Thread Win32 Thread   776ef959 Normal 0
  5280 Worker Thread Win32 Thread   776f015d Normal 0
  8588 Worker Thread _threadstartex CtiLocalConnect<CtiOutMessage,INMESS>::CtiLocalConnectRead Normal 0
  4916 Worker Thread _threadstartex _CrtDefaultAllocHook Normal 0
> 5532 Worker Thread _threadstartex decaf::internal::util::concurrent::PlatformThread::interruptibleWaitOnCondition Normal 0
  1956 Worker Thread _threadstartex std::_Lockit::~_Lockit Normal 0
  5784 Worker Thread Win32 Thread   CtrlHandler Highest 0
  4512 Worker Thread Win32 Thread   776f1f46 Normal 0
  7180 Worker Thread Win32 Thread   776f1f46 Normal 0
  1668 Worker Thread Win32 Thread   776f1f46 Normal 0
At this moment, the call stack for thread 5532 is the following:
decaf::internal::util::concurrent::PlatformThread::interruptibleWaitOnCondition(void * condition=0x000007d8, _RTL_CRITICAL_SECTION * mutex=0x077dee10, __int64 mills=7659, int nanos=0, decaf::internal::util::concurrent::CompletionCondition & complete={...}) Line 305 + 0xe bytes C++
`anonymous namespace'::doWaitOnMonitor(decaf::internal::util::concurrent::MonitorHandle * monitor=0x0771acf8, decaf::internal::util::concurrent::ThreadHandle * thread=0x05f338a8, __int64 mills=7659, int nanos=0, bool interruptible=true) Line 752 + 0x23 bytes C++
decaf::internal::util::concurrent::Threading::waitOnMonitor(decaf::internal::util::concurrent::MonitorHandle * monitor=0x0771acf8, __int64 mills=7659, int nanos=0) Line 1558 + 0x1b bytes C++
decaf::util::concurrent::Mutex::wait(__int64 millisecs=7659, int nanos=0) Line 180 + 0x1a bytes C++
decaf::util::concurrent::Mutex::wait(__int64 millisecs=7659) Line 162 C++
decaf::internal::util::concurrent::SynchronizableImpl::wait(__int64 millisecs=7659) Line 54 C++
decaf::util::TimerImpl::run() Line 102 + 0x1f bytes C++
`anonymous namespace'::runCallback(void * arg=0x05f338a8) Line 266 + 0x11 bytes C++
`anonymous namespace'::threadEntryMethod(void * arg=0x05f338a8) Line 254 + 0x15 bytes C++
Afterwards, only the main thread is left and the deadlock occurs in interruptibleWaitOnCondition at "PlatformThread::unlockMutex(mutex);":
> 4396 Main Thread PorterSvc decaf::internal::util::concurrent::PlatformThread::interruptibleWaitOnCondition Normal 0

decaf::internal::util::concurrent::PlatformThread::interruptibleWaitOnCondition(void * condition=0x001293d0, _RTL_CRITICAL_SECTION * mutex=0x1f13f1e8, __int64 mills=600000, int nanos=0, decaf::internal::util::concurrent::CompletionCondition & complete={...}) Line 305 + 0xe bytes C++
decaf::internal::util::concurrent::Threading::join(decaf::internal::util::concurrent::ThreadHandle * thread=0x043eb370, __int64 mills=600000, int nanos=0) Line 1168 + 0x23 bytes C++
decaf::lang::Thread::join(__int64 millisecs=600000) Line 178 + 0x19 bytes C++
decaf::util::Timer::awaitTermination(__int64 timeout=10, const decaf::util::concurrent::TimeUnit & unit={...}) Line 236 C++
decaf::util::concurrent::ExecutorKernel::~ExecutorKernel() Line 419 C++
decaf::util::concurrent::ExecutorKernel::`scalar deleting destructor'() + 0xf bytes C++
decaf::util::concurrent::ThreadPoolExecutor::~ThreadPoolExecutor() Line 1481 + 0x1f bytes C++
decaf::util::concurrent::ThreadPoolExecutor::`vector deleting destructor'() + 0x4d bytes C++
decaf::lang::Pointer<decaf::util::concurrent::ExecutorService,decaf::util::concurrent::atomic::AtomicRefCounter>::onDeleteFunc(decaf::util::concurrent::ExecutorService * value=0x1f13b7f8) Line 317 + 0x20 bytes C++
decaf::lang::Pointer<decaf::util::concurrent::ExecutorService,decaf::util::concurrent::atomic::AtomicRefCounter>::~Pointer<decaf::util::concurrent::ExecutorService,decaf::util::concurrent::atomic::AtomicRefCounter>() Line 148 + 0xf bytes C++
activemq::core::ConnectionConfig::~ConnectionConfig() Line 303 + 0x1ba bytes C++
activemq::core::ConnectionConfig::`scalar deleting destructor'() + 0xf bytes C++
activemq::core::ActiveMQConnection::~ActiveMQConnection() Line 501 + 0x1f bytes C++
activemq::core::ActiveMQConnection::`vbase destructor'() + 0xf bytes C++
activemq::core::ActiveMQConnection::`vector deleting destructor'() + 0x4d bytes C++
If I look into the ThreadHandle object, I can see threadId 5532 and state == 4 (Thread::TIMED_WAITING); however, the thread at this point is not running:
- thread 0x06054520 {parent=0x060543c0 handle=0x000007bc mutex=0x062ad498 ...} decaf::internal::util::concurrent::ThreadHandle *
  + parent 0x060543c0 {heap={...} cancelled=true } decaf::lang::Thread *
    handle 0x000007bc void *
  + mutex 0x062ad498 {DebugInfo=0x00373658 LockCount=-2 RecursionCount=1 ...} _RTL_CRITICAL_SECTION *
    condition 0x000007b8 void *
    state 4 volatile int
    references 2 volatile int
    priority 5 int
    interrupted false bool
    interruptible true bool
    timerSet true bool
    canceled false bool
    unparked false bool
    parked false bool
    sleeping false bool
    waiting false bool
    notified true bool
    blocked true bool
    suspended false bool
  + name 0x062ad580 "Thread-2" char *
    stackSize 32768 __int64
  + tls 0x06054558 void * [384]
    threadMain 0x00ecbf7c `anonymous namespace'::runCallback(void *) void (void *)*
    threadArg 0x06054520 void *
    threadId 5532 __int64
    osThread false bool
  + interruptingThread 0x00000000 {parent=??? handle=??? mutex=??? ...} decaf::internal::util::concurrent::ThreadHandle *
    numAttached 0 int
  + next 0x00000000 {parent=??? handle=??? mutex=??? ...} decaf::internal::util::concurrent::ThreadHandle *
  + joiners 0x060e65c0 {parent=0x077a1d90 handle=0xfffffffe mutex=0x05fcf318 ...} decaf::internal::util::concurrent::ThreadHandle *
  + monitor 0x052e47e8 {name=0xcdcdcdcd <Bad Ptr> mutex=0x06bfed80 lock=0x06bff148 ...} decaf::internal::util::concurrent::MonitorHandle *
I'm not sure if this is a race condition or if the flags are not updated properly.