  Traffic Server / TS-3265

core dump in spdy_prepare_status_response_and_clean_request


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 5.2.0
    • Fix Version/s: 5.2.0
    • Component/s: SPDY
    • Labels: None

    Description

      Noticed the core dump below in v5.2.0.

      (gdb) bt
      #0  0x0000000000638442 in std::vector<std::pair<std::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::basic_string<char, std::char_traits<char>, std::allocator<char> > > > >::size (
          this=0x38) at /usr/lib/gcc/x86_64-redhat-linux/4.4.7/../../../../include/c++/4.4.7/bits/stl_vector.h:533
      #1  0x000000000063667e in spdy_prepare_status_response_and_clean_request (sm=0x2ad800d4d730, stream_id=1, status=0x7b6b4d "500 Internal Server Error") at SpdyCallbacks.cc:57
      #2  0x000000000063373c in spdy_process_fetch (event=TS_EVENT_ERROR, sm=0x2ad800d4d730, edata=0x2ad818105c50) at SpdyClientSession.cc:350
      #3  0x0000000000633151 in SpdyClientSession::state_session_readwrite (this=0x2ad800d4d730, event=3, edata=0x2ad818105c50) at SpdyClientSession.cc:253
      #4  0x0000000000502eae in Continuation::handleEvent (this=0x2ad800d4d730, event=3, data=0x2ad818105c50) at ../iocore/eventsystem/I_Continuation.h:146
      #5  0x0000000000501116 in FetchSM::InvokePluginExt (this=0x2ad818105c50, fetch_event=3) at FetchSM.cc:245
      #6  0x0000000000501e59 in FetchSM::process_fetch_write (this=0x2ad818105c50, event=3) at FetchSM.cc:495
      #7  0x0000000000501f3e in FetchSM::fetch_handler (this=0x2ad818105c50, event=3, edata=0x2ad86c90b960) at FetchSM.cc:511
      #8  0x0000000000502eae in Continuation::handleEvent (this=0x2ad818105c50, event=3, data=0x2ad86c90b960) at ../iocore/eventsystem/I_Continuation.h:146
      #9  0x000000000053e9d5 in PluginVC::process_write_side (this=0x2ad86c90b810, other_side_call=false) at PluginVC.cc:519
      #10 0x000000000053dbcd in PluginVC::main_handler (this=0x2ad86c90b810, event=2, data=0x2ad80802c690) at PluginVC.cc:210
      #11 0x0000000000502eae in Continuation::handleEvent (this=0x2ad86c90b810, event=2, data=0x2ad80802c690) at ../iocore/eventsystem/I_Continuation.h:146
      #12 0x0000000000773136 in EThread::process_event (this=0x2ad6cc000010, e=0x2ad80802c690, calling_code=2) at UnixEThread.cc:144
      #13 0x0000000000773451 in EThread::execute (this=0x2ad6cc000010) at UnixEThread.cc:223
      #14 0x00000000007726f1 in spawn_thread_internal (a=0x161fd50) at Thread.cc:88
      #15 0x00002ad6c6d30851 in start_thread () from /lib64/libpthread.so.0
      #16 0x0000003296ee890d in clone () from /lib64/libc.so.6
      

      Looking at the core in gdb, the req object extracted from sm->req_map based on stream_id is null. However, the req object (retrieved via TSFetchUserDataGet) in the caller spdy_process_fetch at frame #2 is valid. Unfortunately, the core was deleted before I could examine more (e.g., the sm object's req_map or the req object's details).

      (gdb) frame 1
      #1  0x000000000063667e in spdy_prepare_status_response_and_clean_request (sm=0x2ad800d4d730, stream_id=1, status=0x7b6b4d "500 Internal Server Error") at SpdyCallbacks.cc:57
      57	SpdyCallbacks.cc: No such file or directory.
      	in SpdyCallbacks.cc
      (gdb) print req
      $1 = (SpdyRequest *) 0x0
      (gdb) up
      #2  0x000000000063373c in spdy_process_fetch (event=TS_EVENT_ERROR, sm=0x2ad800d4d730, edata=0x2ad818105c50) at SpdyClientSession.cc:350
      350	SpdyClientSession.cc: No such file or directory.
      	in SpdyClientSession.cc
      (gdb) print req
      $2 = (SpdyRequest *) 0x2ad7c4e8be90
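
      Frame #0 faults in vector::size with this=0x38, which is the usual signature of calling a member function through a null SpdyRequest* (0x38 being the vector member's offset inside the object). A minimal sketch of that failure mode and a defensive lookup, using hypothetical stand-in types — SpdySM, find_request, and header_count_or_zero are illustrative, not the real ATS definitions:

      ```cpp
      #include <cassert>
      #include <map>
      #include <string>
      #include <utility>
      #include <vector>

      // Hypothetical stand-ins for the types involved in the crash; the real
      // SpdyClientSession/SpdyRequest definitions differ.
      struct SpdyRequest {
        std::vector<std::pair<std::string, std::string>> headers;
      };

      struct SpdySM {
        std::map<int, SpdyRequest *> req_map;
      };

      // Unsafe pattern: req_map[stream_id] default-constructs a null
      // SpdyRequest* for an unknown stream_id, and a later member access
      // (e.g. headers.size()) then reads offsets off a null pointer.
      // Safe pattern: look the stream up with find() and bail out when absent.
      SpdyRequest *find_request(SpdySM *sm, int stream_id) {
        auto it = sm->req_map.find(stream_id);
        return (it == sm->req_map.end()) ? nullptr : it->second;
      }

      size_t header_count_or_zero(SpdySM *sm, int stream_id) {
        SpdyRequest *req = find_request(sm, stream_id);
        if (req == nullptr) {
          return 0; // request already cleaned up for this stream; nothing to do
        }
        return req->headers.size();
      }
      ```

      The point of using find() rather than operator[] is twofold: it avoids dereferencing a null entry, and it does not silently insert a phantom null value into req_map for a stream that was already cleaned up.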
      

          People

            Assignee: Sudheer Vinukonda (sudheerv)
            Reporter: Sudheer Vinukonda (sudheerv)