Hive / HIVE-2142

Jobs do not get killed even when they created too many files.
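
The bug is in HadoopJobExecHelper (see the original summary in the change history below): when the number of files created by a running job exceeds the configured limit, the job-monitoring code is supposed to treat that as a fatal error and kill the job, but it did not. A minimal sketch of the intended pattern is below; the class, method, and variable names are illustrative assumptions, not the actual HadoopJobExecHelper code.

    // Minimal sketch of the intended behaviour; names are illustrative assumptions,
    // not the actual HadoopJobExecHelper code.
    import java.io.IOException;

    import org.apache.hadoop.mapred.RunningJob;

    public class CreatedFilesGuard {

      private final long upperLimit;  // configured maximum number of files a job may create

      public CreatedFilesGuard(long upperLimit) {
        this.upperLimit = upperLimit;
      }

      /**
       * Checks the created-files count reported by the job's counters and kills
       * the job when the limit is exceeded. Returns true if the job was killed.
       */
      public boolean killIfTooManyFiles(RunningJob runningJob, long numFiles) throws IOException {
        if (numFiles > upperLimit) {
          System.err.println("total number of created files now is " + numFiles
              + ", which exceeds " + upperLimit + "; killing the job");
          runningJob.killJob();
          return true;
        }
        return false;
      }
    }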

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.8.0
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed
    Attachments

    1. HIVE-2142.1.patch (1 kB, He Yongqiang)
    2. HIVE-2142.2.patch (1 kB, He Yongqiang)

      Activity

      Transition                     Time In Source Status   Execution Times   Last Executer    Last Execution Date
      Open -> Patch Available        53m 27s                 1                 He Yongqiang     02/May/11 23:49
      Patch Available -> Resolved    6h 21m                  1                 Ning Zhang       03/May/11 06:11
      Resolved -> Closed             227d 18h 45m            1                 Carl Steinbach   16/Dec/11 23:56
      Carl Steinbach made changes -
        Status: Resolved -> Closed
      Ning Zhang made changes -
        Hadoop Flags: Reviewed
        Status: Patch Available -> Resolved
        Fix Version/s: 0.8.0
        Resolution: Fixed
      Ning Zhang added a comment -

      Committed. Thanks Yongqiang!

      Ning Zhang added a comment -

      +1

      He Yongqiang added a comment -

      opened follow up jira HIVE-2143 for testing.

      He Yongqiang made changes -
        Attachment: HIVE-2142.2.patch
      Ning Zhang added a comment -

      Please correct the error message syntax as follows (since it is a user-facing message):

      "total number of created files now is " + numFiles + ", exceeds ").append(upperLimit);

      should be changed to:

      "total number of created files now is " + numFiles + ", which exceeds ").append(upperLimit);

      Otherwise it looks fine.
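
      For context, the fragment quoted above reads like part of a StringBuilder-based message, so with the suggested wording the surrounding expression would look roughly like the sketch below; the variable names and values are illustrative, not taken from the patch itself.

      // Rough sketch only; names and values are illustrative, not taken from the patch.
      long numFiles = 120000L;     // value read from the job's created-files counter
      long upperLimit = 100000L;   // configured maximum number of created files
      StringBuilder errMsg = new StringBuilder(
          "total number of created files now is " + numFiles + ", which exceeds ").append(upperLimit);
      System.err.println(errMsg.toString());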

      He Yongqiang made changes -
        Summary: "HadoopJobExecHelper bug that cause jobs created too many files do not get killed." -> "Jobs do not get killed even when they created too many files."
      He Yongqiang made changes -
        Status: Open -> Patch Available
      He Yongqiang made changes -
        Attachment: HIVE-2142.1.patch
      He Yongqiang created issue -

        People

        • Assignee: He Yongqiang
        • Reporter: He Yongqiang
        • Votes: 0
        • Watchers: 0

          Dates

          • Created:
          • Updated:
          • Resolved:
