SPARK-35027: Close the inputStream in FileAppender when writing the logs fails


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.1.1
    • Fix Version/s: 3.2.0, 3.1.3, 3.0.4
    • Component/s: Spark Core
    • Labels: None

    Description

      In a Spark cluster, the ExecutorRunner uses FileAppender to redirect the stdout/stderr of executors to a file. When a write fails for some reason (for example, the disk is full), the FileAppender closes only the stream to the file but leaves the pipe's stdout/stderr open, so subsequent write operations on the executor side may hang.

      Should we also close the inputStream in FileAppender?
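      The pattern described above can be sketched as follows. This is a minimal, hypothetical illustration in Java, not Spark's actual FileAppender code: a copier thread drains an InputStream (such as a child process's stdout pipe) into an output stream, and the key point from this issue is that the InputStream must also be closed in the cleanup path. If only the output side is closed after a write failure, the pipe's read end stays open, its buffer fills up, and the writing process eventually blocks. The class and method names here are assumptions for the sketch.

```java
import java.io.Closeable;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical sketch of an appender thread (not Spark's FileAppender):
// copies bytes from `in` (e.g. a process's stdout pipe) to `out` (e.g. a
// log file) until EOF or an I/O error.
class StreamAppender implements Runnable {
    private final InputStream in;
    private final OutputStream out;

    StreamAppender(InputStream in, OutputStream out) {
        this.in = in;
        this.out = out;
    }

    @Override
    public void run() {
        byte[] buf = new byte[8192];
        try {
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);   // may throw, e.g. when the disk is full
            }
        } catch (IOException e) {
            // Reading or writing failed; fall through to cleanup.
        } finally {
            closeQuietly(out);
            closeQuietly(in);  // the fix: also release the pipe's read end,
                               // otherwise the writer side can hang on a full pipe
        }
    }

    private static void closeQuietly(Closeable c) {
        try {
            c.close();
        } catch (IOException ignored) {
        }
    }
}
```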


          People

            jhu Jack Hu
            jhu Jack Hu
            Votes:
            0 Vote for this issue
            Watchers:
            3 Start watching this issue
