SPARK-31340: No call to destroy() for filter in SparkHistory


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.4.5
    • Fix Version/s: None
    • Component/s: Spark Core

    Description

      Adding the UI filter AuthenticationFilter (from Hadoop) causes the Spark application to never end, because threads created by this class are never interrupted.

      To reproduce

      Start a local Spark context with hadoop-auth 3.1.0 and the following configuration (a runnable sketch follows):
      spark.ui.enabled=true
      spark.ui.filters=org.apache.hadoop.security.authentication.server.AuthenticationFilter
      #and all required ldap props
      spark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.param.ldap.*=...
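
      A minimal, self-contained sketch of this repro in Scala (assuming hadoop-auth 3.1.0 is on the classpath; the ldap.* parameters are elided here just as above):

      import org.apache.spark.{SparkConf, SparkContext}

      object FilterLeakRepro {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf()
            .setMaster("local[*]")
            .setAppName("filter-leak-repro")
            .set("spark.ui.enabled", "true")
            .set("spark.ui.filters",
              "org.apache.hadoop.security.authentication.server.AuthenticationFilter")
          // ...plus the required
          // spark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.param.ldap.*
          // properties, elided as above

          val sc = new SparkContext(conf)
          sc.parallelize(1 to 10).count()
          sc.stop()
          // With the bug described below, the JVM does not exit here: the filter's
          // scheduler thread is still running because destroy() was never called.
        }
      }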

      What's happening:

      In AuthenticationFilter's init() we have the following call chain:

      (line 178) initializeSecretProvider(filterConfig);
      (line 209) secretProvider = constructSecretProvider(...)
      (line 237) provider.init(config, ctx, validity);

      If no secret provider is configured, the provider will be RolloverSignerSecretProvider, which (line 95) starts a new thread via
      scheduler = Executors.newSingleThreadScheduledExecutor();

      That thread is only stopped in the destroy() method (line 106).

      Unfortunately, destroy() is never called when SparkHistory is shut down, leaving the scheduler thread running.
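
      To illustrate the failure mode, here is a minimal sketch in Scala (the class name LeakyFilter is illustrative, not Hadoop's actual code) of a servlet Filter whose init() starts a non-daemon scheduler thread, following the same pattern as RolloverSignerSecretProvider. If destroy() is never called, that thread keeps the JVM alive after the application stops:

      import java.util.concurrent.{Executors, ScheduledExecutorService, TimeUnit}
      import javax.servlet.{Filter, FilterChain, FilterConfig, ServletRequest, ServletResponse}

      class LeakyFilter extends Filter {
        private var scheduler: ScheduledExecutorService = _

        override def init(conf: FilterConfig): Unit = {
          // Single-thread, non-daemon scheduled executor, as in
          // RolloverSignerSecretProvider (line 95).
          scheduler = Executors.newSingleThreadScheduledExecutor()
          scheduler.scheduleAtFixedRate(new Runnable {
            override def run(): Unit = () // periodic work, contents irrelevant here
          }, 0L, 1L, TimeUnit.HOURS)
        }

        override def doFilter(req: ServletRequest, res: ServletResponse, chain: FilterChain): Unit =
          chain.doFilter(req, res)

        // The only place the thread is stopped (cf. line 106); if the container
        // never calls destroy(), the executor thread outlives the application.
        override def destroy(): Unit =
          if (scheduler != null) scheduler.shutdownNow()
      }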

       

      This ticket is not meant to address the particular case of Hadoop's authentication filter, but to ensure that any Filter added through spark.ui.filters has its destroy() method called when the UI is stopped.
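
      As a sketch of one possible approach (hypothetical, not Spark's actual code; names like FilterRegistry are made up for illustration): the UI could keep track of every Filter instance it creates and invoke destroy() on each of them from its stop path:

      import javax.servlet.Filter
      import scala.collection.mutable.ArrayBuffer

      class FilterRegistry {
        private val filters = ArrayBuffer.empty[Filter]

        def register(f: Filter): Unit = synchronized { filters += f }

        // Called when the UI / history server stops, so every filter can release
        // the resources (threads, caches, connections) it acquired in init().
        def destroyAll(): Unit = synchronized {
          filters.foreach { f =>
            try f.destroy()
            catch { case _: Exception => () } // log and continue in real code
          }
          filters.clear()
        }
      }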



          People

            Assignee: Unassigned
            Reporter: thierry accart (taccart)
            Votes: 0
            Watchers: 3
