Oozie / OOZIE-3228

[Spark action] Can't load properties from spark-defaults.conf


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 5.0.0, 4.3.1
    • Fix Version/s: 5.1.0
    • Component/s: action
    • Labels: None

    Description

      When I create an Oozie workflow to launch a Spark action, the Spark job can't load the properties configured in spark-defaults.conf. I've configured each NodeManager with the Spark gateway role, so spark-defaults.conf is generated in /etc/spark/conf/ on each worker node.

      Some of the configuration I've set in spark-defaults.conf:

      spark.executor.extraClassPath=/etc/hbase/conf:/etc/hive/conf
      spark.driver.extraClassPath=/etc/hbase/conf:/etc/hive/conf
      

      But in the Oozie Spark job they're not loaded automatically; the job is launched with:

      --conf spark.executor.extraClassPath=$PWD/*
      --conf spark.driver.extraClassPath=$PWD/*
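
      One possible workaround while the fix is not in place is to pass the desired classpath entries explicitly to the Spark action via <spark-opts>, rather than relying on spark-defaults.conf being picked up. The sketch below is illustrative, not taken from this issue: the workflow name, parameter names, jar path, and main class are placeholders, and how Oozie merges these values with its own $PWD/* entry varies by Oozie version.

      ```xml
      <workflow-app xmlns="uri:oozie:workflow:0.5" name="spark-wf">
          <start to="spark-node"/>
          <action name="spark-node">
              <spark xmlns="uri:oozie:spark-action:0.2">
                  <job-tracker>${jobTracker}</job-tracker>
                  <name-node>${nameNode}</name-node>
                  <master>yarn</master>
                  <mode>cluster</mode>
                  <name>MySparkJob</name>
                  <class>com.example.MyMain</class>
                  <jar>${nameNode}/apps/my-spark-job.jar</jar>
                  <!-- Pass the extraClassPath entries explicitly instead of
                       relying on spark-defaults.conf (illustrative values) -->
                  <spark-opts>--conf spark.executor.extraClassPath=/etc/hbase/conf:/etc/hive/conf --conf spark.driver.extraClassPath=/etc/hbase/conf:/etc/hive/conf</spark-opts>
              </spark>
              <ok to="end"/>
              <error to="fail"/>
          </action>
          <kill name="fail">
              <message>Spark action failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
          </kill>
          <end name="end"/>
      </workflow-app>
      ```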
      

      Attachments

        1. OOZIE-3228.amend.001.patch
          5 kB
          Andras Piros
        2. OOZIE-3228.005.patch
          12 kB
          Andras Piros
        3. OOZIE-3228.004.patch
          10 kB
          Andras Piros
        4. OOZIE-3228.003.patch
          10 kB
          Andras Piros
        5. OOZIE-3228.002.patch
          9 kB
          Andras Piros
        6. OOZIE-3228.001.patch
          8 kB
          Andras Piros

        Issue Links

        Activity


          People

            Assignee: Andras Piros
            Reporter: Tang Yan
            Votes: 0
            Watchers: 5

            Dates

              Created:
              Updated:
              Resolved:
