Apache Ozone / HDDS-2218

Use OZONE_CLASSPATH instead of HADOOP_CLASSPATH


Details

    • Type: Task
    • Status: Resolved
    • Priority: Major
    • Resolution: Done
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: docker

    Description

      HADOOP_CLASSPATH is the standard way to add additional jar files to the classpath of MapReduce/Spark/... jobs. If something is added to HADOOP_CLASSPATH, then it will also be on the classpath of the classic Hadoop daemons.

      But the Ozone components don't need any of these extra jar files (cloud connectors, libraries). I think it is safer to separate HADOOP_CLASSPATH from OZONE_CLASSPATH: if something is really needed on the classpath of the Ozone daemons, the dedicated environment variable should be used.

       

      Most probably it can be fixed in

      hadoop-hdds/common/src/main/bin/hadoop-functions.sh
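      As a rough sketch of what that separation could look like in the shell scripts (the function name below is hypothetical and only illustrates the idea; OZONE_CLASSPATH and CLASSPATH are the only names taken from this issue):

          #!/usr/bin/env bash
          # Hypothetical helper in the style of hadoop-functions.sh:
          # honour only the Ozone-specific variable for Ozone daemons,
          # and leave HADOOP_CLASSPATH to the classic Hadoop daemons.
          function ozone_add_user_classpath
          {
            if [[ -n "${OZONE_CLASSPATH}" ]]; then
              # Append the OZONE_CLASSPATH entries to the effective classpath.
              CLASSPATH="${CLASSPATH:+${CLASSPATH}:}${OZONE_CLASSPATH}"
              export CLASSPATH
            fi
          }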

      And the hadoop-ozone/dev/src/main/compose files should also be checked (some of them contain HADOOP_CLASSPATH).
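      In those compose environments the change would roughly mean replacing HADOOP_CLASSPATH entries with OZONE_CLASSPATH in the environment configuration; the excerpt below is only an illustration and the path is made up:

          # Illustrative environment/docker-config excerpt
          # before:
          #   HADOOP_CLASSPATH=/opt/hadoop/share/ozone/lib/extra/*
          # after:
          OZONE_CLASSPATH=/opt/hadoop/share/ozone/lib/extra/*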


            People

              Assignee: Sandeep Nemuri
              Reporter: Marton Elek
              Votes: 1
              Watchers: 3
