Sqoop (Retired) > SQOOP-3102

There are two issues in the sqoop_server_classpath_set function of sqoop.sh; sqoop2 fails to start when run as the same user as Hadoop.


Details

    • Type: Bug
    • Status: Patch Available
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: sqoop2-dist

    Description

      In start-all.sh, $HADOOP_HDFS_HOME and $HADOOP_YARN_HOME already default to $HADOOP_HOME, and that works well.

      # start hdfs daemons if hdfs is present
      if [ -f "${HADOOP_HDFS_HOME}"/sbin/start-dfs.sh ]; then
        "${HADOOP_HDFS_HOME}"/sbin/start-dfs.sh --config $HADOOP_CONF_DIR
      fi

      # start yarn daemons if yarn is present
      if [ -f "${HADOOP_YARN_HOME}"/sbin/start-yarn.sh ]; then
        "${HADOOP_YARN_HOME}"/sbin/start-yarn.sh --config $HADOOP_CONF_DIR
      fi

      When we execute sqoop.sh, we need both $HADOOP_HDFS_HOME and $HADOOP_YARN_HOME. If we set them in our environment to ${HADOOP_HOME}/share/hadoop/hdfs and ${HADOOP_HOME}/share/hadoop/yarn, then start-all.sh fails; if we set them to $HADOOP_HOME, then sqoop2 fails to start. The defaults are:
      HADOOP_COMMON_HOME=${HADOOP_COMMON_HOME:-${HADOOP_HOME}/share/hadoop/common}
      HADOOP_HDFS_HOME=${HADOOP_HDFS_HOME:-${HADOOP_HOME}/share/hadoop/hdfs}
      HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-${HADOOP_HOME}/share/hadoop/mapreduce}
      HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-${HADOOP_HOME}/share/hadoop/yarn}
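      The conflict can be reproduced with a minimal sketch (paths such as /opt/hadoop are illustrative assumptions, not taken from the issue): the `:-` expansion only supplies the default when the variable is unset or null, so once a value is exported for start-all.sh, sqoop.sh inherits it unchanged.

```shell
#!/bin/sh
# Minimal sketch of the defaulting conflict; /opt/hadoop is an assumed path.
HADOOP_HOME=/opt/hadoop

# Case 1: variable unset -> sqoop.sh's default points at the jar directory.
unset HADOOP_HDFS_HOME
HADOOP_HDFS_HOME=${HADOOP_HDFS_HOME:-${HADOOP_HOME}/share/hadoop/hdfs}
echo "sqoop.sh sees: $HADOOP_HDFS_HOME"

# Case 2: variable exported as $HADOOP_HOME for start-all.sh -> the ':-'
# default is skipped, and sqoop.sh looks for jars directly under
# $HADOOP_HOME, where none live.
HADOOP_HDFS_HOME=$HADOOP_HOME
HADOOP_HDFS_HOME=${HADOOP_HDFS_HOME:-${HADOOP_HOME}/share/hadoop/hdfs}
echo "sqoop.sh sees: $HADOOP_HDFS_HOME"
```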

      IMO, we can improve sqoop.sh as in my attached patch: just remove the validation that checks whether HADOOP_COMMON_HOME and HADOOP_YARN_HOME are null.
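      A hedged sketch of what such an improvement could look like (this is not the attached patch; the fallback paths are assumptions based on the standard Hadoop distribution layout): replace the hard is-null validation with the same `:-` fallback used elsewhere in the script.

```shell
#!/bin/sh
# Sketch: default the homes instead of validating them (assumed layout).
HADOOP_HOME=/opt/hadoop
unset HADOOP_COMMON_HOME HADOOP_YARN_HOME

# Before (the check the description proposes removing):
#   if [ -z "$HADOOP_COMMON_HOME" ] || [ -z "$HADOOP_YARN_HOME" ]; then
#     echo "HADOOP_COMMON_HOME / HADOOP_YARN_HOME must be set" >&2
#     exit 1
#   fi

# After: fall back to the standard distribution layout under HADOOP_HOME.
HADOOP_COMMON_HOME=${HADOOP_COMMON_HOME:-${HADOOP_HOME}/share/hadoop/common}
HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-${HADOOP_HOME}/share/hadoop/yarn}

echo "$HADOOP_COMMON_HOME"
echo "$HADOOP_YARN_HOME"
```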

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: peng.jianhua
            Votes: 0
            Watchers: 1
