AMBARI-24273: hadoop-env is not regenerated when OneFS is used as a FileSystem


    Description

The before-ANY/shared_initialization.py hook only regenerates hadoop-env when there is a NameNode or when dfs_type is set to HCFS:

        def hook(self, env):
          import params
          env.set_params(params)
      
          setup_users()
          if params.has_namenode or params.dfs_type == 'HCFS':
            setup_hadoop_env()
          setup_java()
      

This assumption no longer holds, because the latest ambari-server sets dfs_type as follows:

          Map<String, ServiceInfo> serviceInfos = ambariMetaInfo.getServices(stackId.getStackName(), stackId.getStackVersion());
          for (ServiceInfo serviceInfoInstance : serviceInfos.values()) {
            if (serviceInfoInstance.getServiceType() != null) {
              LOG.debug("Adding {} to command parameters for {}", serviceInfoInstance.getServiceType(),
                  serviceInfoInstance.getName());
      
              clusterLevelParams.put(DFS_TYPE, serviceInfoInstance.getServiceType());
              break;
            }
          }
      

This iterates over all of the stack's services and finds HDFS first, so dfs_type is set to HDFS instead of HCFS. Since a OneFS cluster has no NameNode either, the hook above skips setup_hadoop_env() and hadoop-env is never regenerated.
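
      One way to fix this, sketched below as an assumption rather than the committed patch, is to resolve DFS_TYPE from the services actually installed in the cluster instead of every service defined by the stack; on a OneFS cluster without HDFS this picks up ONEFS's HCFS service type. The cluster variable (org.apache.ambari.server.state.Cluster) is assumed to be in scope at this point.

          // Sketch only: derive DFS_TYPE from installed services, not all stack services.
          // Assumes "cluster" is the Cluster the command parameters are being built for.
          Map<String, ServiceInfo> serviceInfos = ambariMetaInfo.getServices(stackId.getStackName(), stackId.getStackVersion());
          for (String installedService : cluster.getServices().keySet()) {
            ServiceInfo serviceInfoInstance = serviceInfos.get(installedService);
            if (serviceInfoInstance != null && serviceInfoInstance.getServiceType() != null) {
              // The first installed service that declares a service type wins; on a
              // OneFS cluster this is ONEFS, whose declared type is HCFS.
              clusterLevelParams.put(DFS_TYPE, serviceInfoInstance.getServiceType());
              break;
            }
          }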

People

  Assignee: Attila Magyar (amagyar)
  Reporter: Attila Magyar (amagyar)


Time Tracking

  Original Estimate: Not Specified
  Remaining: 0h
  Time Spent: 1h