Ambari / AMBARI-6315

ATS (Application Timeline Server) failed to start

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.6.1
    • Fix Version/s: 1.6.1
    • Component/s: None
    • Labels: None
    • Environment: CentOS 6.4, 150 Node Cluster

    Description

      stderr: 
      2014-06-28 19:34:06,112 - Error while executing command 'start':
      Traceback (most recent call last):
        File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
          method(env)
        File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/application_timeline_server.py", line 41, in start
          self.configure(env) # FOR SECURITY
        File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/application_timeline_server.py", line 36, in configure
          yarn(name='apptimelineserver')
        File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/yarn.py", line 120, in yarn
          group=params.user_group
        File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
          self.env.run()
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
          self.run_action(resource, action)
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
          provider_action()
        File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 150, in action_create
          raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
      Fail: Applying Directory['/grid/0/hadoop/yarn/timeline'] failed, parent directory /grid/0/hadoop/yarn doesn't exist
       stdout:
      2014-06-28 19:34:05,738 - Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf --retry 10     http://horton-master-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
      2014-06-28 19:34:05,750 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf --retry 10     http://horton-master-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
      2014-06-28 19:34:05,841 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
      2014-06-28 19:34:05,842 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
      2014-06-28 19:34:05,854 - Skipping Link['/etc/hadoop/conf'] due to not_if
      2014-06-28 19:34:05,866 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': Template('hadoop-env.sh.j2'), 'owner': 'hdfs'}
      2014-06-28 19:34:05,867 - XmlConfig['core-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/etc/hadoop/conf', 'configurations': ...}
      2014-06-28 19:34:05,872 - Generating config: /etc/hadoop/conf/core-site.xml
      2014-06-28 19:34:05,872 - File['/etc/hadoop/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None}
      2014-06-28 19:34:05,873 - Writing File['/etc/hadoop/conf/core-site.xml'] because contents don't match
      2014-06-28 19:34:05,884 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
      2014-06-28 19:34:05,914 - Execute['mkdir -p /usr/lib/hadoop/lib/native/Linux-i386-32; ln -sf /usr/lib/libsnappy.so /usr/lib/hadoop/lib/native/Linux-i386-32/libsnappy.so'] {}
      2014-06-28 19:34:05,927 - Execute['mkdir -p /usr/lib/hadoop/lib/native/Linux-amd64-64; ln -sf /usr/lib64/libsnappy.so /usr/lib/hadoop/lib/native/Linux-amd64-64/libsnappy.so'] {}
      2014-06-28 19:34:05,940 - Directory['/var/log/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
      2014-06-28 19:34:05,940 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
      2014-06-28 19:34:05,941 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True}
      2014-06-28 19:34:05,945 - File['/etc/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
      2014-06-28 19:34:05,947 - File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner': 'hdfs'}
      2014-06-28 19:34:05,947 - File['/etc/hadoop/conf/log4j.properties'] {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
      2014-06-28 19:34:05,952 - File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
      2014-06-28 19:34:05,952 - File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
      2014-06-28 19:34:05,953 - File['/etc/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
      2014-06-28 19:34:06,083 - Directory['/var/run/hadoop-yarn/yarn'] {'owner': 'yarn', 'group': 'hadoop', 'recursive': True}
      2014-06-28 19:34:06,084 - Directory['/var/log/hadoop-yarn/yarn'] {'owner': 'yarn', 'group': 'hadoop', 'recursive': True}
      2014-06-28 19:34:06,085 - Directory['/var/run/hadoop-mapreduce/mapred'] {'owner': 'mapred', 'group': 'hadoop', 'recursive': True}
      2014-06-28 19:34:06,085 - Directory['/var/log/hadoop-mapreduce/mapred'] {'owner': 'mapred', 'group': 'hadoop', 'recursive': True}
      2014-06-28 19:34:06,086 - Directory['/hadoop/yarn/local'] {'owner': 'yarn', 'ignore_failures': True, 'recursive': True}
      2014-06-28 19:34:06,086 - Directory['/hadoop/yarn/log'] {'owner': 'yarn', 'ignore_failures': True, 'recursive': True}
      2014-06-28 19:34:06,086 - Directory['/var/log/hadoop-yarn'] {'owner': 'yarn', 'ignore_failures': True, 'recursive': True}
      2014-06-28 19:34:06,086 - XmlConfig['core-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hadoop/conf', 'configurations': ...}
      2014-06-28 19:34:06,093 - Generating config: /etc/hadoop/conf/core-site.xml
      2014-06-28 19:34:06,093 - File['/etc/hadoop/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644}
      2014-06-28 19:34:06,094 - Writing File['/etc/hadoop/conf/core-site.xml'] because contents don't match
      2014-06-28 19:34:06,094 - XmlConfig['mapred-site.xml'] {'owner': 'yarn', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hadoop/conf', 'configurations': ...}
      2014-06-28 19:34:06,098 - Generating config: /etc/hadoop/conf/mapred-site.xml
      2014-06-28 19:34:06,098 - File['/etc/hadoop/conf/mapred-site.xml'] {'owner': 'yarn', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644}
      2014-06-28 19:34:06,099 - Writing File['/etc/hadoop/conf/mapred-site.xml'] because contents don't match
      2014-06-28 19:34:06,099 - XmlConfig['yarn-site.xml'] {'owner': 'yarn', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hadoop/conf', 'configurations': ...}
      2014-06-28 19:34:06,104 - Generating config: /etc/hadoop/conf/yarn-site.xml
      2014-06-28 19:34:06,105 - File['/etc/hadoop/conf/yarn-site.xml'] {'owner': 'yarn', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644}
      2014-06-28 19:34:06,105 - Writing File['/etc/hadoop/conf/yarn-site.xml'] because contents don't match
      2014-06-28 19:34:06,106 - XmlConfig['capacity-scheduler.xml'] {'owner': 'yarn', 'group': 'hadoop', 'mode': 0644, 'conf_dir': '/etc/hadoop/conf', 'configurations': ...}
      2014-06-28 19:34:06,109 - Generating config: /etc/hadoop/conf/capacity-scheduler.xml
      2014-06-28 19:34:06,110 - File['/etc/hadoop/conf/capacity-scheduler.xml'] {'owner': 'yarn', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644}
      2014-06-28 19:34:06,111 - Writing File['/etc/hadoop/conf/capacity-scheduler.xml'] because contents don't match
      2014-06-28 19:34:06,111 - Directory['/grid/0/hadoop/yarn/timeline'] {'owner': 'yarn', 'group': 'hadoop'}
      2014-06-28 19:34:06,111 - Creating directory Directory['/grid/0/hadoop/yarn/timeline']
      2014-06-28 19:34:06,112 - Error while executing command 'start':
      Traceback (most recent call last):
        File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
          method(env)
        File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/application_timeline_server.py", line 41, in start
          self.configure(env) # FOR SECURITY
        File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/application_timeline_server.py", line 36, in configure
          yarn(name='apptimelineserver')
        File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/yarn.py", line 120, in yarn
          group=params.user_group
        File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
          self.env.run()
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
          self.run_action(resource, action)
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
          provider_action()
        File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 150, in action_create
          raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
      Fail: Applying Directory['/grid/0/hadoop/yarn/timeline'] failed, parent directory /grid/0/hadoop/yarn doesn't exist
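The stdout log above shows the cause: every other Directory resource that may need missing parents is declared with `'recursive': True` (the `mkdir -p` behavior), but `Directory['/grid/0/hadoop/yarn/timeline']` at 19:34:06,111 is declared with only `owner` and `group`, so the provider's `action_create` refuses to create it when the parent `/grid/0/hadoop/yarn` is absent. A minimal sketch of that provider behavior, assuming a simplified stand-in function rather than the actual `resource_management` code:

```python
import os
import tempfile

def action_create(path, recursive=False):
    """Simplified stand-in for the Directory provider's action_create."""
    if recursive:
        # Equivalent of `mkdir -p`: create any missing parent directories.
        os.makedirs(path)
    else:
        parent = os.path.dirname(path)
        if not os.path.isdir(parent):
            # Mirrors the Fail raised in providers/system.py line 150.
            raise Exception(
                "Applying Directory['%s'] failed, parent directory %s "
                "doesn't exist" % (path, parent))
        os.mkdir(path)

base = tempfile.mkdtemp()
target = os.path.join(base, "hadoop", "yarn", "timeline")

# Without recursive=True the parent .../hadoop/yarn is missing, so this fails,
# just like the ATS start in this report.
try:
    action_create(target)
except Exception as e:
    print("Fail:", e)

# Declaring the resource with recursive=True creates the parents and succeeds.
action_create(target, recursive=True)
print(os.path.isdir(target))
```

The corresponding fix in `yarn.py` would be to add `recursive=True` to that Directory declaration, matching the other Directory resources in the same log.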
      


People

    Assignee: Jonathan Hurley
    Reporter: Jonathan Hurley
