Ambari / AMBARI-21988

[Hive Metastore] Can't start the Hive Metastore service; exception occurs in Python file "logger.py"


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 2.4.2
    • Fix Version/s: 2.4.2
    • Component/s: None
      stderr:
      Traceback (most recent call last):
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 259, in <module>
          HiveMetastore().execute()
        File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
          method(env)
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 59, in start
          self.configure(env)
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 73, in configure
          hive(name = 'metastore')
        File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
          return fn(*args, **kwargs)
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 320, in hive
          user = params.hive_user
        File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
          self.env.run()
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
          self.run_action(resource, action)
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
          provider_action()
        File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
          tries=self.resource.tries, try_sleep=self.resource.try_sleep)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
          result = function(command, **kwargs)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
          tries=tries, try_sleep=try_sleep)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
          result = _call(command, **kwargs_copy)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 292, in _call
          err_msg = Logger.filter_text("Execution of '{0}' returned {1}. {2}".format(command_alias, code, all_output))
        File "/usr/lib/python2.6/site-packages/resource_management/core/logger.py", line 101, in filter_text
          text = text.replace(unprotected_string, protected_string)
      UnicodeDecodeError: 'ascii' codec can't decode byte 0xe8 in position 984: ordinal not in range(128)
       stdout:
      2017-09-19 15:32:41,538 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
      2017-09-19 15:32:41,538 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
      2017-09-19 15:32:41,539 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
      2017-09-19 15:32:41,584 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
      2017-09-19 15:32:41,585 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
      2017-09-19 15:32:41,630 - checked_call returned (0, '')
      2017-09-19 15:32:41,632 - Ensuring that hadoop has the correct symlink structure
      2017-09-19 15:32:41,632 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
      2017-09-19 15:32:41,830 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
      2017-09-19 15:32:41,831 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
      2017-09-19 15:32:41,831 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
      2017-09-19 15:32:41,876 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
      2017-09-19 15:32:41,876 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
      2017-09-19 15:32:41,907 - checked_call returned (0, '')
      2017-09-19 15:32:41,908 - Ensuring that hadoop has the correct symlink structure
      2017-09-19 15:32:41,908 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
      2017-09-19 15:32:41,910 - Group['hadoop'] {}
      2017-09-19 15:32:41,911 - Group['users'] {}
      2017-09-19 15:32:41,912 - Group['spark'] {}
      2017-09-19 15:32:41,912 - Group['livy'] {}
      2017-09-19 15:32:41,912 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,913 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
      2017-09-19 15:32:41,914 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,915 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,916 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,916 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,918 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,919 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,920 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
      2017-09-19 15:32:41,921 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,923 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,924 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,925 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
      2017-09-19 15:32:41,927 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2017-09-19 15:32:41,930 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
      2017-09-19 15:32:41,939 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
      2017-09-19 15:32:41,939 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
      2017-09-19 15:32:41,944 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2017-09-19 15:32:41,947 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
      2017-09-19 15:32:41,955 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
      2017-09-19 15:32:41,955 - Group['hdfs'] {}
      2017-09-19 15:32:41,956 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
      2017-09-19 15:32:41,957 - FS Type:
      2017-09-19 15:32:41,958 - Directory['/etc/hadoop'] {'mode': 0755}
      2017-09-19 15:32:41,985 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
      2017-09-19 15:32:41,986 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
      2017-09-19 15:32:42,011 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
      2017-09-19 15:32:42,039 - Directory['/opt/disk1/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
      2017-09-19 15:32:42,041 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
      2017-09-19 15:32:42,042 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
      2017-09-19 15:32:42,050 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
      2017-09-19 15:32:42,054 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
      2017-09-19 15:32:42,055 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
      2017-09-19 15:32:42,078 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
      2017-09-19 15:32:42,080 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
      2017-09-19 15:32:42,082 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
      2017-09-19 15:32:42,091 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
      2017-09-19 15:32:42,097 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
      2017-09-19 15:32:42,441 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
      2017-09-19 15:32:42,441 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
      2017-09-19 15:32:42,442 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
      2017-09-19 15:32:42,486 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
      2017-09-19 15:32:42,487 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
      2017-09-19 15:32:42,531 - checked_call returned (0, '')
      2017-09-19 15:32:42,532 - Ensuring that hadoop has the correct symlink structure
      2017-09-19 15:32:42,532 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
      2017-09-19 15:32:42,544 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
      2017-09-19 15:32:42,588 - call returned (0, 'hive-server2 - 2.5.3.0-37')
      2017-09-19 15:32:42,589 - Stack Feature Version Info: stack_version=2.5, version=2.5.3.0-37, current_cluster_version=2.5.3.0-37 -> 2.5.3.0-37
      2017-09-19 15:32:42,608 - Directory['/etc/hive'] {'mode': 0755}
      2017-09-19 15:32:42,611 - Directories to fill with configs: ['/usr/hdp/current/hive-metastore/conf', '/usr/hdp/current/hive-metastore/conf/conf.server']
      2017-09-19 15:32:42,612 - Directory['/usr/hdp/current/hive-metastore/conf'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True}
      2017-09-19 15:32:42,613 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf', 'mode': 0644, 'configuration_attributes': {'final': {'mapreduce.reduce.memory.mb': 'true'}}, 'owner': 'hive', 'configurations': ...}
      2017-09-19 15:32:42,636 - Generating config: /usr/hdp/current/hive-metastore/conf/mapred-site.xml
      2017-09-19 15:32:42,637 - File['/usr/hdp/current/hive-metastore/conf/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
      2017-09-19 15:32:42,727 - File['/usr/hdp/current/hive-metastore/conf/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
      2017-09-19 15:32:42,728 - File['/usr/hdp/current/hive-metastore/conf/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
      2017-09-19 15:32:42,729 - File['/usr/hdp/current/hive-metastore/conf/hive-exec-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-09-19 15:32:42,730 - File['/usr/hdp/current/hive-metastore/conf/hive-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-09-19 15:32:42,731 - Directory['/usr/hdp/current/hive-metastore/conf/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True}
      2017-09-19 15:32:42,732 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0644, 'configuration_attributes': {'final': {'mapreduce.reduce.memory.mb': 'true'}}, 'owner': 'hive', 'configurations': ...}
      2017-09-19 15:32:42,749 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/mapred-site.xml
      2017-09-19 15:32:42,750 - File['/usr/hdp/current/hive-metastore/conf/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
      2017-09-19 15:32:42,819 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
      2017-09-19 15:32:42,819 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
      2017-09-19 15:32:42,820 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-exec-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-09-19 15:32:42,821 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-log4j.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-09-19 15:32:42,821 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0644, 'configuration_attributes': {'hidden': {'javax.jdo.option.ConnectionPassword': 'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
      2017-09-19 15:32:42,831 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml
      2017-09-19 15:32:42,831 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
      2017-09-19 15:32:42,987 - XmlConfig['hivemetastore-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {'hive.service.metrics.reporter': 'JSON_FILE, JMX, HADOOP2', 'hive.metastore.metrics.enabled': 'true', 'hive.service.metrics.file.location': '/var/log/hive/hivemetastore-report.json', 'hive.service.metrics.hadoop2.component': 'hivemetastore'}}
      2017-09-19 15:32:42,996 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml
      2017-09-19 15:32:42,996 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
      2017-09-19 15:32:43,005 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}
      2017-09-19 15:32:43,006 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh'] because contents don't match
      2017-09-19 15:32:43,006 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
      2017-09-19 15:32:43,010 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
      2017-09-19 15:32:43,011 - File['/usr/lib/ambari-agent/DBConnectionVerification.jar'] {'content': DownloadSource('http://wjw-syjg-01:8080/resources/DBConnectionVerification.jar'), 'mode': 0644}
      2017-09-19 15:32:43,011 - Not downloading the file from http://wjw-syjg-01:8080/resources/DBConnectionVerification.jar, because /var/lib/ambari-agent/tmp/DBConnectionVerification.jar already exists
      2017-09-19 15:32:43,016 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hadoop-metrics2-hivemetastore.properties'] {'content': Template('hadoop-metrics2-hivemetastore.properties.j2'), 'owner': 'hive', 'group': 'hadoop'}
      2017-09-19 15:32:43,017 - File['/var/lib/ambari-agent/tmp/start_metastore_script'] {'content': StaticFile('startMetastore.sh'), 'mode': 0755}
      2017-09-19 15:32:43,018 - Execute['export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -initSchema -dbType postgres -userName hive -passWord [PROTECTED] -verbose'] {'not_if': u"ambari-sudo.sh su hive -l -s /bin/bash -c 'export HIVE_CONF_DIR=/usr/hdp/current/hive-metastore/conf/conf.server ; /usr/hdp/current/hive-server2-hive2/bin/schematool -info -dbType postgres -userName hive -passWord [PROTECTED] -verbose'", 'user': 'hive'}

      Command failed after 1 tries

    Description

      1. Start the Ambari server.
      2. Install the Hive service.
      3. Create a Hive user 'hive' with password 'hive' in the PostgreSQL database.
      4. Create the Hive metastore database 'hive'.
      5. Grant all privileges on it to user 'hive'.
      6. Modify the pg_hba.conf file and add the lines below:
      local all all trust
      host all all 127.0.0.1/32 trust
      host all all ::1/128 trust
      7. Configure Hive.
      8. Try to start Hive.
      9. The Metastore service can't start.

      Result:

      The following exception occurred:
      File "/usr/lib/python2.6/site-packages/resource_management/core/logger.py", line 101, in filter_text
      text = text.replace(unprotected_string, protected_string)
      UnicodeDecodeError: 'ascii' codec can't decode byte 0xe8 in position 984: ordinal not in range(128)
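
      The root cause appears to be that Logger.filter_text() receives the command output as a plain byte string while the password replacement strings are unicode objects; under Python 2, str.replace() then implicitly decodes the output with the ascii codec and fails on the first non-ASCII byte (0xe8, most likely the first byte of a multi-byte UTF-8 character in a localized error message). The following minimal Python 2 sketch reproduces the failure and shows one defensive workaround (explicit decoding); the literal strings and the workaround are illustrative assumptions, not the actual Ambari patch.

          # -*- coding: utf-8 -*-
          # Minimal Python 2 sketch (assumed strings, not Ambari's real values).
          all_output = "FATAL: \xe8\xaf\xaf ..."   # byte str containing a UTF-8 multi-byte char (first byte 0xe8)
          unprotected_string = u"hive"             # the password, a unicode object
          protected_string = u"[PROTECTED]"

          try:
              # str.replace() with unicode arguments implicitly decodes all_output as ascii.
              all_output.replace(unprotected_string, protected_string)
          except UnicodeDecodeError as e:
              print("reproduced: %s" % e)          # 'ascii' codec can't decode byte 0xe8 ...

          # Assumed workaround: decode the byte string explicitly before substituting,
          # so no implicit ascii decode can happen.
          filtered = all_output.decode("utf-8", "replace").replace(unprotected_string, protected_string)
          print(filtered.encode("utf-8"))

      The failure mode is the same wherever the non-ASCII bytes come from (schematool, PostgreSQL, or the shell), so filter_text() cannot assume ASCII-only command output.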


          People

            Assignee: Unassigned
            Reporter: kai zhao (WanderingEachDay)
            Votes: 0
            Watchers: 1
