  Ambari / AMBARI-22631

Hive MetaStore does not start after passing the mysql-connector path to ambari-server


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.6.1
    • Fix Version/s: 2.6.1
    • Component/s: ambari-server
    • Labels: None

    Description

      STR:

      1. Install ambari-server (with embedded DB) and set up hosts
      2. Initiate installation of the HDP stack and follow the wizard to the Configure Services page
      3. In the Hive Config tab, follow the steps in the info box to install mysql-connector.
      4. Continue with the steps and initiate deployment of HDP

      Even after executing the following command (as per the info box), Hive Metastore does not start:

      [root@sj-qe16187-6 ~]# ambari-server setup --jdbc-db=mysql --jdbc-driver=/root/mysql-connector-java.jar
      Using python  /usr/bin/python
      Setup ambari-server
      Copying /root/mysql-connector-java.jar to /var/lib/ambari-server/resources
      If you are updating existing jdbc driver jar for mysql with mysql-connector-java.jar. Please remove the old driver jar, from all hosts. Restarting services that need the driver, will automatically copy the new jar to the hosts.
      JDBC driver was successfully initialized.
      Ambari Server 'setup' completed successfully.
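
      A quick way to see the mismatch (hypothetical checks; the paths are taken from the setup output above and the error below):

      # On the ambari-server host: this is where 'ambari-server setup' actually places the jar
      ls -l /var/lib/ambari-server/resources/mysql-connector-java.jar
      # On the Hive Metastore host: this is where the Hive scripts later try to copy it from
      ls -l /usr/share/java/mysql-connector-java.jar   # -> No such file or directory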
      

      Hive Metastore (on host 2) cannot find mysql-connector-java.jar in /usr/share/java:

      stderr:   /var/lib/ambari-agent/data/errors-202.txt
      
      Traceback (most recent call last):
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 203, in <module>
          HiveMetastore().execute()
        File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
          method(env)
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 54, in start
          self.configure(env)
        File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 120, in locking_configure
          original_configure(obj, *args, **kw)
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 72, in configure
          hive(name = 'metastore')
        File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
          return fn(*args, **kwargs)
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 310, in hive
          jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
        File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 555, in jdbc_connector
          sudo=True
        File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
          self.env.run()
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
          self.run_action(resource, action)
        File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
          provider_action()
        File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
          tries=self.resource.tries, try_sleep=self.resource.try_sleep)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
          result = function(command, **kwargs)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
          tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
          result = _call(command, **kwargs_copy)
        File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
          raise ExecutionFailed(err_msg, code, out, err)
      resource_management.core.exceptions.ExecutionFailed: Execution of 'cp --remove-destination /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar' returned 1. cp: cannot stat `/usr/share/java/mysql-connector-java.jar': No such file or directory
      stdout:   /var/lib/ambari-agent/data/output-202.txt
      
      2017-12-11 13:01:52,206 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-70 -> 2.6.4.0-70
      2017-12-11 13:01:52,210 - Using hadoop conf dir: /usr/hdp/2.6.4.0-70/hadoop/conf
      2017-12-11 13:01:52,418 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-70 -> 2.6.4.0-70
      2017-12-11 13:01:52,419 - Using hadoop conf dir: /usr/hdp/2.6.4.0-70/hadoop/conf
      2017-12-11 13:01:52,420 - Group['hdfs'] {}
      2017-12-11 13:01:52,422 - Group['hadoop'] {}
      2017-12-11 13:01:52,422 - Group['users'] {}
      2017-12-11 13:01:52,423 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2017-12-11 13:01:52,424 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2017-12-11 13:01:52,425 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
      2017-12-11 13:01:52,426 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2017-12-11 13:01:52,427 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
      2017-12-11 13:01:52,428 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
      2017-12-11 13:01:52,429 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
      2017-12-11 13:01:52,430 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2017-12-11 13:01:52,431 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2017-12-11 13:01:52,432 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2017-12-11 13:01:52,433 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
      2017-12-11 13:01:52,434 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2017-12-11 13:01:52,439 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
      2017-12-11 13:01:52,452 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
      2017-12-11 13:01:52,452 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
      2017-12-11 13:01:52,453 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2017-12-11 13:01:52,455 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
      2017-12-11 13:01:52,456 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
      2017-12-11 13:01:52,471 - call returned (0, '1002')
      2017-12-11 13:01:52,472 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
      2017-12-11 13:01:52,484 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] due to not_if
      2017-12-11 13:01:52,485 - Group['hdfs'] {}
      2017-12-11 13:01:52,485 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
      2017-12-11 13:01:52,486 - FS Type: 
      2017-12-11 13:01:52,486 - Directory['/etc/hadoop'] {'mode': 0755}
      2017-12-11 13:01:52,510 - File['/usr/hdp/2.6.4.0-70/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
      2017-12-11 13:01:52,511 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
      2017-12-11 13:01:52,530 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
      2017-12-11 13:01:52,571 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
      2017-12-11 13:01:52,573 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
      2017-12-11 13:01:52,573 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
      2017-12-11 13:01:52,579 - File['/usr/hdp/2.6.4.0-70/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
      2017-12-11 13:01:52,581 - File['/usr/hdp/2.6.4.0-70/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
      2017-12-11 13:01:52,590 - File['/usr/hdp/2.6.4.0-70/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
      2017-12-11 13:01:52,605 - File['/usr/hdp/2.6.4.0-70/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
      2017-12-11 13:01:52,606 - File['/usr/hdp/2.6.4.0-70/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
      2017-12-11 13:01:52,607 - File['/usr/hdp/2.6.4.0-70/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
      2017-12-11 13:01:52,614 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
      2017-12-11 13:01:52,626 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
      2017-12-11 13:01:52,928 - MariaDB RedHat Support: false
      2017-12-11 13:01:52,932 - Using hadoop conf dir: /usr/hdp/2.6.4.0-70/hadoop/conf
      2017-12-11 13:01:52,943 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20}
      2017-12-11 13:01:52,980 - call returned (0, 'hive-server2 - 2.6.4.0-70')
      2017-12-11 13:01:52,981 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-70 -> 2.6.4.0-70
      2017-12-11 13:01:52,988 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://sj-qe16187-6.openstacklocal:8080/resources/CredentialUtil.jar'), 'mode': 0755}
      2017-12-11 13:01:52,990 - Not downloading the file from http://sj-qe16187-6.openstacklocal:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists
      2017-12-11 13:01:52,990 - checked_call[('/usr/jdk64/jdk1.8.0_112/bin/java', '-cp', '/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', 'jceks://file/var/lib/ambari-agent/cred/conf/hive_metastore/hive-site.jceks')] {}
      2017-12-11 13:01:54,125 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nDec 11, 2017 1:01:53 PM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nPassword')
      2017-12-11 13:01:54,137 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-70 -> 2.6.4.0-70
      2017-12-11 13:01:54,138 - Directory['/etc/hive'] {'mode': 0755}
      2017-12-11 13:01:54,138 - Directories to fill with configs: ['/usr/hdp/current/hive-metastore/conf', '/usr/hdp/current/hive-metastore/conf/conf.server']
      2017-12-11 13:01:54,139 - Directory['/etc/hive/2.6.4.0-70/0'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0755}
      2017-12-11 13:01:54,140 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.4.0-70/0', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
      2017-12-11 13:01:54,155 - Generating config: /etc/hive/2.6.4.0-70/0/mapred-site.xml
      2017-12-11 13:01:54,156 - File['/etc/hive/2.6.4.0-70/0/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
      2017-12-11 13:01:54,213 - File['/etc/hive/2.6.4.0-70/0/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-12-11 13:01:54,214 - File['/etc/hive/2.6.4.0-70/0/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-12-11 13:01:54,216 - File['/etc/hive/2.6.4.0-70/0/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-12-11 13:01:54,220 - File['/etc/hive/2.6.4.0-70/0/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-12-11 13:01:54,221 - File['/etc/hive/2.6.4.0-70/0/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
      2017-12-11 13:01:54,223 - Directory['/etc/hive/2.6.4.0-70/0/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'create_parents': True, 'mode': 0700}
      2017-12-11 13:01:54,223 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/2.6.4.0-70/0/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
      2017-12-11 13:01:54,235 - Generating config: /etc/hive/2.6.4.0-70/0/conf.server/mapred-site.xml
      2017-12-11 13:01:54,235 - File['/etc/hive/2.6.4.0-70/0/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
      2017-12-11 13:01:54,292 - File['/etc/hive/2.6.4.0-70/0/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
      2017-12-11 13:01:54,293 - File['/etc/hive/2.6.4.0-70/0/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
      2017-12-11 13:01:54,295 - File['/etc/hive/2.6.4.0-70/0/conf.server/hive-exec-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
      2017-12-11 13:01:54,299 - File['/etc/hive/2.6.4.0-70/0/conf.server/hive-log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
      2017-12-11 13:01:54,300 - File['/etc/hive/2.6.4.0-70/0/conf.server/parquet-logging.properties'] {'content': ..., 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
      2017-12-11 13:01:54,301 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] {'content': StaticFile('/var/lib/ambari-agent/cred/conf/hive_metastore/hive-site.jceks'), 'owner': 'hive', 'group': 'hadoop', 'mode': 0640}
      2017-12-11 13:01:54,302 - Writing File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.jceks'] because contents don't match
      2017-12-11 13:01:54,302 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {'hidden': {'javax.jdo.option.ConnectionPassword': 'HIVE_CLIENT,WEBHCAT_SERVER,HCAT,CONFIG_DOWNLOAD'}}, 'owner': 'hive', 'configurations': ...}
      2017-12-11 13:01:54,314 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml
      2017-12-11 13:01:54,314 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
      2017-12-11 13:01:54,517 - XmlConfig['hivemetastore-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-metastore/conf/conf.server', 'mode': 0600, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': {'hive.service.metrics.hadoop2.component': 'hivemetastore', 'hive.metastore.metrics.enabled': 'true', 'hive.service.metrics.reporter': 'HADOOP2'}}
      2017-12-11 13:01:54,531 - Generating config: /usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml
      2017-12-11 13:01:54,532 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hivemetastore-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600, 'encoding': 'UTF-8'}
      2017-12-11 13:01:54,541 - File['/usr/hdp/current/hive-metastore/conf/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop', 'mode': 0600}
      2017-12-11 13:01:54,542 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
      2017-12-11 13:01:54,546 - File['/etc/security/limits.d/hive.conf'] {'content': Template('hive.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
      2017-12-11 13:01:54,547 - Execute[('cp', '--remove-destination', '/usr/share/java/mysql-connector-java.jar', '/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar')] {'path': ['/bin', '/usr/bin/'], 'sudo': True}
      
      Command failed after 1 tries
      

      The workaround was to manually copy the mysql-connector jar to /usr/share/java on the host where Hive Metastore is installed.
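
      A minimal sketch of that workaround, assuming the connector can be fetched from the ambari-server /resources endpoint it was copied to above (the host name and URL follow the pattern seen in the agent log; any other transfer, such as scp or installing the mysql-connector-java package, works as well):

      # Run on the Hive Metastore host (host 2): place the jar where the Hive scripts expect it
      mkdir -p /usr/share/java
      curl -f -o /usr/share/java/mysql-connector-java.jar \
        http://sj-qe16187-6.openstacklocal:8080/resources/mysql-connector-java.jar
      # Then retry starting Hive Metastore from Ambari.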

    People

      Assignee: Papirkovskyy Myroslav (mpapirkovskyy)
      Reporter: Srikanth Janardhan (sjanardhan)
      Votes: 0
      Watchers: 3
