ZEPPELIN-4180

Error running hive paragraph due to lost credential


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.8.0
    • Fix Version/s: None
    • Labels: None
    • Environment:

    Description

      The user hits a random but frequent error when running a Hive paragraph due to a sudden loss of the stored credential, even though the credential was never changed in the user profile. Once the jdbc.hive credential is gone, the query falls back to user=anonymous, which is what the stack trace below shows.

      Trying to update the user credential then fails with the following error:

      Username \ Entity can not be empty.

      The workaround for now is to (see the scripted sketch after this list):

      • Restart Zeppelin
      • Remove the jdbc.hive credential from the user profile
      • Add the jdbc.hive credential back to the user profile
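
      The remove/add steps can also be scripted against Zeppelin's credential REST API instead of the UI. The sketch below is illustrative only: the server URL, login and Hive username/password are placeholders, and the /api/login and /api/credential endpoints plus the entity/username/password payload are assumed to match the credential REST API of the deployed Zeppelin release, so they should be verified against its documentation. The Zeppelin restart in the first step still has to be done separately.

      # Re-create the per-user jdbc.hive credential through Zeppelin's REST API.
      # Assumptions: endpoint paths and payload field names as recalled from the
      # Zeppelin docs; all URLs, user names and passwords below are placeholders.
      import requests

      ZEPPELIN = "http://localhost:8080"   # placeholder Zeppelin server URL
      ZEPPELIN_USER = "d286131"            # the affected Zeppelin login
      ZEPPELIN_PASS = "changeme"           # placeholder
      HIVE_USER = "hive_user"              # placeholder Hive credential
      HIVE_PASS = "hive_password"          # placeholder

      session = requests.Session()

      # Log in first so the credential calls run as the affected user
      # (the Shiro session cookie is kept by the Session object).
      resp = session.post(f"{ZEPPELIN}/api/login",
                          data={"userName": ZEPPELIN_USER, "password": ZEPPELIN_PASS})
      resp.raise_for_status()

      # "Remove" step: drop the stale jdbc.hive entity (response not checked,
      # so a missing entity is simply ignored).
      session.delete(f"{ZEPPELIN}/api/credential/jdbc.hive")

      # "Add" step: re-create the jdbc.hive credential for this user.
      resp = session.put(f"{ZEPPELIN}/api/credential",
                         json={"entity": "jdbc.hive",
                               "username": HIVE_USER,
                               "password": HIVE_PASS})
      resp.raise_for_status()
      print("jdbc.hive credential restored:", resp.status_code)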

       

      The Error:

      INFO [2019-06-05 09:13:51,029] ({pool-2-thread-55} SchedulerFactory.java[jobStarted]:109) - Job 20190520-132926_1580001585 started by scheduler org.apache.zeppelin.interpreter.remote.RemoteInterpreter-hive:d286131:-shared_session
      INFO [2019-06-05 09:13:51,031] ({pool-2-thread-55} Paragraph.java[jobRun]:380) - Run paragraph [paragraph_id: 20190520-132926_1580001585, interpreter: , note_id: 2EDZEYDMU, user: d286131]
      WARN [2019-06-05 09:13:51,694] ({pool-2-thread-55} NotebookServer.java[afterStatusChange]:2302) - Job 20190520-132926_1580001585 is finished, status: ERROR, exception: null, result: %text org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Unable to fetch table s_order. org.apache.hadoop.ipc.RemoteException(org.apache.ranger.authorization.hadoop.exceptions.RangerAccessControlException): Permission denied: user=anonymous, access=EXECUTE, inode="/data/prod/historic/rcrm/s_order"
      at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:383)
      at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
      at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1950)
      at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
      at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4146)
      at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1137)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:866)
      at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
      at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
      at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
      at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:279)
      at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:265)
      at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:303)
      at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:244)
      at org.apache.commons.dbcp2.DelegatingStatement.execute(DelegatingStatement.java:291)
      at org.apache.commons.dbcp2.DelegatingStatement.execute(DelegatingStatement.java:291)
      at org.apache.zeppelin.jdbc.JDBCInterpreter.executeSql(JDBCInterpreter.java:737)
      at org.apache.zeppelin.jdbc.JDBCInterpreter.interpret(JDBCInterpreter.java:820)
      at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:103)
      at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:633)
      at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
      at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
      Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Unable to fetch table s_order. org.apache.hadoop.ipc.RemoteException(org.apache.ranger.authorization.hadoop.exceptions.RangerAccessControlException): Permission denied: user=anonymous, access=EXECUTE, inode="/data/prod/historic/rcrm/s_order"
      at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:383)
      at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
      at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1950)
      at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
      at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4146)
      at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1137)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:866)
      at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
      at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
      at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
      at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:324)
      at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:148)
      at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:228)
      at org.apache.hive.service.cli.operation.Operation.run(Operation.java:264)
      at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:479)
      at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:466)
      at sun.reflect.GeneratedMethodAccessor105.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
      at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
      at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
      at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
      at com.sun.proxy.$Proxy46.executeStatementAsync(Unknown Source)
      at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:315)
      at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:509)
      at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1377)
      at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1362)
      at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
      at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
      at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
      at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
      ... 3 more
      Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.parse.SemanticException:Unable to fetch table s_order. org.apache.hadoop.ipc.RemoteException(org.apache.ranger.authorization.hadoop.exceptions.RangerAccessControlException): Permission denied: user=anonymous, access=EXECUTE, inode="/data/prod/historic/rcrm/s_order"
      at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:383)
      at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
      at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1950)
      at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)
      at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4146)
      at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1137)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:866)
      at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
      at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
      at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)

    People

    • Assignee: Unassigned
    • Reporter: Mike Zhao (szhao78)
    • Votes: 0
    • Watchers: 2
