Apache Drill / DRILL-5565

Directory Query fails with Permission denied: access=EXECUTE if dirN name is 'year=2017' or 'month=201704'


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.6.0
    • Fix Version/s: None
    • Component/s: None
    • Environment: CentOS release 6.8

    Description

      Running a query like this works fine when the dir0 directory name contains numerics only:
      select * from all.my.records
      where dir0 >= '20170322'
      limit 10;

      If the dirN levels are named according to the Hive-style convention (e.g. year=2017), we get one of the following problems:

      1. Either a "SYSTEM ERROR: Permission denied" when the where clause uses the full directory name:
      select * from all.my.records
      where dir0 >= 'year=2017'
      limit 10;

      SYSTEM ERROR: RemoteException: Permission denied: user=myuser, access=EXECUTE,
      inode="/user/myuser/all/my/records/year=2017/month=201701/day=20170101/application_1485464650247_1917/part-r-00000.gz.parquet":myuser:supergroup:rw-r--r--

      at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
      at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
      at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
      at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
      at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
      at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6609)
      at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4223)
      at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:894)
      at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
      at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
      at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
      at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
      at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
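      For reference, the checkTraverse frame above is HDFS enforcing that every ancestor component of a path carry the EXECUTE bit; only the final component is checked for the access actually requested. An access=EXECUTE failure naming a .gz.parquet file may therefore indicate the file ended up as a non-final path component. A minimal Python sketch of that traverse rule (an illustration only, not Drill or HDFS code; the component names and the appended bogus-child are hypothetical):

      ```python
      # Simplified model of the HDFS traverse check: permission bits per
      # path component, owner bits only.
      EXECUTE = 0o1
      READ = 0o4

      def check_traverse(path_perms, requested):
          """path_perms: list of (component, perm_bits) from root to target."""
          *ancestors, (target, target_perm) = path_perms
          # Every non-final component must be traversable (EXECUTE bit set).
          for name, perm in ancestors:
              if not perm & EXECUTE:
                  raise PermissionError(
                      f"Permission denied: access=EXECUTE, inode={name}")
          # The final component only needs the access actually requested.
          if not target_perm & requested:
              raise PermissionError(
                  f"Permission denied: access=READ, inode={target}")

      # rw-r--r-- (no x bit) is fine when the file is the *final* component:
      check_traverse([("records", 0o7), ("year=2017", 0o7),
                      ("part-r-00000.gz.parquet", 0o6)], READ)

      # ...but if a further component is appended after the file, the file
      # itself is subject to the EXECUTE traverse check and fails, as in
      # the stack trace above:
      try:
          check_traverse([("records", 0o7), ("part-r-00000.gz.parquet", 0o6),
                          ("bogus-child", 0o6)], READ)
      except PermissionError as e:
          print(e)
      ```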

      2. OR, if the where clause specifies only the numeric portion of the directory name, the query does not blow up, but neither does it return the relevant data, since that where clause does not match the actual path to our data:
      select * from all.my.records
      where dir0 >= '2017'
      limit 10;
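      A possible reason variant 2 silently misses the data: dir0 holds the literal directory name (year=2017), so a numeric literal is compared against a name that starts with a letter. A small illustration using plain Python string comparison (not Drill's actual partition-pruning logic; the names list is hypothetical):

      ```python
      # Hive-style partition directory names as dir0 would see them.
      names = ["year=2016", "year=2017", "year=2018"]

      # Lexicographically, 'y' sorts after any digit, so a numeric lower
      # bound keeps every prefixed name:
      print([n for n in names if n >= "2017"])       # all three names

      # ...while a numeric upper bound would drop all of them:
      print([n for n in names if n <= "2017"])       # empty list

      # Only a literal carrying the same "year=" prefix compares as intended:
      print([n for n in names if n >= "year=2017"])  # 2017 and 2018 only
      ```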

          People

            Assignee: Unassigned
            Reporter: ehur