Apache Ozone / HDDS-3272

Smoke Test: hdfs commands failing on hadoop 27 docker-compose


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Duplicate
    • Affects Version/s: 0.5.0
    • Fix Version/s: None
    • Labels: test

    Description

Discovered by bharat while testing the 0.5.0-beta RC2.

There is an issue when running hdfs commands in the hadoop 27 docker-compose environment. I see the same test failing when running the smoke test.

$ docker exec -it c7fe17804044 bash

bash-4.4$ hdfs dfs -put /opt/hadoop/NOTICE.txt o3fs://bucket1.vol1/kk
2020-03-22 04:40:14 WARN  NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-03-22 04:40:15 INFO  MetricsConfig:118 - Loaded properties from hadoop-metrics2.properties
2020-03-22 04:40:16 INFO  MetricsSystemImpl:374 - Scheduled Metric snapshot period at 10 second(s).
2020-03-22 04:40:16 INFO  MetricsSystemImpl:191 - XceiverClientMetrics metrics system started
-put: Fatal internal error
java.lang.NullPointerException: client is null
    at java.util.Objects.requireNonNull(Objects.java:228)
    at org.apache.hadoop.hdds.scm.XceiverClientRatis.getClient(XceiverClientRatis.java:201)
    at org.apache.hadoop.hdds.scm.XceiverClientRatis.sendRequestAsync(XceiverClientRatis.java:227)
    at org.apache.hadoop.hdds.scm.XceiverClientRatis.sendCommandAsync(XceiverClientRatis.java:305)
    at org.apache.hadoop.hdds.scm.storage.ContainerProtocolCalls.writeChunkAsync(ContainerProtocolCalls.java:315)
    at org.apache.hadoop.hdds.scm.storage.BlockOutputStream.writeChunkToContainer(BlockOutputStream.java:599)
    at org.apache.hadoop.hdds.scm.storage.BlockOutputStream.writeChunk(BlockOutputStream.java:452)
    at org.apache.hadoop.hdds.scm.storage.BlockOutputStream.handleFlush(BlockOutputStream.java:463)
    at org.apache.hadoop.hdds.scm.storage.BlockOutputStream.close(BlockOutputStream.java:486)
    at org.apache.hadoop.ozone.client.io.BlockOutputStreamEntry.close(BlockOutputStreamEntry.java:144)
    at org.apache.hadoop.ozone.client.io.KeyOutputStream.handleStreamAction(KeyOutputStream.java:481)
    at org.apache.hadoop.ozone.client.io.KeyOutputStream.handleFlushOrClose(KeyOutputStream.java:455)
    at org.apache.hadoop.ozone.client.io.KeyOutputStream.close(KeyOutputStream.java:508)
    at org.apache.hadoop.fs.ozone.OzoneFSOutputStream.close(OzoneFSOutputStream.java:56)
    at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
    at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:62)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:120)
    at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:466)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:391)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:328)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
    at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
    at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:243)
    at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
    at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:220)
    at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
    at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
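For reference, the top of the trace points at a null check in XceiverClientRatis.getClient (via java.util.Objects.requireNonNull) with the message "client is null", i.e. the Ratis client was never established or was already torn down when the chunk write was issued. A minimal Java sketch of that kind of guard, not a copy of the Ozone source (the field name and AtomicReference wrapper are assumptions), looks like this:

    // Minimal sketch of the guard behind the "client is null" NPE above.
    // Field name and wrapper type are illustrative assumptions.
    import java.util.Objects;
    import java.util.concurrent.atomic.AtomicReference;
    import org.apache.ratis.client.RaftClient;

    class XceiverClientSketch {
      // populated when the client connects to the Ratis pipeline,
      // cleared again when the client is closed
      private final AtomicReference<RaftClient> client = new AtomicReference<>();

      RaftClient getClient() {
        // fails exactly like the trace above when no connection was ever
        // established (or it was already torn down)
        return Objects.requireNonNull(client.get(), "client is null");
      }
    }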

The same command works fine when using ozone fs:

$ docker exec -it fe5d39cf6eed bash

bash-4.2$ ozone fs -put /opt/hadoop/NOTICE.txt o3fs://bucket1.vol1/kk
2020-03-22 04:41:10,999 [main] INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2020-03-22 04:41:11,123 [main] INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2020-03-22 04:41:11,127 [main] INFO impl.MetricsSystemImpl: XceiverClientMetrics metrics system started

bash-4.2$ ozone fs -ls o3fs://bucket1.vol1/
Found 1 items
-rw-rw-rw-   3 hadoop hadoop      17540 2020-03-22 04:41 o3fs://bucket1.vol1/kk
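For completeness, the same put/list steps can also be expressed through the Hadoop FileSystem API against the o3fs URI used above. This is a hypothetical verification snippet (class name and paths chosen for illustration, volume and bucket taken from the commands above), not part of the smoke test:

    // Hypothetical snippet mirroring 'fs -put' and 'fs -ls' against o3fs://bucket1.vol1/
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class O3fsPutCheck {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("o3fs://bucket1.vol1/"), conf);

        // copy the local NOTICE.txt into the bucket, like 'fs -put' above
        fs.copyFromLocalFile(new Path("/opt/hadoop/NOTICE.txt"), new Path("/kk"));

        // list the bucket root, like 'fs -ls o3fs://bucket1.vol1/' above
        for (FileStatus status : fs.listStatus(new Path("/"))) {
          System.out.println(status.getPath() + "\t" + status.getLen());
        }
      }
    }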


People

    Assignee: Marton Elek (elek)
    Reporter: Dinesh Chitlangia (dineshchitlangia)
