
    Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.1.0
    • Component/s: None
    • Labels:
      None
    • Hadoop Flags:
      Reviewed

      Description

      I left a Datanode running overnight and found this in the logs in the morning:

      2017-10-18 23:51:54,391 ERROR datanode.DirectoryScanner: Error compiling report for the volume, StorageId: DS-e75ebc3c-6b12-424e-875a-a4ae1a4dcc29                                                                                            
      java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: URI scheme is not "file"                                                                                                                                         
              at java.util.concurrent.FutureTask.report(FutureTask.java:122)                                                                                                                                                                        
              at java.util.concurrent.FutureTask.get(FutureTask.java:192)                                                                                                                                                                           
              at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.getDiskReport(DirectoryScanner.java:544)                                                                                                                                   
              at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.scan(DirectoryScanner.java:393)                                                                                                                                            
              at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.reconcile(DirectoryScanner.java:375)                                                                                                                                       
              at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner.run(DirectoryScanner.java:320)                                                                                                                                             
              at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)                                                                                                                                                            
              at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)                                                                                                                                                                   
              at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)                                                                                                              
              at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)                                                                                                                     
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)                                                                                                                                                    
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)                                                                                                                                                    
              at java.lang.Thread.run(Thread.java:748)                                                                                                                                                                                              
      Caused by: java.lang.IllegalArgumentException: URI scheme is not "file"                                                                                                                                                                       
              at java.io.File.<init>(File.java:421)                                                                                                                                                                                                 
              at org.apache.hadoop.hdfs.server.datanode.fsdataset.FsVolumeSpi$ScanInfo.<init>(FsVolumeSpi.java:319)                                                                                                                                 
              at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.ProvidedVolumeImpl$ProvidedBlockPoolSlice.compileReport(ProvidedVolumeImpl.java:155)                                                                                         
              at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.ProvidedVolumeImpl.compileReport(ProvidedVolumeImpl.java:493)                                                                                                                
              at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:620)                                                                                                                             
              at org.apache.hadoop.hdfs.server.datanode.DirectoryScanner$ReportCompiler.call(DirectoryScanner.java:581)                                                                                                                             
              at java.util.concurrent.FutureTask.run(FutureTask.java:266)                                                                                                                                                                           
              ... 3 more 
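
      The root failure is easy to reproduce outside the Datanode: the java.io.File(URI) constructor only accepts URIs whose scheme is "file" and throws this same IllegalArgumentException for anything else. A standalone reproduction (class name and URIs below are illustrative, not HDFS code):

```java
import java.io.File;
import java.net.URI;

public class NonFileUriRepro {
    public static void main(String[] args) {
        // A file: URI converts cleanly to a java.io.File
        File local = new File(URI.create("file:///tmp/blk_1234"));
        System.out.println(local.getPath()); // /tmp/blk_1234

        // Any other scheme (s3a, wasb, ...) fails the precondition in File(URI)
        try {
            new File(URI.create("s3a://bucket/blk_1234"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // URI scheme is not "file"
        }
    }
}
```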
      

      The code in question tries to construct a java.io.File from the volume's base URI (s3a in this case, but any non-file scheme used by Provided storage would likely break here):

          public ScanInfo(long blockId, File blockFile, File metaFile,
              FsVolumeSpi vol, FileRegion fileRegion, long length) {
            this.blockId = blockId;
            String condensedVolPath =
                (vol == null || vol.getBaseURI() == null) ? null :
              getCondensedPath(new File(vol.getBaseURI()).getAbsolutePath()); // <--- throws for non-"file" URIs
            this.blockSuffix = blockFile == null ? null :
              getSuffix(blockFile, condensedVolPath);
            this.blockLength = length;
            if (metaFile == null) {
              this.metaSuffix = null;
            } else if (blockFile == null) {
              this.metaSuffix = getSuffix(metaFile, condensedVolPath);
            } else {
              this.metaSuffix = getSuffix(metaFile,
                  condensedVolPath + blockSuffix);
            }
            this.volume = vol;
            this.fileRegion = fileRegion;
          }
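
      One possible direction for a fix, sketched below, is to go through java.io.File only when the base URI actually has a file scheme, and fall back to the URI's path component for provided volumes. This is an illustration only; the helper name and fallback choice are hypothetical and may not match the approach taken in the attached patches:

```java
import java.io.File;
import java.net.URI;

public class VolumePathHelper {
    // Hypothetical helper: derive a volume path string from a base URI
    // without assuming the URI is backed by the local filesystem.
    static String pathFromBaseUri(URI baseURI) {
        if (baseURI == null) {
            return null;
        }
        if ("file".equalsIgnoreCase(baseURI.getScheme())) {
            // Local volumes: same result as new File(uri).getAbsolutePath()
            return new File(baseURI).getAbsolutePath();
        }
        // Provided volumes (s3a, wasb, ...): use the URI's path component
        // rather than forcing the URI through java.io.File, which throws.
        return baseURI.getPath();
    }

    public static void main(String[] args) {
        System.out.println(pathFromBaseUri(URI.create("file:///data/dn1"))); // /data/dn1
        System.out.println(pathFromBaseUri(URI.create("s3a://bucket/blocks"))); // /blocks
    }
}
```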
      

        Attachments

        1. HDFS-12685-HDFS-9806.004.patch
          8 kB
          Virajith Jalaparti
        2. HDFS-12685-HDFS-9806.003.patch
          9 kB
          Virajith Jalaparti
        3. HDFS-12685-HDFS-9806.002.patch
          4 kB
          Virajith Jalaparti
        4. HDFS-12685-HDFS-9806.001.patch
          4 kB
          Virajith Jalaparti

              People

               • Assignee:
                 virajith Virajith Jalaparti
               • Reporter:
                 ehiggs Ewan Higgs
               • Votes:
                 0
               • Watchers:
                 4
