NUTCH-2512: Nutch does not build under JDK9


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 1.14
    • Fix Version/s: 1.19
    • Component/s: build, injector
    • Labels: None
    • Environment: Ubuntu 16.04 (all patches up to 02/20/2018), Oracle JDK 9 (latest as of 02/22/2018)

    Description

      Nutch 1.14 (Source) does not compile properly under JDK 9

      Nutch 1.14 (Binary) does not function under Java 9

       

      When trying to build Nutch, Ant complains about missing Sonar files, then exits with:
      "BUILD FAILED
      /home/nutch/nutch/build.xml:79: Unparseable date: "01/25/1971 2:00 pm" "
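
      The unparseable date looks like a consequence of JDK 9 switching its default locale data to CLDR (JEP 252): the en-US SHORT date-time format now expects a comma between the date and the time, so a value like the one at build.xml:79 no longer parses with the platform-default format. The following is a minimal sketch of that behaviour change, assuming the value is parsed via DateFormat's US SHORT defaults (as Ant's <touch> task does, for example, when its datetime attribute has no accompanying pattern); it is an illustration, not the actual build code:

      import java.text.DateFormat;
      import java.text.ParseException;
      import java.text.SimpleDateFormat;
      import java.util.Locale;

      public class Jdk9DateParse {
          public static void main(String[] args) throws ParseException {
              String value = "01/25/1971 2:00 pm"; // the datetime rejected at build.xml:79

              // Locale-default SHORT date-time format: with the old COMPAT data (JDK 8)
              // the en-US pattern has no comma and this parses; with the CLDR data that
              // JDK 9 uses by default, a comma is expected between date and time, so
              // parsing the same string fails with "Unparseable date".
              DateFormat shortFormat =
                      DateFormat.getDateTimeInstance(DateFormat.SHORT, DateFormat.SHORT, Locale.US);
              try {
                  System.out.println("default SHORT format parsed: " + shortFormat.parse(value));
              } catch (ParseException e) {
                  System.out.println("default SHORT format failed: " + e.getMessage());
              }

              // An explicit pattern does not depend on the locale-data provider and
              // parses the same string on JDK 8 and JDK 9 alike.
              SimpleDateFormat explicit = new SimpleDateFormat("MM/dd/yyyy h:mm a", Locale.US);
              System.out.println("explicit pattern parsed: " + explicit.parse(value));
          }
      }

      If that is indeed the cause, running Ant with ANT_OPTS=-Djava.locale.providers=COMPAT,CLDR, or giving the offending task an explicit pattern attribute (if it supports one), should get the build past line 79; the runtime failure described below is a separate problem.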
       
      After commenting out the "offending code", the build finishes, but the resulting binary fails to function (as does the Apache-compiled binary distribution). Both exit with:
       
      Injecting seed URLs
      /home/nutch/nutch2/bin/nutch inject searchcrawl//crawldb urls/
      Injector: starting at 2018-02-21 02:02:16
      Injector: crawlDb: searchcrawl/crawldb
      Injector: urlDir: urls
      Injector: Converting injected urls to crawl db entries.
      WARNING: An illegal reflective access operation has occurred
      WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/nutch/nutch2/lib/hadoop-auth-2.7.4.jar) to method sun.security.krb5.Config.getInstance()
      WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
      WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
      WARNING: All illegal access operations will be denied in a future release
      Injector: java.lang.NullPointerException
              at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getBlockIndex(FileInputFormat.java:444)
              at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:413)
              at org.apache.hadoop.mapreduce.lib.input.DelegatingInputFormat.getSplits(DelegatingInputFormat.java:115)
              at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
              at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
              at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
              at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
              at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
              at java.base/java.security.AccessController.doPrivileged(Native Method)
              at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
              at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
              at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
              at org.apache.nutch.crawl.Injector.inject(Injector.java:417)
              at org.apache.nutch.crawl.Injector.run(Injector.java:563)
              at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
              at org.apache.nutch.crawl.Injector.main(Injector.java:528)
       
      Error running:
        /home/nutch/nutch2/bin/nutch inject searchcrawl//crawldb urls/
      Failed with exit value 255.
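
      As for what the NullPointerException means: FileInputFormat.getBlockIndex() simply walks the array of block locations that getSplits() obtained for each input file, so the NPE indicates that this array (or one of its entries) was null for one of the job's input files when running under Java 9. A rough, self-contained sketch of that code path, paraphrased from the stack trace rather than copied from the Hadoop 2.7.4 source:

      public class BlockIndexSketch {

          // Stand-in for org.apache.hadoop.fs.BlockLocation, just enough for the sketch.
          static class BlockLocation {
              final long offset, length;
              BlockLocation(long offset, long length) { this.offset = offset; this.length = length; }
              long getOffset() { return offset; }
              long getLength() { return length; }
          }

          // Mirrors the shape of FileInputFormat.getBlockIndex(): find the block that
          // covers the given offset. If the block-location array is null, the loop
          // condition itself throws the NullPointerException seen in the Injector log.
          static int getBlockIndex(BlockLocation[] blkLocations, long offset) {
              for (int i = 0; i < blkLocations.length; i++) {   // <-- NPE when blkLocations == null
                  if (blkLocations[i].getOffset() <= offset
                          && offset < blkLocations[i].getOffset() + blkLocations[i].getLength()) {
                      return i;
                  }
              }
              throw new IllegalArgumentException("Offset " + offset + " is outside of file");
          }

          public static void main(String[] args) {
              try {
                  getBlockIndex(null, 0L);   // what the failing job effectively ends up doing
              } catch (NullPointerException e) {
                  System.out.println("NullPointerException, as in the Injector output above");
              }
          }
      }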

People

    Assignee: Unassigned
    Reporter: Bl4ck1c3 Ralf
    Votes: 0
    Watchers: 3
