Tika - TIKA-888

NetCDF parser uses Java 6 JAR file and test/compilation fails with Java 1.5, although TIKA is Java 1.5

    Details

    • Type: Bug
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.0
    • Fix Version/s: None
    • Component/s: parser
    • Labels: None

      Description

      Lucene/Solr developers ran this tool before releasing Lucene/Solr 3.6 (Solr 3.6 is still required to run on Java 1.5, see SOLR-3295): http://code.google.com/p/versioncheck/

      Major.Minor Version : 50.0             JAVA compatibility : Java 1.6 platform: 45.3-50.0
      Number of classes : 60
      
      Classes are: 
      c:\Work\lucene-solr\.\solr\contrib\extraction\lib\netcdf-4.2-min.jar [:] ucar/unidata/geoloc/Bearing.class
      ...
      

      TIKA should use a 1.5 version of this class and especially do some Java 5 tests before releasing (as its build dependencies say, the minimum is Java 5). I tried to compile and run the TIKA tests with Java 1.5 -> crash (invalid class file format).
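
      For reference, the major/minor version that tool reports can be read straight from a class file's header: after the 0xCAFEBABE magic come a two-byte minor and a two-byte major version (major 49 = Java 5, 50 = Java 6). A minimal sketch (class and path names here are just placeholders):

      import java.io.DataInputStream;
      import java.io.FileInputStream;
      import java.io.IOException;

      public class ClassVersionCheck {
          public static void main(String[] args) throws IOException {
              // Point this at any class extracted from e.g. netcdf-4.2-min.jar
              String path = args.length > 0 ? args[0] : "Bearing.class";
              DataInputStream in = new DataInputStream(new FileInputStream(path));
              try {
                  int magic = in.readInt();            // 0xCAFEBABE for valid class files
                  int minor = in.readUnsignedShort();  // minor_version
                  int major = in.readUnsignedShort();  // major_version: 49 = Java 5, 50 = Java 6
                  System.out.printf("magic=%x major.minor=%d.%d%n", magic, major, minor);
              } finally {
                  in.close();
              }
          }
      }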


          Activity

          Chris A. Mattmann added a comment -

          Hi Uwe, thanks for reporting this. Unfortunately that version of the NetCDF Java library is published under 1.6, and it was a long process to get the jar out onto Maven Central (I actually published it there, and became part of the NetCDF community, helping them modernize their build to Maven, etc.). See http://www.unidata.ucar.edu/mailing_lists/archives/netcdf-java/2010/msg00129.html for an example of the discussion.

          I'm not sure what the problem is with having a 1.6 dependency in Tika. We disable the tests for NetCDF java if the code isn't being compiled on a 1.6 platform, so all of the Tika code compiles just fine. And, because of the service provider mechanism that we use to load parser classes, only external users of the tika-parsers library that want to use NetCDF support for scientific data formats (NetCDF, OPeNDAP, HDF, etc.) need to worry about the 1.6 requirement. tika-bundle and tika-app (and now tika-server) slurp the jar file up as a dependency, but downstream users can simply use exclusion tactics (via Ivy or Maven) to not include it in their specific software projects.

          JDK 1.5 is EOL'ed, so eventually we'll all need to be on 1.6 or better (1.7+) anyway, so I'd favor simply revisiting the discussion linked below before changing anything in Tika land or working to try and publish a new NetCDF jar that's 1.5 compatible. Jukka and I discussed this in Tika land a while back (November 2010) when we were finding Jenkins issues for Tika builds; see here: http://s.apache.org/6ij

          Can't Solr simply use an Ivy exclusion (since I see you guys are moving to Ivy)?

          Uwe Schindler added a comment -

          Thanks Chris,

          we are already planning to remove all parsers not useful for Solr (including NetCDF). We don't use transitive dependencies at the moment, because we want to be sure which libs are added, and for the binary distribution we need to add license notes (which cannot be generated by Ivy) for every single JAR. So we would simply remove the dependency on ucar.

          The question is: the parser is still listed in META-INF, so when a Java 5 user tries to parse a NetCDF file, he gets a ClassNotFound from the NetCDF parser. What's the best way to handle that? tika-config.xml is horrible for us; it would be good to pass a META-INF-like list to the AutoDetectParser (I implemented that for another non-Solr project we use at PANGAEA, where I used the META-INF list of Tika, deleted all unused parsers and passed them somehow to TIKA). This needed extra coding. Pointing e.g. AutoDetectParser to a custom parser list would be nice and easy to manage in Solr (for me, too).

          On the mailing list we already discussed better possibilities for Solr (Solr is only interested in full text, the metadata is mostly ignored), so parsing mp3 files is simply useless. A good idea for TIKA would be to have several tika-parsers packages, maybe one with "office document parsers", "images", ... Are there any plans to split the parser package? This would make it easier for users to download a subset with all transitive dependencies and not get a ClassNotFoundException from removing the wrong JAR files by hand.

          We disable the tests for NetCDF java if the code isn't being compiled on a 1.6 platform, so all of the Tika code compiles just fine

          I tried this a few weeks ago and with JDK 1.5, tests were failing.

          Chris A. Mattmann added a comment -

          We don't use transitive dependencies at the moment, because we want to be sure which libs are added, and for the binary distribution we need to add license notes (which cannot be generated by Ivy) for every single JAR. So we would simply remove the dependency on ucar.

          Gotcha, OK, cool.

          The parser is still listed in META-INF, so when a Java 5 user tries to parse a NetCDF file, he gets a ClassNotFound from the NetCDF parser.

          Couldn't you take the Parser out of the file:

          org.apache.tika.parser.Parser

          (e.g., the service loading mechanism). If you remove the org.apache.tika.parser.netcdf.NetCDFParser and org.apache.tika.parser.hdf.HDFParser entries from that file, the user will never reach the NetCDF or HDF parser, right? I think you guys can provide your own custom copy of this file and make sure it's at the root of the classpath in Solr Cell, and then it will take your version over the baked-in one from the tika-parsers jar.
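
          For reference, the file in question is tika-parsers' META-INF/services/org.apache.tika.parser.Parser, a plain ServiceLoader list of parser class names (one per line, '#' starts a comment). A trimmed custom copy along the lines Chris suggests might look roughly like this (the entries shown are only a sample):

          # custom copy of META-INF/services/org.apache.tika.parser.Parser
          # NetCDF/HDF entries deliberately left out
          org.apache.tika.parser.html.HtmlParser
          org.apache.tika.parser.microsoft.OfficeParser
          org.apache.tika.parser.pdf.PDFParser
          ...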

          it would be good to pass a META-INF-like list to the AutoDetectParser (I implemented that for another non-Solr project we use at PANGAEA, where I used the META-INF list of Tika, deleted all unused parsers and passed them somehow to TIKA)

          This sounds cool. How is it different from the service provider mechanism, though? I think it's serving a similar purpose, right?

          A good idea for TIKA would be to have several tika-parsers packages, maybe one with "office document parsers", "images",... Are there any plans to split the parser package?

          This was discussed a while back; check out the thoughts there: https://issues.apache.org/jira/browse/TIKA-686

          I tried this a few weeks ago and with JDK 1.5, tests were failing.

          Our latest Jenkins build (which I think is locked to 1.5) passes (look at the one before I started mucking with tika-server):

          https://builds.apache.org/job/Tika-trunk/826/

          Uwe Schindler added a comment -

          Couldn't you take the Parser out of the file:

          org.apache.tika.parser.Parser

          (e.g., the service loading mechanism). If you remove the org.apache.tika.parser.netcdf.NetCDFParser and org.apache.tika.parser.hdf.HDFParser entries from that file, the user will never reach the NetCDF or HDF parser, right? I think you guys can provide your own custom copy of this file and make sure it's at the root of the classpath in Solr Cell, and then it will take your version over the baked-in one from the tika-parsers jar.

          That's exactly what is not possible. SPI collects all /META-INF/services/org.apache.tika.parser.Parser files it can find on the classpath and collects all Parsers it can find. Removing parsers from one file does not help (the order of the classpath does not matter), as the SPI builds a set of all collected parsers from all META-INF files.

          That's the problem I ran into. By the way, Lucene 4.0 is now also using SPI for their codec/posting format support, so I know what I am talking about.

          Because of this I replicated your single SPI parser file and made a simple Collection<Parser> out of it, which I pass to AutoDetectParser. The system SPI is then ignored.
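
          A rough sketch of that approach, assuming AutoDetectParser's varargs constructor is available (the particular parsers listed here are only illustrative):

          import java.io.FileInputStream;
          import java.io.InputStream;

          import org.apache.tika.metadata.Metadata;
          import org.apache.tika.parser.AutoDetectParser;
          import org.apache.tika.parser.ParseContext;
          import org.apache.tika.parser.Parser;
          import org.apache.tika.parser.html.HtmlParser;
          import org.apache.tika.parser.microsoft.OfficeParser;
          import org.apache.tika.parser.pdf.PDFParser;
          import org.apache.tika.parser.txt.TXTParser;
          import org.apache.tika.sax.BodyContentHandler;

          public class ExplicitParserList {
              public static void main(String[] args) throws Exception {
                  // Hand-picked parsers instead of whatever SPI finds on the classpath;
                  // NetCDF/HDF are simply not in this list, so their jars are never touched.
                  Parser[] parsers = new Parser[] {
                      new TXTParser(), new HtmlParser(), new PDFParser(), new OfficeParser()
                  };
                  AutoDetectParser parser = new AutoDetectParser(parsers);

                  InputStream in = new FileInputStream(args[0]);
                  try {
                      BodyContentHandler handler = new BodyContentHandler();
                      Metadata metadata = new Metadata();
                      parser.parse(in, handler, metadata, new ParseContext());
                      System.out.println(handler.toString());
                  } finally {
                      in.close();
                  }
              }
          }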

          Uwe Schindler added a comment -

          Another good idea would be to allow removal of Parsers from AutoDetectParser. Then we could use the default ctor and simply remove all unused ones.

          Jukka Zitting added a comment -

          The question is: the parser is still listed in META-INF, so when a Java 5 user tries to parse a NetCDF file, he gets a ClassNotFound from the NetCDF parser. What's the best way to handle that?

          By default Tika should just ignore the ClassNotFoundException in such a case, so there should be no harm in having the NetCDF parser included in the services file.

          If that doesn't work (i.e. you get an error when starting Tika), we should fix Tika to catch the problem.

          Chris A. Mattmann added a comment -

          That's exactly what is not possible. SPI collects all /META-INF/services/org.apache.tika.parser.Parser files it can find on the classpath and collects all Parsers it can find. Removing parsers from one file does not help (the order of the classpath does not matter), as the SPI builds a set of all collected parsers from all META-INF files.

          Thanks for the clarification Uwe. I'm not an expert on the SPI, so I trust ya! Especially since you just implemented it yourself, heh.

          If that doesn't work (i.e. you get an error when starting Tika), we should fix Tika to catch the problem.

          +1. I'll take a look and try to reproduce and just make Tika swallow the ClassNotFoundException and happily move along.

          Uwe Schindler added a comment -

          Maybe it does not produce a ClassNotFound. For me it produced an invalid class file error with Java 5.
          I have to try... If it works with simply removing the JAR file, we should be fine!

          Nick Burch added a comment -

          Looking a little bit more, ServiceLoader.loadStaticServiceProviders(Class) will notify but skip over all Throwables when loading the parsers. This should cope with invalid or missing classes that are referenced by the parser.

          I wonder in this case if the NetCDF parser can be loaded just fine, but only blows up when actually called to parse something? Is anyone with Java 1.5 able to confirm, possibly posting the full stacktrace?

          (I can't get Tika from SVN to build on Java 1.5 at the moment, it's blowing up with some problems with Apache James Mime4J.)

          If it is a problem in the parser, rather than in loading the parser, then we should either have the NetCDF parser try to use one of the ucar.nc2.NetcdfFile classes in its init (to trigger a quick failure during parser loading), or have it catch more things in the parse stage itself.
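
          A hypothetical sketch of that fail-early pattern (not the actual NetCDFParser code): touching a ucar.nc2 class in a static initializer, so that a missing or incompatible NetCDF jar fails at service-loading time instead of inside parse():

          import java.io.InputStream;
          import java.util.Collections;
          import java.util.Set;

          import org.apache.tika.exception.TikaException;
          import org.apache.tika.metadata.Metadata;
          import org.apache.tika.mime.MediaType;
          import org.apache.tika.parser.AbstractParser;
          import org.apache.tika.parser.ParseContext;
          import org.xml.sax.ContentHandler;

          public class FailEarlyNetCDFParser extends AbstractParser {

              static {
                  // Force the NetCDF class to load up front. If the library is missing or
                  // was compiled for a newer JVM, this throws (NoClassDefFoundError,
                  // UnsupportedClassVersionError, ...), so the service loader can notify
                  // and skip this parser instead of failing later inside parse().
                  ucar.nc2.NetcdfFile.class.getName();
              }

              public Set<MediaType> getSupportedTypes(ParseContext context) {
                  return Collections.singleton(MediaType.application("x-netcdf"));
              }

              public void parse(InputStream stream, ContentHandler handler,
                                Metadata metadata, ParseContext context) throws TikaException {
                  // Real parsing with ucar.nc2.NetcdfFile would go here.
                  throw new TikaException("not implemented in this sketch");
              }
          }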

          Uwe Schindler added a comment -

          That was exactly my problem (it fails once it starts to parse). I will try to reproduce (I have many Java versions installed).

          Nick Burch added a comment -

          If you can trigger the issue, and if it is in the parse method, any chance you could try pushing a bit of NetCDF up into the parser initialisation? I believe the current best practice is for parsers to fail early and be skipped by the service loader if they have a dependency issue, rather than waiting for parsing to break. (Well, as much as is possible for the parser to tell!)

          Uwe Schindler added a comment - edited

          I am trying. With the latest svn I get:

          [debug] execute contextualize
          [INFO] [resources:resources {execution: default-resources}]
          [INFO] Using 'UTF-8' encoding to copy filtered resources.
          [INFO] Copying 3 resources
          [INFO] Copying 3 resources
          [INFO] [compiler:compile {execution: default-compile}]
          [INFO] Compiling 125 source files to C:\Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\target\classes
          [INFO] -------------------------------------------------------------
          [ERROR] COMPILATION ERROR :
          [INFO] -------------------------------------------------------------
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[413,24] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[439,29] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[439,56] cannot find symbol
          symbol  : method isEmpty()
          

          Downgrading to revision 1331788 helped with that. But this time the NetCDF parser tests did not produce a test failure, so maybe it's fixed in a later version.

          Maybe the issue comes from somewhere else: at the time I opened the issue, I had a separate list of Parsers to enable (so it was not using SPI). I was expecting that the AutoDetectParser would stop parsing NetCDF when the Parser cannot load, so by loading the parsers in my own code, I did not do the exception handling like your SPI does.

          I think we can close the issue!

          Nick Burch added a comment -

          I've hopefully fixed the IPTC ANPA 1.6ism in r1331801, if you want to re-check

          Uwe Schindler added a comment -

          Not all of them; I still get millions of String.isEmpty() errors (I only copied the first few messages, sorry, I did not mention that). Here is the full compilation failure:

          [INFO] Compiling 125 source files to C:\Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\target\classes
          [INFO] -------------------------------------------------------------
          [ERROR] COMPILATION ERROR :
          [INFO] -------------------------------------------------------------
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[471,34] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[522,32] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[690,24] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[690,48] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[690,74] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[690,99] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[690,124] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[729,26] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[765,26] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[765,53] cannot find symbol
          symbol  : method isEmpty()
          location: class java.lang.String
          [ERROR] \Users\Uwe Schindler\Projects\TIKA\svn\tika-parsers\src\main\java\org\apache\tika\parser\iptc\IptcAnpaParser.java:[803,43] cannot find symbol
          symbol  : method getBytes(java.nio.charset.Charset)
          location: class java.lang.String
          [INFO] 11 errors
          [INFO] -------------------------------------------------------------
          
          Nick Burch added a comment -

          Bah, missed some... Hopefully all gone now as of r1331946.

          Uwe Schindler added a comment -

          Passes now


            People

             • Assignee: Chris A. Mattmann
             • Reporter: Uwe Schindler