Description
I hope another boring tidy-up JIRA might be welcome. I'd like to fix most of the warnings that appear during the build, so that developers don't become accustomed to them. The accompanying pull request contains a number of commits that quash most of the warnings observed in the mvn and sbt builds, though not all of them.
FIXED!
[WARNING] Parameter tasks is deprecated, use target instead
Just a matter of updating <tasks> -> <target> in inline Ant scripts.
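For reference, the change lives in the maven-antrun-plugin configuration and looks roughly like this (the execution phase and the Ant task inside are only placeholders; each module's actual script content is unchanged):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <phase>generate-resources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <!-- was <tasks> ... </tasks> -->
        <target>
          <echo message="placeholder Ant task"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>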
WARNING: -p has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -p to -R.
Goes away after updating the scalatest plugin to 1.0-RC2.
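Concretely that's just a version bump on the plugin declaration, along these lines (assuming it is declared as org.scalatest:scalatest-maven-plugin; the existing configuration stays as-is):
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <version>1.0-RC2</version>
  <!-- existing <configuration> and <executions> unchanged -->
</plugin>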
[WARNING] Note: /Users/srowen/Documents/incubator-spark/core/src/test/scala/org/apache/spark/JavaAPISuite.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
Mostly a matter of adding @SuppressWarnings("unchecked"), but a few more changes were needed to reveal the source of the warnings: <fork>true</fork> (also needed for <maxmem> to take effect) and version 3.1 of the compiler plugin. In a few cases, declaration changes were more appropriate to avoid the warnings.
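Roughly, the compiler plugin configuration that surfaces the details looks like this (the maxmem value is illustrative, and -Xlint:unchecked only needs to be enabled while tracking the warnings down):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.1</version>
  <configuration>
    <!-- maxmem is only honored when the compiler is forked -->
    <fork>true</fork>
    <maxmem>1024m</maxmem>
    <compilerArgs>
      <arg>-Xlint:unchecked</arg>
    </compilerArgs>
  </configuration>
</plugin>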
/Users/srowen/Documents/incubator-spark/core/src/main/scala/org/apache/spark/util/IndestructibleActorSystem.scala:25: warning: Could not find any member to link for "akka.actor.ActorSystem".
/**
^
I'm getting several scaladoc warnings like this, and I'm not clear why it can't find the type (because it's outside the module?). Since the links evidently aren't resolving anyway, the fix is just to remove them.
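For example, the change amounts to dropping the [[...]] link syntax and leaving plain text (the comment wording here is paraphrased, and the class is only a placeholder to carry the comment):
// Before: scaladoc cannot resolve the link and warns.
// /** An [[akka.actor.ActorSystem]] that refuses to shut down on fatal exceptions. */
// After: plain text, no link, no warning.
/** An ActorSystem that refuses to shut down on fatal exceptions. */
class PlaceholderDoc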
/Users/srowen/Documents/incubator-spark/repl/src/main/scala/org/apache/spark/repl/SparkIMain.scala:86: warning: Variable eval undefined in comment for class SparkIMain in class SparkIMain
$ has to be escaped as \$ in scaladoc, apparently.
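A small illustration (the object and method are hypothetical, only there to carry the comment):
object ScaladocEscapeExample {
  // Unescaped, scaladoc treats $eval as a documentation variable and warns
  // "Variable eval undefined":
  // /** Wraps the line in an object called $eval. */

  // Escaped, the dollar sign is taken literally and the warning goes away:
  /** Wraps the line in an object called \$eval. */
  def wrap(line: String): String = line
}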
[WARNING] 'dependencyManagement.dependencies.dependency.exclusions.exclusion.artifactId' for org.apache.hadoop:hadoop-yarn-client:jar with value '*' does not match a valid id pattern. @ org.apache.spark:spark-parent:1.0.0-incubating-SNAPSHOT, /Users/srowen/Documents/incubator-spark/pom.xml, line 494, column 25
This one might need review.
The wildcard exclusion is valid Maven syntax, but Maven still warns on it, so I wanted to see whether we can do without it.
These are trying to exclude:
- org.codehaus.jackson
- org.sonatype.sisu.inject
- org.xerial.snappy
org.sonatype.sisu.inject doesn't actually appear to be a dependency at all. org.xerial.snappy is used by other dependencies, but the versions already agree (1.0.5).
The org.codehaus.jackson exclusion was presumably intended to keep out Jackson 1.8.8, since Spark Streaming wants 1.9.11 directly. But if so, the exclusion is in the wrong place: Spark depends directly on Avro, and that is what still brings in 1.8.8. (hadoop-client 1.0.4 includes Jackson 1.0.1, so that one does need an exclusion, but the other Hadoop modules don't.) A sketch of an explicit exclusion on Avro follows below.
HBase also depends on 1.8.8, but I figured it was intentional to leave that alone, since it wouldn't collide with Spark Streaming.
(I understand this varies by Hadoop version, but I confirmed it is the same for 1.0.4, 0.23.7, and 2.2.0.)
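If an exclusion is still wanted somewhere, spelling out the artifactIds avoids the '*' warning; for instance, keeping Jackson 1.8.8 out at the Avro dependency might look like the sketch below (this assumes Avro is what pulls in jackson-core-asl and jackson-mapper-asl, as noted above; the exact placement needs review):
<dependency>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro</artifactId>
  <exclusions>
    <exclusion>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
    </exclusion>
  </exclusions>
</dependency>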
NOT FIXED.
[warn] /Users/srowen/Documents/incubator-spark/streaming/src/test/scala/org/apache/spark/streaming/InputStreamsSuite.scala:305: method connect in class IOManager is deprecated: use the new implementation in package akka.io instead
[warn] override def preStart = IOManager(context.system).connect(new InetSocketAddress(port))
I'm not confident enough to fix this.
[WARNING] there were 6 feature warning(s); re-run with -feature for details
I don't know enough Scala to address these yet.
[WARNING] We have a duplicate org/yaml/snakeyaml/scanner/ScannerImpl$Chomping.class in /Users/srowen/.m2/repository/org/yaml/snakeyaml/1.6/snakeyaml-1.6.jar
Probably addressable by being more careful about how the binaries are packaged, though this appears to be ignorable; two identical copies of the class are colliding.
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
and
[WARNING] JAR will be empty - no content was marked for inclusion!
Apparently harmless warnings, but I don't know how to disable them.