Details
- Type: Test
- Status: Resolved
- Priority: Major
- Resolution: Done
Description
- FileSuite
[info] - binary file input as byte array *** FAILED *** (500 milliseconds)
[info] "file:/C:/projects/spark/target/tmp/spark-e7c3a3b8-0a4b-4a7f-9ebe-7c4883e48624/record-bytestream-00000.bin" did not contain "C:\projects\spark\target\tmp\spark-e7c3a3b8-0a4b-4a7f-9ebe-7c4883e48624\record-bytestream-00000.bin" (FileSuite.scala:258)
[info] org.scalatest.exceptions.TestFailedException:
[info] at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
[info] at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:466)
[info] at org.apache.spark.FileSuite$$anonfun$14.apply$mcV$sp(FileSuite.scala:258)
[info] at org.apache.spark.FileSuite$$anonfun$14.apply(FileSuite.scala:239)
[info] at org.apache.spark.FileSuite$$anonfun$14.apply(FileSuite.scala:239)
...
[info] - Get input files via old Hadoop API *** FAILED *** (1 second, 94 milliseconds)
[info] Set("/C:/projects/spark/target/tmp/spark-cf5b1f8b-c5ed-43e0-8d17-546ebbfa8200/output/part-00000", "/C:/projects/spark/target/tmp/spark-cf5b1f8b-c5ed-43e0-8d17-546ebbfa8200/output/part-00001") did not equal Set("C:\projects\spark\target\tmp\spark-cf5b1f8b-c5ed-43e0-8d17-546ebbfa8200\output/part-00000", "C:\projects\spark\target\tmp\spark-cf5b1f8b-c5ed-43e0-8d17-546ebbfa8200\output/part-00001") (FileSuite.scala:535)
[info] org.scalatest.exceptions.TestFailedException:
[info] at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
[info] at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:466)
[info] at org.apache.spark.FileSuite$$anonfun$29.apply$mcV$sp(FileSuite.scala:535)
[info] at org.apache.spark.FileSuite$$anonfun$29.apply(FileSuite.scala:524)
[info] at org.apache.spark.FileSuite$$anonfun$29.apply(FileSuite.scala:524)
...
[info] - Get input files via new Hadoop API *** FAILED *** (313 milliseconds)
[info] Set("/C:/projects/spark/target/tmp/spark-12bc1540-1111-4df6-9c4d-79e0e614407c/output/part-00000", "/C:/projects/spark/target/tmp/spark-12bc1540-1111-4df6-9c4d-79e0e614407c/output/part-00001") did not equal Set("C:\projects\spark\target\tmp\spark-12bc1540-1111-4df6-9c4d-79e0e614407c\output/part-00000", "C:\projects\spark\target\tmp\spark-12bc1540-1111-4df6-9c4d-79e0e614407c\output/part-00001") (FileSuite.scala:549)
[info] org.scalatest.exceptions.TestFailedException:
[info] at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
[info] at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:466)
[info] at org.apache.spark.FileSuite$$anonfun$30.apply$mcV$sp(FileSuite.scala:549)
[info] at org.apache.spark.FileSuite$$anonfun$30.apply(FileSuite.scala:538)
[info] at org.apache.spark.FileSuite$$anonfun$30.apply(FileSuite.scala:538)
...
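All three FileSuite failures are the same separator mismatch: the path Spark reports comes back as a file: URI with forward slashes, while the expected value built from File.getAbsolutePath uses Windows backslashes. A minimal sketch of a normalization helper (PathComparison/sameFile are hypothetical names, not Spark API) that would make such assertions separator-agnostic:

```scala
import java.io.File
import java.net.URI

// Hypothetical helper, not part of Spark: canonicalize both sides of the
// assertion so that the "file:" scheme and Windows "\" separators no longer
// make otherwise-identical paths compare unequal.
object PathComparison {
  def sameFile(reportedUri: String, expectedLocalPath: String): Boolean = {
    val reported = new File(new URI(reportedUri)).getCanonicalPath
    val expected = new File(expectedLocalPath).getCanonicalPath
    reported == expected
  }
}
```

With something like this, sameFile("file:/C:/tmp/x.bin", "C:\\tmp\\x.bin") would hold on Windows while remaining a no-op on Unix.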
- PipedRDDSuite
[info] - pipe with empty partition *** FAILED *** (672 milliseconds)
[info] Set(0, 4, 5) did not equal Set(0, 5, 6) (PipedRDDSuite.scala:145)
[info] org.scalatest.exceptions.TestFailedException:
[info] at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
[info] at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:466)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$5.apply$mcV$sp(PipedRDDSuite.scala:145)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$5.apply(PipedRDDSuite.scala:140)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$5.apply(PipedRDDSuite.scala:140)
...
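The expected set here looks hard-coded for Unix line endings: PipedRDD writes each record to the subprocess with PrintWriter.println, which appends the platform line separator, so on Windows every non-empty byte count grows by one ("\r\n" vs "\n"), matching 4 vs 5 and 5 vs 6 above. A hedged sketch of computing the expectation per platform (the record lengths 3 and 4 are inferred from the Unix counts, not taken from the suite):

```scala
// Hypothetical reconstruction of the expected counts, assuming records of
// length 3 and 4. PipedRDD writes each record with PrintWriter.println,
// which appends the platform line separator: 1 byte on Unix, 2 on Windows.
object ExpectedPipeCounts {
  def expected: Set[Int] = {
    val sep = System.lineSeparator().length
    Set(0, 3 + sep, 4 + sep) // Set(0, 4, 5) on Unix, Set(0, 5, 6) on Windows
  }
}
```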
[info] - test pipe exports map_input_file *** FAILED *** (62 milliseconds)
[info] java.lang.IllegalStateException: Subprocess exited with status 1. Command ran: printenv map_input_file
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.hasNext(PipedRDD.scala:178)
[info] at scala.collection.Iterator$class.foreach(Iterator.scala:893)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.foreach(PipedRDD.scala:163)
[info] at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
[info] at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
[info] at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
[info] at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.to(PipedRDD.scala:163)
[info] at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.toBuffer(PipedRDD.scala:163)
[info] at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.toArray(PipedRDD.scala:163)
[info] at org.apache.spark.rdd.PipedRDDSuite.testExportInputFile(PipedRDDSuite.scala:247)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$10.apply$mcV$sp(PipedRDDSuite.scala:209)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$10.apply(PipedRDDSuite.scala:209)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$10.apply(PipedRDDSuite.scala:209)
...
[info] - test pipe exports mapreduce_map_input_file *** FAILED *** (172 milliseconds)
[info] java.lang.IllegalStateException: Subprocess exited with status 1. Command ran: printenv mapreduce_map_input_file
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.hasNext(PipedRDD.scala:178)
[info] at scala.collection.Iterator$class.foreach(Iterator.scala:893)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.foreach(PipedRDD.scala:163)
[info] at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
[info] at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
[info] at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
[info] at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.to(PipedRDD.scala:163)
[info] at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.toBuffer(PipedRDD.scala:163)
[info] at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
[info] at org.apache.spark.rdd.PipedRDD$$anon$1.toArray(PipedRDD.scala:163)
[info] at org.apache.spark.rdd.PipedRDDSuite.testExportInputFile(PipedRDDSuite.scala:247)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$11.apply$mcV$sp(PipedRDDSuite.scala:213)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$11.apply(PipedRDDSuite.scala:213)
[info] at org.apache.spark.rdd.PipedRDDSuite$$anonfun$11.apply(PipedRDDSuite.scala:213)
...
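Both of these failures come from piping the partition through printenv, a Unix command that is not part of a stock Windows environment. A hedged sketch of an OS-dependent probe command (EnvProbeCommand is a hypothetical helper, not the committed fix):

```scala
// Hypothetical helper: pick an OS-appropriate command to read one variable
// from the child process's environment, since `printenv` is Unix-only.
// rdd.pipe accepts a Seq[String], so a test could use
// rdd.pipe(EnvProbeCommand.forVariable("map_input_file")).
object EnvProbeCommand {
  def forVariable(name: String): Seq[String] = {
    val isWindows = System.getProperty("os.name").toLowerCase.startsWith("windows")
    if (isWindows) Seq("cmd.exe", "/C", s"echo %$name%")
    else Seq("printenv", name)
  }
}
```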
- TaskResultGetterSuite
[info] - handling results larger than max RPC message size *** FAILED *** (1 second, 579 milliseconds)
[info] 1 did not equal 0 Expect result to be removed from the block manager. (TaskResultGetterSuite.scala:129)
[info] org.scalatest.exceptions.TestFailedException:
[info] at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:500)
[info] at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1555)
[info] at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:466)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$4.apply$mcV$sp(TaskResultGetterSuite.scala:129)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$4.apply(TaskResultGetterSuite.scala:121)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$4.apply(TaskResultGetterSuite.scala:121)
[info] ...
[info] Cause: java.net.URISyntaxException: Illegal character in path at index 12: string:///C:\projects\spark\target\tmp\spark-93c485af-68da-440f-a907-aac7acd5fc25\repro\MyException.java
[info] at java.net.URI$Parser.fail(URI.java:2848)
[info] at java.net.URI$Parser.checkChars(URI.java:3021)
[info] at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info] at java.net.URI$Parser.parse(URI.java:3053)
[info] at java.net.URI.<init>(URI.java:588)
[info] at java.net.URI.create(URI.java:850)
[info] at org.apache.spark.TestUtils$.org$apache$spark$TestUtils$$createURI(TestUtils.scala:112)
[info] at org.apache.spark.TestUtils$JavaSourceFromString.<init>(TestUtils.scala:116)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$6.apply$mcV$sp(TaskResultGetterSuite.scala:174)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$6.apply(TaskResultGetterSuite.scala:169)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$6.apply(TaskResultGetterSuite.scala:169)
[info] at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
[info] at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
...
[info] - failed task deserialized with the correct classloader (SPARK-11195) *** FAILED *** (0 milliseconds)
[info] java.lang.IllegalArgumentException: Illegal character in path at index 12: string:///C:\projects\spark\target\tmp\spark-93c485af-68da-440f-a907-aac7acd5fc25\repro\MyException.java
[info] at java.net.URI.create(URI.java:852)
[info] at org.apache.spark.TestUtils$.org$apache$spark$TestUtils$$createURI(TestUtils.scala:112)
[info] at org.apache.spark.TestUtils$JavaSourceFromString.<init>(TestUtils.scala:116)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$6.apply$mcV$sp(TaskResultGetterSuite.scala:174)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$6.apply(TaskResultGetterSuite.scala:169)
[info] at org.apache.spark.scheduler.TaskResultGetterSuite$$anonfun$6.apply(TaskResultGetterSuite.scala:169)
...
- SparkSubmitSuite
[info] java.lang.IllegalArgumentException: Illegal character in path at index 12: string:///C:\projects\spark\target\tmp\1481210831381-0\870903339\MyLib.java
[info] at java.net.URI.create(URI.java:852)
[info] at org.apache.spark.TestUtils$.org$apache$spark$TestUtils$$createURI(TestUtils.scala:112)
[info] at org.apache.spark.TestUtils$JavaSourceFromString.<init>(TestUtils.scala:116)
[info] at org.apache.spark.deploy.IvyTestUtils$.createJavaClass(IvyTestUtils.scala:145)
[info] at org.apache.spark.deploy.IvyTestUtils$.org$apache$spark$deploy$IvyTestUtils$$createLocalRepository(IvyTestUtils.scala:302)
[info] at org.apache.spark.deploy.IvyTestUtils$.createLocalRepositoryForTests(IvyTestUtils.scala:341)
[info] at org.apache.spark.deploy.IvyTestUtils$.withRepository(IvyTestUtils.scala:368)
[info] at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$18.apply$mcV$sp(SparkSubmitSuite.scala:412)
[info] at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$18.apply(SparkSubmitSuite.scala:408)
[info] at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$18.apply(SparkSubmitSuite.scala:408)
...
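The TaskResultGetterSuite and SparkSubmitSuite stack traces share one root cause: TestUtils.createURI builds a string: URI directly from a local path, and URI.create rejects the backslashes in a Windows path (they are illegal URI characters). A minimal sketch of a backslash-tolerant variant (SafeStringUri is hypothetical; the actual fix may differ):

```scala
import java.net.URI

// Hypothetical backslash-tolerant createURI. URI.create rejects "\" outright,
// while the multi-argument URI constructor percent-encodes illegal characters
// instead of throwing.
object SafeStringUri {
  def createURI(name: String): URI = {
    try {
      URI.create(s"string:///$name") // fine for Unix-style paths
    } catch {
      case _: IllegalArgumentException =>
        new URI("string", s"///$name", null) // "\" becomes %5C on Windows paths
    }
  }
}
```

An equally valid alternative would be to replace "\" with "/" before building the URI; either way the constructor no longer throws on Windows temp paths.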
I manually disabled some tests in ShuffleSuite due to the path length limitation on Windows.
I could not run the tests any further; the excerpt below is the last output before the build died.
[warn] there were 8 deprecation warnings; re-run with -deprecation for details
[warn] one warning found
[info] Compiling 1 Scala source and 3 Java sources to C:\projects\spark\external\java8-tests\target\scala-2.11\test-classes...
[warn] C:\projects\spark\examples\src\main\scala\org\apache\spark\examples\mllib\AbstractParams.scala:41: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] val allAccessors = tpe.declarations.collect {
[warn]
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Compiling 2 Scala sources to C:\projects\spark\repl\target\scala-2.11\test-classes...
Exception in thread "ExecutorRunner for app-20161207123845-0000/62856" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:137)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:121)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:421)
at java.lang.StringBuffer.append(StringBuffer.java:272)
at org.apache.log4j.helpers.PatternParser$LiteralPatternConverter.format(PatternParser.java:419)
at org.apache.log4j.PatternLayout.format(PatternLayout.java:506)
at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:310)
at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
For the full logs, please refer to https://ci.appveyor.com/project/spark-test/spark/build/156-tmp-windows-base
Issue Links
- relates to SPARK-17591: Fix/investigate the failure of tests in Scala On Windows (Resolved)