Spark / SPARK-4326

unidoc is broken on master


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.3.0
    • Fix Version/s: 1.2.0
    • Component/s: Build, Documentation
    • Labels: None
    • Target Version/s:

      Description

      On master, `jekyll build` fails with the following errors:

      [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/AppendOnlyMap.scala:205: value hashInt is not a member of com.google.common.hash.HashFunction
      [error]   private def rehash(h: Int): Int = Hashing.murmur3_32().hashInt(h).asInt()
      [error]                                                          ^
      [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/ExternalAppendOnlyMap.scala:426: value limit is not a member of object com.google.common.io.ByteStreams
      [error]         val bufferedStream = new BufferedInputStream(ByteStreams.limit(fileStream, end - start))
      [error]                                                                  ^
      [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala:558: value limit is not a member of object com.google.common.io.ByteStreams
      [error]         val bufferedStream = new BufferedInputStream(ByteStreams.limit(fileStream, end - start))
      [error]                                                                  ^
      [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/OpenHashSet.scala:261: value hashInt is not a member of com.google.common.hash.HashFunction
      [error]   private def hashcode(h: Int): Int = Hashing.murmur3_32().hashInt(h).asInt()
      [error]                                                            ^
      [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/Utils.scala:37: type mismatch;
      [error]  found   : java.util.Iterator[T]
      [error]  required: Iterable[?]
      [error]     collectionAsScalaIterable(ordering.leastOf(asJavaIterator(input), num)).iterator
      [error]                                                              ^
      [error] /Users/meng/src/spark/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala:421: value putAll is not a member of com.google.common.cache.Cache[org.apache.hadoop.fs.FileStatus,parquet.hadoop.Footer]
      [error]           footerCache.putAll(newFooters)
      [error]                       ^
      [warn] /Users/meng/src/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/parquet/FakeParquetSerDe.scala:34: @deprecated now takes two arguments; see the scaladoc.
      [warn] @deprecated("No code should depend on FakeParquetHiveSerDe as it is only intended as a " +
      [warn]  ^
      [info] No documentation generated with unsucessful compiler run
      [warn] two warnings found
      [error] 6 errors found
      [error] (spark/scalaunidoc:doc) Scaladoc generation failed
      [error] Total time: 48 s, completed Nov 10, 2014 1:31:01 PM
      

      It doesn't happen on branch-1.2.
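For context, every missing member in the log is a Guava API that only exists in newer releases (`HashFunction.hashInt` and `Cache.putAll` appeared in Guava 12.0, `ByteStreams.limit` in 14.0), which suggests the aggregated unidoc classpath is resolving an older Guava pulled in transitively. A minimal sketch of a possible workaround, assuming sbt 0.13's `dependencyOverrides` setting and Guava 14.0.1 as the intended version (not the actual fix applied to the build):

```scala
// Hypothetical SparkBuild.scala fragment: pin Guava so that every module,
// including the aggregated scalaunidoc classpath, compiles Scaladoc against
// the same release (one that has hashInt, ByteStreams.limit, and Cache.putAll).
dependencyOverrides += "com.google.guava" % "guava" % "14.0.1"
```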

            People

            • Assignee: Xiangrui Meng (mengxr)
            • Reporter: Xiangrui Meng (mengxr)
            • Votes: 0
            • Watchers: 6
