SPARK-18437

Inconsistent mark-down for `Note:`/`NOTE:`/`Note that` across Scala/Java/R/Python in API documentation

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Done
    • Affects Version/s: 2.0.1
    • Fix Version/s: None
    • Component/s: Documentation
    • Labels:
      None

      Description

      It seems in Scala/Java,

      • Note:
      • NOTE:
      • Note that
      • '''Note:'''

      are used in a mixed way. The last one seems correct, as it renders properly in the generated Scaladoc; [1] shows an example of the correct rendering.

      Also, some '''Note:''' blocks seem to be wrongly placed[2]: they read as a Note: on the last argument, although I believe they were meant to apply to the API itself.

      For Python,

      • Note:
      • NOTE:
      • Note that
      • .. note::

      In this case, too, I believe the last one renders properly[3], while the others do not[4][5][6].
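      For illustration, here is a minimal sketch (a hypothetical function of my own, not from Spark) of a docstring that uses the Sphinx `.. note::` directive. When the docs are built with Sphinx, this form becomes a highlighted note box, whereas a plain `Note:` stays ordinary body text:

      ```python
      def describe_column(values):
          """Return basic statistics (count, min, max) for a list of numbers.

          .. note:: This function is a hypothetical example; only the
              directive syntax matters here. ``.. note::`` (two colons)
              is rendered by Sphinx as an admonition box, while a bare
              ``Note:`` in the docstring is rendered as plain text.
          """
          return {"count": len(values), "min": min(values), "max": max(values)}


      stats = describe_column([3, 1, 2])
      print(stats)  # {'count': 3, 'min': 1, 'max': 2}
      ```

      The directive must be indented consistently and followed by a blank line for Sphinx to parse it as an admonition.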

      For R, it seems there are also,

      • Note:
      • NOTE:
      • Note that
      • @note

      In the case of R, usage seems fairly consistent already: @note only records when a function was introduced (e.g. locate, since 1.5.0) and carries no other information[7]. So I am not too sure about changing R.

      It would be nicer if these were consistent, at least for Scala/Java/Python.

      [1] http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@hadoopFile[K,V,F<:org.apache.hadoop.mapred.InputFormat[K,V]](path:String)(implicitkm:scala.reflect.ClassTag[K],implicitvm:scala.reflect.ClassTag[V],implicitfm:scala.reflect.ClassTag[F]):org.apache.spark.rdd.RDD[(K,V)] (copy and paste the whole URL to check)
      [2] http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@hadoopRDD[K,V](conf:org.apache.hadoop.mapred.JobConf,inputFormatClass:Class[_<:org.apache.hadoop.mapred.InputFormat[K,V]],keyClass:Class[K],valueClass:Class[V],minPartitions:Int):org.apache.spark.rdd.RDD[(K,V)] (copy and paste the whole URL to check)
      [3] http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.describe
      [4] http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.functions.date_format
      [5] http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.functions.grouping_id
      [6] http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.head
      [7] http://spark.apache.org/docs/latest/api/R/index.html (click the locate API for an example)

            People

            • Assignee:
              Hyukjin Kwon (hyukjin.kwon)
            • Reporter:
              Hyukjin Kwon (hyukjin.kwon)
            • Votes:
              0
            • Watchers:
              3
