SPARK-9014: Allow Python spark API to use built-in exponential operator


Details

    • Type: Wish
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.4.0
    • Fix Version/s: 1.6.0
    • Component/s: PySpark
    • Labels: None

    Description

      It would be nice if instead of saying:

      import pyspark.sql.functions as funcs
      df = df.withColumn("standarderror", funcs.sqrt(df["variance"]))

      ...if I could simply say:

      df = df.withColumn("standarderror", df["variance"] ** 0.5)
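      The request amounts to supporting Python's power operator (`**`) on Column objects, which is done by defining `__pow__` and `__rpow__`. A minimal plain-Python sketch of that operator-overloading pattern, using a hypothetical `Column` stand-in rather than PySpark's actual implementation:

      ```python
      # Sketch only: a toy Column that records the expression it represents.
      # PySpark's real Column delegates to the JVM; this just shows the
      # __pow__/__rpow__ hooks that make `col ** 0.5` and `2 ** col` work.

      class Column:
          def __init__(self, expr):
              self.expr = expr

          def __pow__(self, other):
              # Handles df["variance"] ** 0.5
              return Column(f"POWER({self.expr}, {other})")

          def __rpow__(self, other):
              # Handles 2 ** df["variance"] (Column on the right-hand side)
              return Column(f"POWER({other}, {self.expr})")

      col = Column("variance")
      print((col ** 0.5).expr)  # POWER(variance, 0.5)
      print((2 ** col).expr)    # POWER(2, variance)
      ```

      Defining `__rpow__` as well as `__pow__` is what lets the operator work regardless of which side of `**` the Column appears on.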

      Attachments

        Activity

          People

            Assignee: Alexey Grishchenko (0x0fff)
            Reporter: Jon Speiser (jspeis)
            Votes: 0
            Watchers: 3
