Spark / SPARK-19217

Offer easy cast from vector to array


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Later
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: ML, PySpark, SQL
    • Labels: None

    Description

      Working with ML often means working with DataFrames with vector columns. You can't save these DataFrames to storage (edit: at least as ORC) without converting the vector columns to array columns, and there doesn't appear to be an easy way to make that conversion.

      This is a common enough problem that it is documented on Stack Overflow. The current solutions to making the conversion from a vector column to an array column are:

      1. Convert the DataFrame to an RDD and back
      2. Use a UDF
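
      The two workarounds above can be sketched roughly as follows (a hedged sketch: the DataFrame `df`, the column name `features`, and the helper names `vector_to_list` / `to_array` are assumptions for illustration, not an existing API):

      ```python
      from pyspark.sql.functions import udf
      from pyspark.sql.types import ArrayType, DoubleType

      # Approach 2: a UDF that calls Vector.toArray() and turns the
      # resulting NumPy array into a plain Python list.
      def vector_to_list(v):
          return v.toArray().tolist()

      to_array = udf(vector_to_list, ArrayType(DoubleType()))

      # Usage (assuming a DataFrame `df` with a vector column 'features'):
      # df.select(to_array('features').alias('features'))

      # Approach 1: round-trip through the RDD API, rebuilding each Row
      # with the vector column replaced by a list:
      # df.rdd.map(lambda row: row.asDict() | {
      #     'features': row.features.toArray().tolist()
      # })
      ```

      Both paths work, but each needs noticeably more ceremony than a plain `cast()` call would.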

      Both approaches work fine, but it really seems like you should be able to do something like this instead:

      (le_data
          .select(
              col('features').cast('array').alias('features')
          ))
      

      We already have an ArrayType in pyspark.sql.types, but it appears that cast() doesn't support this conversion.

      Would this be an appropriate thing to add?

      Attachments

        Issue Links

          Activity

            People

              Assignee: Unassigned
              Reporter: nchammas Nicholas Chammas
              Votes: 1
              Watchers: 7

              Dates

                Created:
                Updated:
                Resolved: