SPARK-11481

orderBy with multiple columns in WindowSpec does not work properly


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.5.1
    • Fix Version/s: 1.5.2, 1.6.0
    • Component/s: PySpark, SQL
    • Environment: All

    • Target Version/s:
    • Flags: Patch, Important

      Description

      When using multiple columns in the orderBy of a WindowSpec, the ordering appears to be applied only to the first column.

      A possible workaround is to sort the DataFrame first and then apply the window spec over the sorted DataFrame.

      e.g. (assuming import sys, from pyspark.sql import Window, and from pyspark.sql import functions as func):

      THIS DOES NOT WORK:
      window_sum = Window.partitionBy('user_unique_id').orderBy('creation_date', 'mib_id', 'day').rowsBetween(-sys.maxsize, 0)

      df = df.withColumn('user_version', func.sum(df.group_counter).over(window_sum))

      THIS WORKS WELL:
      df = df.sort('user_unique_id', 'creation_date', 'mib_id', 'day')
      window_sum = Window.partitionBy('user_unique_id').orderBy('creation_date', 'mib_id', 'day').rowsBetween(-sys.maxsize, 0)

      df = df.withColumn('user_version', func.sum(df.group_counter).over(window_sum))

      Also, can anybody confirm that this is a valid workaround?
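
      For reference, a minimal self-contained sketch of the failing case (the toy data is an illustrative assumption, not from a real dataset; it also uses the SparkSession entry point from later releases, whereas on 1.5.x a HiveContext was needed for window functions):

      import sys
      from pyspark.sql import SparkSession, Window
      from pyspark.sql import functions as func

      spark = SparkSession.builder.master('local[1]').appName('spark-11481-repro').getOrCreate()

      # Toy rows for one user: ties on creation_date that should be broken by mib_id and day.
      df = spark.createDataFrame(
          [('u1', '2015-01-01', 2, 1, 1),
           ('u1', '2015-01-01', 1, 2, 1),
           ('u1', '2015-01-02', 3, 1, 1)],
          ['user_unique_id', 'creation_date', 'mib_id', 'day', 'group_counter'])

      # Running sum per user over rows ordered by all three columns.
      window_sum = (Window.partitionBy('user_unique_id')
                    .orderBy('creation_date', 'mib_id', 'day')
                    .rowsBetween(-sys.maxsize, 0))

      # If the multi-column orderBy is honoured, user_version should increase in
      # (creation_date, mib_id, day) order rather than in creation_date order only.
      df.withColumn('user_version', func.sum(df.group_counter).over(window_sum)).show()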
