I suspect this impacts many more versions of Spark, but I don't know for sure because testing takes a long time. While doing corner-case validation testing for spark-rapids, I found that if a window function processes more than Int.MaxValue + 1 rows, the result is silently truncated to that many rows. I have only tested this on 3.0.2 with row_number, but I suspect other window functions are affected as well. This is a really rare corner case, but because it is silent data corruption I personally think it is quite serious.
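The original repro snippet isn't shown in this excerpt, so here is a minimal sketch of the kind of query described, meant to be run from spark-shell where `spark` is in scope. The exact `numRows` value and the use of a global `Window.orderBy` (no `partitionBy`) to force everything into a single window partition are my assumptions, not the reporter's exact code:

```scala
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.row_number

// Slightly more than Int.MaxValue rows, all funneled into a single
// window partition (no partitionBy), which is the assumed overflow path.
val numRows = Int.MaxValue.toLong + 100L
val windowed = spark.range(numRows)
  .withColumn("rn", row_number().over(Window.orderBy("id")))

// With the bug, this prints roughly Int.MaxValue + 1 instead of numRows.
println(windowed.count())
```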
I had to make sure I ran the above with at least 64GiB of heap for the executor. (I ran it in local mode and it worked, but it took a very long time to finish.)
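For reference, a local-mode invocation along these lines should provide that much heap; in local mode the driver and executor share one JVM, so the driver memory setting is what matters (the 64g figure comes from the report above):

```bash
spark-shell --master "local[*]" --driver-memory 64g
```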