Details
- Type: New Feature
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
Description
There are many ad-hoc code paths within Spark's codebase for identifying and comparing Spark versions. We should add a simple utility to standardize them, especially since mistakes have been made in the past; a shared utility would also let us add unit tests. This initial patch will only add methods for extracting the major and minor versions as Int types in Scala.
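To illustrate the idea, a minimal sketch of such a utility is below. The object and method names (`VersionUtils`, `majorMinorVersion`) and the exact error-handling behavior are assumptions for illustration, not the final API of the patch.

```scala
object VersionUtils {

  // Matches strings that start with "major.minor", tolerating trailing
  // segments such as a patch number or "-SNAPSHOT" suffix.
  private val majorMinorRegex = """^(\d+)\.(\d+).*""".r

  /**
   * Extracts the major and minor version numbers from a Spark version
   * string, e.g. "2.0.1" => (2, 0). Throws if the string cannot be parsed.
   */
  def majorMinorVersion(sparkVersion: String): (Int, Int) = sparkVersion match {
    case majorMinorRegex(major, minor) => (major.toInt, minor.toInt)
    case _ =>
      throw new IllegalArgumentException(
        s"Could not extract major and minor version numbers from '$sparkVersion'")
  }
}
```

Returning a typed `(Int, Int)` pair, rather than comparing raw version strings, is what makes these code paths unit-testable and removes the string-parsing mistakes mentioned above.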
Attachments
Issue Links
- blocks
  - SPARK-17462 Check for places within MLlib which should use VersionUtils to parse Spark version strings (Resolved)
- is required by
  - SPARK-16240 model loading backward compatibility for ml.clustering.LDA (Resolved)
- links to