Details
- Type: Improvement
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Version: 3.5.0
Description
PySpark requires users to have a compatible JDK installed locally (JDK 8+ for Spark < 4; JDK 17+ for Spark >= 4).
We can make the Spark installation script install the JDK automatically, so users don't need to do this step manually.
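As a rough illustration of that requirement (not PySpark code; the helper name is hypothetical), the version mapping stated above is simply:

```python
# Hypothetical helper, for illustration only: the minimum JDK major version
# required by a given Spark major version, as described in this issue.
def minimum_jdk(spark_major_version: int) -> int:
    return 17 if spark_major_version >= 4 else 8
```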
Details
- When the entry point for a Spark class is invoked, the spark-class script checks whether Java is installed in the user's environment.
- If Java is not installed, the user is prompted to choose whether to install JDK 17.
- If the user selects yes, JDK 17 is installed (using the install-jdk library), and the JAVA_HOME environment variable and RUNNER are set accordingly, so Spark now works as expected (see the sketch after this list).
- If the user selects no, we give them a brief description of how to install the JDK manually.
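A minimal sketch of this check-prompt-install flow, assuming the install-jdk package's jdk.install() API; the ensure_java helper, the prompt wording, and the error message are illustrative and not the actual Spark implementation:

```python
# Sketch only: check for Java, optionally install JDK 17 via install-jdk,
# and set JAVA_HOME so spark-class can resolve its RUNNER (the java binary).
import os
import shutil

import jdk  # provided by the install-jdk package


def ensure_java(version: str = "17") -> str:
    """Return a path to a usable `java` executable, installing a JDK if needed."""
    java = shutil.which("java")
    if java is not None:
        return java  # Java is already available on PATH

    answer = input(f"Java not found. Install JDK {version} now? [y/N] ").strip().lower()
    if answer != "y":
        raise RuntimeError(
            "Java is required. Install a JDK manually and set JAVA_HOME "
            "before launching Spark."
        )

    # install-jdk downloads a JDK and returns the installation directory.
    java_home = jdk.install(version)
    os.environ["JAVA_HOME"] = java_home

    # spark-class uses $JAVA_HOME/bin/java as RUNNER when JAVA_HOME is set.
    return os.path.join(java_home, "bin", "java")
```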