Details
- Type: New Feature
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
The goal is to distribute PySpark via PyPI, so that users can run Spark on a single node simply with "pip install pyspark" (or "pip install apache-spark").
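For illustration, a minimal single-node check of the kind this would enable (a sketch, assuming the pip package ships the standard pyspark modules and that local mode works out of the box; the app name is arbitrary):

    # Assumes the package was installed with: pip install pyspark
    from pyspark.sql import SparkSession

    # Start Spark in local mode, using all cores on the single node.
    spark = (SparkSession.builder
             .master("local[*]")
             .appName("pyspark-pip-smoke-test")
             .getOrCreate())

    # Sanity check that the pip-installed package actually runs jobs.
    print(spark.range(100).count())  # expected: 100
    spark.stop()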
Sub-Tasks
1. Add a pip installer for PySpark | Resolved | Holden Karau
2. Sign pip artifacts | Resolved | Holden Karau
3. Add support for publishing to PyPI | Resolved | Holden Karau
4. Build only a single pip package | Resolved | Reynold Xin
5. Include the example data and third-party licenses in pyspark package | Resolved | Shuai Lin
6. Upload to PyPi | Resolved | Holden Karau