Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Won't Fix
- Affects Versions: 2.2.1, 2.3.0, 2.4.0
- Fix Version: None
- Component: None
Description
After each Spark release (which normally bundles a slightly newer version of py4j), we have to adjust our PySpark applications' PYTHONPATH to point to the correct py4j source zip: for example, from python/py4j-src-0.9.2.zip to python/py4j-src-0.9.6.zip, then to something else in the next release, and so on.
Possible solutions. It would be great to either
- rename `python/py4j-src-0.x.y.zip` to `python/py4j-src-latest.zip` or `python/py4j-src-current.zip`,
- or ship a symlink `py4j-src-current.zip` in the Spark distribution pointing to whatever py4j version Spark ships with.
Either way, if this were solved, we wouldn't have to adjust PYTHONPATH during upgrades such as Spark 2.2 to 2.3.
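Until something like this lands, one workaround is to resolve the bundled py4j zip by glob at startup instead of hardcoding the version. A minimal sketch (the `find_py4j_zip` helper and the path pattern `python/py4j-src-*.zip` follow the naming used in this report and are illustrative, not Spark API):

```python
import glob
import os
from typing import Optional

def find_py4j_zip(spark_home: str) -> Optional[str]:
    """Return the bundled py4j source zip under SPARK_HOME, or None.

    Globbing avoids hardcoding the py4j version, so PYTHONPATH setup
    survives a Spark upgrade. Note: lexicographic sort picks the "newest"
    only for same-width version strings (e.g. 0.9.10 sorts before 0.9.2);
    a packaged Spark normally bundles exactly one py4j zip anyway.
    """
    pattern = os.path.join(spark_home, "python", "py4j-src-*.zip")
    candidates = sorted(glob.glob(pattern))
    return candidates[-1] if candidates else None
```

A launcher script could then prepend `find_py4j_zip(os.environ["SPARK_HOME"])` to `sys.path` (or export it into PYTHONPATH) without ever naming the version.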
Thanks.