Details
- Type: Bug
- Status: Resolved
- Priority: Critical
- Resolution: Fixed
- Fix Version/s: 1.0.0
Description
To run Spark in environments with strict firewall rules, you need to be able to specify every port used between all parts of the stack.
Per the network activity section of the docs, most of the ports are configurable, but a few are not.
We need to make every remaining port configurable so that Spark can run in highly locked-down environments.
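For reference, once every port is configurable, a locked-down deployment could pin each one in spark-defaults.conf. A minimal sketch, assuming the port property names from the Spark 1.x configuration docs (spark.fileserver.port and spark.broadcast.port are the ones introduced by the linked issues; the specific port numbers are illustrative):

```
# spark-defaults.conf — pin every communication port so firewall
# rules can be written against a fixed set of ports.
spark.driver.port             7001
spark.fileserver.port         7002   # added by SPARK-1174
spark.broadcast.port          7003   # added by SPARK-1176
spark.replClassServer.port    7004
spark.blockManager.port       7005
spark.executor.port           7006
spark.ui.port                 4040
```

With these set, the firewall only needs to allow the listed ports between driver, master, and worker hosts, instead of a wide ephemeral range.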
Attachments
Issue Links
- is blocked by
  - SPARK-1176 Adding port configuration for HttpBroadcast (Resolved)
  - SPARK-1174 Adding port configuration for HttpFileServer (Resolved)
- relates to
  - SPARK-4837 NettyBlockTransferService does not abide by spark.blockManager.port config option (Resolved)
  - SPARK-1174 Adding port configuration for HttpFileServer (Resolved)