It is currently possible to configure spark.shuffle.service.port, but there is no equivalent setting for the host. Imagine you are running an external shuffle service in Docker: if the container is not using host networking mode, you may want executors to connect to the external shuffle service via the container's internal IP address.
You could also be using Calico, or the shuffle service could be bound to a different IP than the one used by the Spark executor (for example, on hosts with multiple network interfaces).
This is why I implemented the spark.shuffle.service.host configuration.
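As a sketch of how this might be used, assuming the proposed property name spark.shuffle.service.host and a hypothetical container IP, an executor could be pointed at a non-local shuffle service like this:

```
# spark-defaults.conf (sketch; 172.17.0.5 is a hypothetical Docker bridge IP)
spark.shuffle.service.enabled  true
spark.shuffle.service.host     172.17.0.5
spark.shuffle.service.port     7337
```

Without the host setting, the executor can only reach a shuffle service on its own node address, which breaks in bridged-network Docker setups like the one described above.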