Details
Type: New Feature
Status: Open
Priority: Major
Resolution: Unresolved
Description
There is no direct support for shipping extra local files/jars to the JobManager/TaskManager classpaths besides the application jar.
Users currently have to either build their own Docker image to bake in the extra dependencies, or upload them to a remote file system first.
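For reference, the first workaround looks roughly like the following custom image (a minimal sketch; the base image tag, file names, and the use of /opt/flink/usrlib for the user classpath are illustrative assumptions, not part of this proposal):

```dockerfile
# Illustrative base image tag; pick the Flink version you deploy
FROM flink:1.16

# Jars placed under /opt/flink/usrlib end up on the user classpath
COPY my-udf-library.jar /opt/flink/usrlib/

# Extra non-jar resources the job reads at runtime (hypothetical file)
COPY lookup-data.csv /opt/flink/usrlib/
```

The resulting image is then referenced via -Dkubernetes.container.image, which couples dependency changes to an image rebuild; the requested --files/--jars support would remove that rebuild step.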
Features similar to Spark's `spark.files`, `spark.jars`, and `spark.archives` (https://spark.apache.org/docs/latest/configuration.html) would be very helpful here.
From the client side, it could look like:
$ ./bin/flink run-application \
    --target kubernetes-application \
    -Dkubernetes.cluster-id=my-first-application-cluster \
    -Dkubernetes.container.image=custom-image-name \
    --files local:///local/file/path \
    --jars local:///local/jar/path \
    local:///opt/flink/usrlib/my-flink-job.jar