Details
- Type: Sub-task
- Status: Closed
- Priority: Major
- Resolution: Resolved
- Labels: None
Description
Currently, the following exception is thrown when submitting a Python UDF job to a YARN cluster:
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.io.IOException: Cannot run program "null/bin/pyflink-udf-runner.sh": error=2, No such file or directory
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4966)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:211)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:202)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:185)
at org.apache.flink.python.AbstractPythonFunctionRunner.open(AbstractPythonFunctionRunner.java:171)
at org.apache.flink.table.runtime.operators.python.AbstractPythonScalarFunctionOperator$ProjectUdfInputPythonScalarFunctionRunner.open(AbstractPythonScalarFunctionOperator.java:177)
at org.apache.flink.streaming.api.operators.python.AbstractPythonFunctionOperator.open(AbstractPythonFunctionOperator.java:114)
at org.apache.flink.table.runtime.operators.python.AbstractPythonScalarFunctionOperator.open(AbstractPythonScalarFunctionOperator.java:137)
at org.apache.flink.table.runtime.operators.python.PythonScalarFunctionOperator.open(PythonScalarFunctionOperator.java:70)
at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:565)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:412)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:696)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:521)
... 1 more
The reason is that pyflink-udf-runner.sh is not submitted along with the job, so it is not available to the Python operator at runtime.
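
For reference, a minimal sketch (not taken from the issue) of the kind of Python UDF job that exercises this code path when submitted to a YARN cluster, e.g. via "flink run -m yarn-cluster -py udf_job.py". The API shown is the PyFlink Table API of this release line; the sink path and job name are hypothetical. Any pipeline that invokes a Python scalar UDF causes PythonScalarFunctionOperator to be opened on the TaskManager, which then tries to launch pyflink-udf-runner.sh and fails as shown above.

    # udf_job.py -- hypothetical example, not part of the issue report
    from pyflink.datastream import StreamExecutionEnvironment
    from pyflink.table import StreamTableEnvironment, DataTypes
    from pyflink.table.sinks import CsvTableSink
    from pyflink.table.udf import udf

    # A simple Python scalar UDF; executing it requires the Python UDF runner
    # (pyflink-udf-runner.sh) to be available on the TaskManager.
    @udf(input_types=[DataTypes.BIGINT(), DataTypes.BIGINT()],
         result_type=DataTypes.BIGINT())
    def add(i, j):
        return i + j

    env = StreamExecutionEnvironment.get_execution_environment()
    t_env = StreamTableEnvironment.create(env)
    t_env.register_function("add", add)

    # Hypothetical sink; any sink works for reproducing the failure.
    t_env.register_table_sink(
        "sink", CsvTableSink(['c'], [DataTypes.BIGINT()], '/tmp/udf_out.csv'))

    t = t_env.from_elements([(1, 2), (3, 4)], ['a', 'b'])
    t.select("add(a, b)").insert_into("sink")
    t_env.execute("python_udf_job")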