Flink / FLINK-19126

Failed to run job in yarn-cluster mode due to No Executor found.


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 1.11.1
    • Fix Version/s: None
    • Component/s: Deployment / YARN
    • Labels: None

    Description

      I've built the Flink package successfully, but when I run the command below, it fails to submit the job.

      [yanta@flink-1.11]$ bin/flink run -m yarn-cluster -p 2 -c org.apache.flink.examples.java.wordcount.WordCount examples/batch/WordCount.jar  --input hdfs:///user/yanta/aa.txt --output hdfs:///user/yanta/result.txt

      Setting HADOOP_CONF_DIR=/etc/hadoop/conf because no HADOOP_CONF_DIR or HADOOP_CLASSPATH was set.
      ------------------------------------------------------------
      The program finished with the following exception:
      java.lang.IllegalStateException: No Executor found. Please make sure to export the HADOOP_CLASSPATH environment variable or have hadoop in your classpath. For more information refer to the "Deployment & Operations" section of the official Apache Flink documentation.
          at org.apache.flink.yarn.cli.FallbackYarnSessionCli.isActive(FallbackYarnSessionCli.java:59)
          at org.apache.flink.client.cli.CliFrontend.validateAndGetActiveCommandLine(CliFrontend.java:1090)
          at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:218)
          at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:916)
          at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:992)
          at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
          at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:992)
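      The issue was closed as Not A Problem, which is consistent with what the exception message itself directs: the Flink client cannot find a YARN executor because Hadoop's classes are not on its classpath. A minimal sketch of the usual fix, assuming a standard Hadoop install where the `hadoop classpath` CLI is available and `/etc/hadoop/conf` holds the cluster configuration (adjust both for your environment):

```shell
# Make Hadoop visible to the Flink client before submitting, as the
# error message directs. The conf path below is an assumption; point
# it at your cluster's actual Hadoop configuration directory.
export HADOOP_CONF_DIR=/etc/hadoop/conf
# `hadoop classpath` prints the classpath Hadoop itself uses.
export HADOOP_CLASSPATH=$(hadoop classpath)

# Re-run the original submission once the variables are exported.
bin/flink run -m yarn-cluster -p 2 \
  -c org.apache.flink.examples.java.wordcount.WordCount \
  examples/batch/WordCount.jar \
  --input hdfs:///user/yanta/aa.txt \
  --output hdfs:///user/yanta/result.txt
```

      With HADOOP_CLASSPATH exported, the YARN executor is discovered and the fallback CLI check in FallbackYarnSessionCli no longer fails.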


People

    Assignee: Unassigned
    Reporter: Tang Yan
    Votes: 0
    Watchers: 5

Dates

    Created:
    Updated:
    Resolved: