Flink / FLINK-9893

Cannot run Flink job in the IDE when it needs more than one task slot


    Details

    • Type: Wish
    • Status: Closed
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.5.1
    • Fix Version/s: 1.5.1
    • Component/s: None
    • Labels: None

      Description

      The problem is that I cannot run the job in the IDE when it requires more than one task slot.

      import java.util.Properties;

      import org.apache.flink.api.common.serialization.SimpleStringSchema;
      import org.apache.flink.streaming.api.datastream.DataStream;
      import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
      import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;

      public class StreamingJob {

          public static void main(String[] args) throws Exception {
              // Set up the streaming execution environment.
              final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
              env.setParallelism(1);

              // Kafka consumer configuration.
              Properties kafkaProperties = new Properties();
              kafkaProperties.setProperty("bootstrap.servers", "localhost:9092");
              kafkaProperties.setProperty("group.id", "test");

              // Putting the source and the sink into different slot sharing groups
              // forces them into separate slots, so this job needs two slots in total.
              DataStream<String> kafkaSource = env
                      .addSource(new FlinkKafkaConsumer010<>("flink-source", new SimpleStringSchema(), kafkaProperties))
                      .name("Kafka-Source")
                      .slotSharingGroup("Kafka-Source");

              kafkaSource.print().slotSharingGroup("Print");

              env.execute("Flink Streaming Java API Skeleton");
          }
      }
      

      I know this job needs two slots, and in a Flink cluster I could provide two task managers, but how can I run it locally in the IDE?

      Currently, to fit the job into one slot locally, I have to specify the same slotSharingGroup name for all operators (as sketched below), but that is not flexible.
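
      A minimal sketch of that workaround, reusing the job above: pinning both operators to the same group ("default" is the group operators use when none is set) keeps the whole pipeline in a single slot.

          // Workaround: one shared slot sharing group, so the job fits into a single slot.
          DataStream<String> kafkaSource = env
                  .addSource(new FlinkKafkaConsumer010<>("flink-source", new SimpleStringSchema(), kafkaProperties))
                  .name("Kafka-Source")
                  .slotSharingGroup("default");

          kafkaSource.print().slotSharingGroup("default");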

      How do you handle it?
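
      One way to handle this in the IDE is to build a local environment whose embedded mini cluster offers more than one task slot, via taskmanager.numberOfTaskSlots. A minimal sketch, assuming Flink 1.5.x APIs (the class name LocalMultiSlotJob is made up for illustration):

      import org.apache.flink.configuration.Configuration;
      import org.apache.flink.configuration.TaskManagerOptions;
      import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

      public class LocalMultiSlotJob {

          public static void main(String[] args) throws Exception {
              // Give the embedded (local) task manager two slots instead of the default.
              Configuration config = new Configuration();
              config.setInteger(TaskManagerOptions.NUM_TASK_SLOTS, 2);

              // createLocalEnvironment starts a mini cluster inside the IDE process
              // using the given configuration.
              final StreamExecutionEnvironment env =
                      StreamExecutionEnvironment.createLocalEnvironment(1, config);

              // ... build the two-slot pipeline exactly as in the description ...

              env.execute("Local job with two task slots");
          }
      }

      With two slots available, each of the two slot sharing groups from the description can be scheduled into its own slot, so no operators need to be regrouped.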



    People

    • Assignee: Unassigned
    • Reporter: Aleksandr Filichkin
    • Votes: 0
    • Watchers: 1
