Description
If 'spark.streaming.concurrentJobs' is set to a value greater than one, and 'spark.executor.cores' is also greater than one, then two or more tasks in the same executor may use the same cached Kafka consumer at the same time, which throws an exception: "KafkaConsumer is not safe for multi-threaded access". A configuration that can trigger this is sketched below.
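The following is a minimal sketch (not taken from this report) of a DStream job with both settings enabled, using the standard spark-streaming-kafka-0-10 direct stream API. The broker address, topic name, and group id are hypothetical placeholders; the slow per-record work is only there so that batches pile up and concurrent jobs actually overlap on the same topic-partition.
{code:scala}
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object ConcurrentJobsRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("kafka-concurrent-jobs-repro")
      // More than one output job may run per batch interval.
      .set("spark.streaming.concurrentJobs", "2")
      // More than one task slot per executor, so two tasks reading the
      // same topic-partition can land in the same executor JVM and hit
      // the same cached KafkaConsumer concurrently.
      .set("spark.executor.cores", "2")

    val ssc = new StreamingContext(conf, Seconds(1))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",   // hypothetical broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "repro-group",               // hypothetical group id
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("events"), kafkaParams)  // hypothetical topic
    )

    // Slow per-record work makes batches back up, so two jobs run at
    // once and may read the same partition through the shared consumer,
    // producing "KafkaConsumer is not safe for multi-threaded access".
    stream.foreachRDD { rdd =>
      rdd.foreach { record =>
        Thread.sleep(10)
        record.value()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}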
Issue Links
- duplicates SPARK-19185 "ConcurrentModificationExceptions with CachedKafkaConsumers when Windowing" (Closed)