Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 2.3.1
- Fix Version/s: None
- Component/s: None
Description
I have a fairly simple piece of code that reads from Kafka, applies some transformations - including a UDF - and writes the result to the console. Every time a batch is created, a new Kafka consumer is created (and not closed), eventually leading to a "too many open files" error.
I created a test case, with the code available here: https://github.com/aseigneurin/spark-kafka-issue
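For illustration, the pipeline is roughly of the following shape. This is only a minimal sketch based on the description above, not the code from the linked repository: the bootstrap server address, the UDF body, and the schema handling are assumptions, and it needs the spark-sql-kafka-0-10 connector on the classpath.
{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object Consumer {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-udf-consumer")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // Read the "persons" topic as a streaming DataFrame
    val input = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "persons")
      .load()

    // A trivial UDF; applying any UDF is enough to trigger the behavior described above
    val toUpper = udf((s: String) => if (s == null) null else s.toUpperCase)

    val transformed = input
      .selectExpr("CAST(value AS STRING) AS value")
      .withColumn("value_upper", toUpper($"value"))

    // Write each micro-batch to the console; in the attached log a new Kafka
    // consumer is initialized for every batch instead of reusing a cached one
    val query = transformed.writeStream
      .format("console")
      .option("truncate", "false")
      .start()

    query.awaitTermination()
  }
}
{code}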
To reproduce:
- Start Kafka and create a topic called "persons"
- Run "Producer" to generate data
- Run "Consumer"
I am attaching the log, where you can see a new consumer being initialized for every batch.
Please note that this issue does not appear with Spark 2.2.2, nor does it appear when I don't apply the UDF.
I suspect - although I did not go far enough to confirm it - that this issue is related to the improvement made in SPARK-23623.
Attachments
Issue Links
- duplicates SPARK-24987: Kafka Cached Consumer Leaking File Descriptors (Resolved)