Details
- Type: Question
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version: 2.17.1
Description
I'm using the latest (v2.17.1) Log4j 2 dependencies in my Grails 2.5.4 application. The following dependencies are used:
- log4j-api
- log4j-core
- log4j-1.2-api
- log4j-slf4j-impl
We also have Spark dependencies in our application for generating Parquet files:

compile ("org.apache.spark:spark-core_2.12:3.1.2") { exclude group: 'org.slf4j' }
compile ("org.apache.spark:spark-sql_2.12:3.1.2") { exclude group: 'org.slf4j' }
compile ("org.apache.spark:spark-catalyst_2.12:3.1.2") { exclude group: 'org.slf4j' }
When we try to generate a Parquet file, we get the following exception:
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.execution.datasources.DataSource.providingInstance(DataSource.scala:112)
	at org.apache.spark.sql.execution.datasources.DataSource.planForWriting(DataSource.scala:567)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
	at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
	at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:874)
	at org.apache.spark.sql.DataFrameWriter$parquet.call(Unknown Source)
	. . .
Caused by: java.lang.NoClassDefFoundError: org/slf4j/bridge/SLF4JBridgeHandler
	at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat.<init>(ParquetFileFormat.scala:63)
	... 102 more
Caused by: java.lang.ClassNotFoundException: org.slf4j.bridge.SLF4JBridgeHandler
	at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 103 more
This is happening because we have excluded the org.slf4j group from the Spark dependencies as shown above: the missing class, org.slf4j.bridge.SLF4JBridgeHandler, lives in the org.slf4j:jul-to-slf4j artifact, which the group-level exclusion removes along with everything else under org.slf4j.
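One possible workaround (a sketch only, not verified against this exact Grails build, and assuming the Gradle-style dependency DSL shown above) is to keep the group-level exclusion but explicitly add back the individual SLF4J artifacts Spark needs at runtime. The version 1.7.30 below is an assumption, chosen to match the slf4j-log4j12-1.7.30.jar seen on the classpath:

```groovy
// Sketch: keep Spark's blanket org.slf4j exclusion, then re-add the
// specific SLF4J artifacts Spark itself references at runtime.
dependencies {
    compile("org.apache.spark:spark-core_2.12:3.1.2")     { exclude group: 'org.slf4j' }
    compile("org.apache.spark:spark-sql_2.12:3.1.2")      { exclude group: 'org.slf4j' }
    compile("org.apache.spark:spark-catalyst_2.12:3.1.2") { exclude group: 'org.slf4j' }

    compile 'org.slf4j:slf4j-api:1.7.30'     // SLF4J API (version assumed)
    compile 'org.slf4j:jul-to-slf4j:1.7.30'  // provides org.slf4j.bridge.SLF4JBridgeHandler
}
```

With jul-to-slf4j restored, ParquetFileFormat can load SLF4JBridgeHandler, while the log4j-slf4j-impl binding already on the classpath remains the only StaticLoggerBinder.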
When we don't exclude org.slf4j from the Spark dependencies, we keep getting warnings about multiple SLF4J bindings:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:<LOCAL_PATH>/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:<LOCAL_PATH>lib/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
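The warning names the two competing bindings: log4j-slf4j-impl-2.17.1 (wanted) and slf4j-log4j12-1.7.30 (pulled in transitively by Spark). An alternative sketch, again assuming a Gradle-style build, is to exclude only that one conflicting module rather than the whole org.slf4j group, so slf4j-api and jul-to-slf4j stay on the classpath:

```groovy
// Sketch: exclude only the old Log4j 1.x binding everywhere, so that
// log4j-slf4j-impl-2.17.1 remains the single SLF4J binding.
configurations.all {
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
}

dependencies {
    // Spark dependencies declared without the blanket org.slf4j exclusion
    compile "org.apache.spark:spark-core_2.12:3.1.2"
    compile "org.apache.spark:spark-sql_2.12:3.1.2"
    compile "org.apache.spark:spark-catalyst_2.12:3.1.2"
}
```

This should silence the multiple-bindings warning without triggering the NoClassDefFoundError, since only the duplicate StaticLoggerBinder is removed.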
Can someone confirm whether this is a configuration issue on our side or a bug?