Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 2.3.1
- Fix Version/s: None
Description
To check the event log size with the different compression codecs mentioned in the Spark documentation, set the parameters below in spark-defaults.conf for the JDBC and Job History services:
1. spark.eventLog.compress=true
2. spark.io.compression.codec=org.apache.spark.io.ZstdCompressionCodec
3. Restart the JDBC and Job History services.
4. Check the JDBC and Job History logs.
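As a sketch, the spark-defaults.conf entries from the steps above would look like this (note that Spark's documentation also lists a short alias, zstd, for this codec, which may be the safer form to configure):

```
# Sketch of the spark-defaults.conf entries used in the reproduction steps.
spark.eventLog.compress           true
spark.io.compression.codec        org.apache.spark.io.ZstdCompressionCodec
```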
The following exception is thrown:
java.lang.IllegalArgumentException: No short name for codec org.apache.spark.io.ZstdCompressionCodec.
at org.apache.spark.io.CompressionCodec$$anonfun$getShortName$2.apply(CompressionCodec.scala:94)
at org.apache.spark.io.CompressionCodec$$anonfun$getShortName$2.apply(CompressionCodec.scala:94)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.io.CompressionCodec$.getShortName(CompressionCodec.scala:94)
at org.apache.spark.SparkContext$$anonfun$9.apply(SparkContext.scala:414)
at org.apache.spark.SparkContext$$anonfun$9.apply(SparkContext.scala:414)
at scala.Option.map(Option.scala:146)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:414)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2507)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:939)
at ...
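The trace points at CompressionCodec.getShortName, which (judging from the Option.getOrElse frame) does a reverse lookup from the configured fully qualified codec class name to a registered short alias and throws when nothing matches. The sketch below illustrates that failure mode in Java; the alias table is an assumption modeled on Spark's documented short names (lz4, lzf, snappy, zstd), not Spark's actual code. One detail worth noting: Spark's Zstd codec class appears to be spelled ZStdCompressionCodec (capital S), so the value configured in the report, ZstdCompressionCodec, would find no entry in such a table, which may explain the error.

```java
import java.util.Map;

public class CodecShortName {
    // Hypothetical alias table modeled on Spark's documented short names;
    // the class names mirror org.apache.spark.io (note ZStd, capital S).
    static final Map<String, String> SHORT_NAMES = Map.of(
        "lz4",    "org.apache.spark.io.LZ4CompressionCodec",
        "lzf",    "org.apache.spark.io.LZFCompressionCodec",
        "snappy", "org.apache.spark.io.SnappyCompressionCodec",
        "zstd",   "org.apache.spark.io.ZStdCompressionCodec");

    // Reverse lookup: fully qualified class name -> short alias.
    // Throws when no alias matches, the failure mode in the trace above.
    static String getShortName(String codecName) {
        return SHORT_NAMES.entrySet().stream()
            .filter(e -> e.getValue().equals(codecName))
            .map(Map.Entry::getKey)
            .findFirst()
            .orElseThrow(() -> new IllegalArgumentException(
                "No short name for codec " + codecName + "."));
    }

    public static void main(String[] args) {
        // The correctly spelled class name resolves to its alias...
        System.out.println(getShortName("org.apache.spark.io.ZStdCompressionCodec"));
        // ...while the spelling used in the report ("Zstd") has no entry and throws.
        try {
            getShortName("org.apache.spark.io.ZstdCompressionCodec");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

If this is the cause, setting spark.io.compression.codec to the short name zstd (or the exactly spelled class name) should avoid the lookup failure.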