I am using the Hive table connector to write data from Kafka to Hive. The job submits to YARN successfully, but during execution it always fails with:
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
Before submitting, I ran "export HADOOP_CLASSPATH=`hadoop classpath`" to pick up the Hadoop dependencies.
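For reference, org.apache.hadoop.mapred.JobConf ships in the MapReduce client jar, so one way to sanity-check the exported classpath is to grep for that jar (a sketch; the jar name hadoop-mapreduce-client-core is an assumption and may differ across Hadoop distributions):

```shell
# Sketch: check whether the jar providing org.apache.hadoop.mapred.JobConf
# appears on the exported HADOOP_CLASSPATH.
# "hadoop-mapreduce-client-core" is an assumed jar name; it may vary
# between Hadoop distributions.
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | grep 'hadoop-mapreduce-client-core' \
  || echo "JobConf jar not found on HADOOP_CLASSPATH"
```

If the grep prints nothing and the fallback message appears, the class cannot be loaded no matter what the rest of the classpath contains.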
I can see in the JobManager log that the classpath already contains the Hadoop dependencies, yet the exception still occurs.
The jar I submitted contains only my own code, with no bundled dependencies; the program is expected to load its dependencies from the Hadoop classpath and from the lib directory under the Flink installation.
The attachment contains the Flink lib directory listing, the code, the jar, and the JobManager log.