2021-10-31 06:37:35,876 INFO [localhost-startStop-1] classloader.ClassLoaderUtils:55 : set sparkClassLoader :org.apache.kylin.spark.classloader.SparkClassLoader@42211d0b
2021-10-31 06:37:35,878 INFO [localhost-startStop-1] classloader.ClassLoaderUtils:75 : set originClassLoader :TomcatClassLoader context: unknown delegate: false repositories: ----------> Parent Classloader: java.net.URLClassLoader@1794d431
2021-10-31 06:37:36,227 INFO [localhost-startStop-1] common.KylinConfig:117 : Loading kylin-defaults.properties from file:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/tomcat/webapps/kylin/WEB-INF/lib/kylin-core-common-4.0.0.jar!/kylin-defaults.properties
2021-10-31 06:37:36,242 DEBUG [localhost-startStop-1] common.KylinConfig:363 : KYLIN_CONF property was not set, will seek KYLIN_HOME env variable
2021-10-31 06:37:36,242 INFO [localhost-startStop-1] common.KylinConfig:369 : Use KYLIN_HOME=/etc/hadoop/apache-kylin-4.0.0-bin-spark3
2021-10-31 06:37:36,301 INFO [localhost-startStop-1] common.KylinConfigBase:263 : Kylin Config was updated with kylin.metadata.url.identifier : kylin4
2021-10-31 06:37:36,302 INFO [localhost-startStop-1] common.KylinConfigBase:263 : Kylin Config was updated with kylin.log.spark-executor-properties-file : /etc/hadoop/apache-kylin-4.0.0-bin-spark3/conf/spark-executor-log4j.properties
2021-10-31 06:37:36,303 INFO [localhost-startStop-1] common.KylinConfig:160 : Initialized a new KylinConfig from getInstanceFromEnv : 1799516723
2021-10-31 06:37:36,928 INFO [localhost-startStop-1] core.SpringSecurityCoreVersion:75 : You are running with Spring Security Core 4.2.3.RELEASE
2021-10-31 06:37:36,931 INFO [localhost-startStop-1] config.SecurityNamespaceHandler:78 : Spring Security 'config' module version is 4.2.3.RELEASE
2021-10-31 06:37:36,946 INFO [localhost-startStop-1] method.GlobalMethodSecurityBeanDefinitionParser:176 : Using bean 'expressionHandler' as method ExpressionHandler implementation
2021-10-31 06:37:36,984 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/user/authentication*/**
2021-10-31 06:37:36,984 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'hasRole('ROLE_ADMIN')' for /api/query/runningQueries
2021-10-31 06:37:36,984 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'hasRole('ROLE_ADMIN')' for /api/query/*/stop
2021-10-31 06:37:36,984 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/query*/**
2021-10-31 06:37:36,984 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/metadata*/**
2021-10-31 06:37:36,985 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/**/metrics
2021-10-31 06:37:36,986 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'hasRole('ROLE_ADMIN')' for /api/**/jmetrics/**
2021-10-31 06:37:36,986 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/cache*/**
2021-10-31 06:37:36,986 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/streaming_coordinator/**
2021-10-31 06:37:36,986 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/service_discovery/state/is_active_job_node
2021-10-31 06:37:36,986 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'hasAnyRole('ROLE_ANALYST')' for /api/cubes/src/tables
2021-10-31 06:37:36,986 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/cubes*/**
2021-10-31 06:37:36,987 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/models*/**
2021-10-31 06:37:36,987 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/streaming*/**
2021-10-31 06:37:36,987 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/jobs/spark
2021-10-31 06:37:36,987 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/job*/**
2021-10-31 06:37:36,987 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/admin/public_config
2021-10-31 06:37:36,987 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/admin/version
2021-10-31 06:37:36,988 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/projects
2021-10-31 06:37:36,988 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'hasRole('ROLE_ADMIN')' for /api/admin*/**
2021-10-31 06:37:36,988 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'permitAll' for /api/tables/**/snapshotLocalCache/**
2021-10-31 06:37:36,988 INFO [localhost-startStop-1] http.FilterInvocationSecurityMetadataSourceParser:194 : Creating access control expression attribute 'isAuthenticated()' for /api/**
2021-10-31 06:37:37,012 INFO [localhost-startStop-1] http.HttpSecurityBeanDefinitionParser:306 : Checking sorted filter chain: [Root bean: class [org.springframework.security.web.context.SecurityContextPersistenceFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 200, Root bean: class [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 400, Root bean: class [org.springframework.security.web.header.HeaderWriterFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 500, Root bean: class [org.springframework.security.web.authentication.logout.LogoutFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 800, , order = 1200, Root bean: class [org.springframework.security.web.authentication.www.BasicAuthenticationFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 1600, Root bean: class [org.springframework.security.web.savedrequest.RequestCacheAwareFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 1700, Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=org.springframework.security.config.http.HttpConfigurationBuilder$SecurityContextHolderAwareRequestFilterBeanFactory#0; factoryMethodName=getBean; initMethodName=null; destroyMethodName=null, order = 1800, Root bean: class [org.springframework.security.web.authentication.AnonymousAuthenticationFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 2100, Root bean: class [org.springframework.security.web.session.SessionManagementFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 2200, Root bean: class [org.springframework.security.web.access.ExceptionTranslationFilter]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null, order = 2300, , order = 2400]
2021-10-31 06:37:37,840 DEBUG [localhost-startStop-1] security.PasswordPlaceholderConfigurer:174 : Loading properties file from InputStream resource [resource loaded through InputStream]
2021-10-31 06:37:38,189 INFO [localhost-startStop-1] filter.AnnotationSizeOfFilter:53 : Using regular expression provided through VM argument net.sf.ehcache.pool.sizeof.ignore.pattern for IgnoreSizeOf annotation : ^.*cache\..*IgnoreSizeOf$
2021-10-31 06:37:38,194 INFO [localhost-startStop-1] sizeof.AgentLoader:88 : Located valid 'tools.jar' at '/usr/local/java/openlogic-openjdk-8u292-b10-linux-x64/jre/../lib/tools.jar'
2021-10-31 06:37:38,205 INFO [localhost-startStop-1] sizeof.JvmInformation:446 : Detected JVM data model settings of: 64-Bit OpenJDK JVM with Compressed OOPs
2021-10-31 06:37:38,454 INFO [localhost-startStop-1] sizeof.AgentLoader:198 : Extracted agent jar to temporary file /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/ehcache-sizeof-agent2888673995472231123.jar
2021-10-31 06:37:38,454 INFO [localhost-startStop-1] sizeof.AgentLoader:138 : Trying to load agent @ /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/ehcache-sizeof-agent2888673995472231123.jar
2021-10-31 06:37:38,459 INFO [localhost-startStop-1] impl.DefaultSizeOfEngine:111 : using Agent sizeof engine
2021-10-31 06:37:39,001 INFO [localhost-startStop-1] metrics.MetricsManager:142 : Kylin metrics monitor is not enabled
2021-10-31 06:37:39,169 INFO [localhost-startStop-1] util.Version:27 : HV000001: Hibernate Validator 5.1.3.Final
2021-10-31 06:37:39,903 INFO [localhost-startStop-1] init.InitialTaskManager:40 : Kylin service is starting.....
2021-10-31 06:37:39,925 INFO [localhost-startStop-1] init.InitialSparderContext:40 : Maybe this is job node, or switch is off, do not need to start Spark, false
2021-10-31 06:37:40,021 INFO [localhost-startStop-1] persistence.ResourceStore:90 : Using metadata url kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 for resource store
2021-10-31 06:37:40,034 INFO [localhost-startStop-1] persistence.JDBCConnectionManager:92 : Connecting to Jdbc with url:jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4 by user sdash
2021-10-31 06:37:40,399 INFO [localhost-startStop-1] persistence.JDBCConnectionManager:65 : Connected to MySQL 5.6.10-log
2021-10-31 06:37:40,415 INFO [localhost-startStop-1] persistence.JDBCResourceStore:115 : Table [kylin4] already exists
2021-10-31 06:37:40,418 INFO [localhost-startStop-1] persistence.JDBCResourceStore:115 : Table [kylin4_log] already exists
2021-10-31 06:37:40,423 DEBUG [localhost-startStop-1] cachesync.CachedCrudAssist:122 : Reloading AclRecord from kylin4(key='/acl')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10
2021-10-31 06:37:40,527 DEBUG [localhost-startStop-1] cachesync.CachedCrudAssist:155 : Loaded 1 AclRecord(s) out of 1 resource with 0 errors
2021-10-31 06:37:40,564 INFO [localhost-startStop-1] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.cachesync.Broadcaster
2021-10-31 06:37:40,567 DEBUG [localhost-startStop-1] cachesync.Broadcaster:102 : 1 nodes in the cluster: [localhost:7070]
2021-10-31 06:37:40,812 INFO [localhost-startStop-1] common.KylinConfigBase:263 : Kylin Config was updated with kylin.server.cluster-name : kylin4
2021-10-31 06:37:40,812 INFO [localhost-startStop-1] service.JobService:130 : starting to initialize an instance in cluster kylin4
2021-10-31 06:37:40,816 INFO [localhost-startStop-1] service.JobService:138 : Cluster servers: [localhost:7070]
2021-10-31 06:37:40,820 INFO [localhost-startStop-1] common.KylinConfigBase:263 : Kylin Config was updated with kylin.server.cluster-name : kylin4
2021-10-31 06:37:40,926 INFO [Thread-5] server.ZooKeeperServerMain:95 : Starting server
2021-10-31 06:37:40,934 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2021-10-31 06:37:40,934 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:host.name=ip-192-168-0-29.eu-west-1.compute.internal
2021-10-31 06:37:40,934 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:java.version=1.8.0-292
2021-10-31 06:37:40,934 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:java.vendor=OpenLogic-OpenJDK
2021-10-31 06:37:40,934 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:java.home=/usr/local/java/openlogic-openjdk-8u292-b10-linux-x64/jre
2021-10-31 06:37:40,934 INFO [Thread-5] server.ZooKeeperServer:100 : Server
environment:java.class.path=/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/bin/bootstrap.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/bin/tomcat-juli.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/jasper.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-ru.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/ecj-4.4.2.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-dbcp.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-de.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-es.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/jsp-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/jasper-el.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-util.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-jdbc.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/websocket-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/kylin-spark-classloader-4.0.0.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/el-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina-ha.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/annotations-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-fr.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-ko.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-zh-CN.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina-ant.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-ja.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/servlet-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/.
./tomcat/lib/tomcat7-websocket.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina-tribes.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-coyote.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/conf:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/lib/kylin-parquet-job-4.0.0.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/lib/kylin-jdbc-4.0.0.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/lib/mysql-connector-java-5.1.47.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/ext/mysql-connector-java-5.1.47.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/ext/log4j-1.2.17.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/ext/slf4j-log4j12-1.7.30.jar:/etc/hadoop/hive-2.3.7/conf:/etc/hadoop/hive-2.3.7/lib/hive-common-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-common-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/log4j-api-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/guava-14.0.1.jar:/etc/hadoop/hive-2.3.7/lib/commons-lang-2.6.jar:/etc/hadoop/hive-2.3.7/lib/libthrift-0.9.3.jar:/etc/hadoop/hive-2.3.7/lib/httpclient-4.4.jar:/etc/hadoop/hive-2.3.7/lib/httpcore-4.4.jar:/etc/hadoop/hive-2.3.7/lib/commons-logging-1.2.jar:/etc/hadoop/hive-2.3.7/lib/commons-codec-1.4.jar:/etc/hadoop/hive-2.3.7/lib/curator-framework-2.7.1.jar:/etc/hadoop/hive-2.3.7/lib/curator-client-2.7.1.jar:/etc/hadoop/hive-2.3.7/lib/zookeeper-3.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jline-2.12.jar:/etc/hadoop/hive-2.3.7/lib/netty-3.6.2.Final.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-0.23-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/javax.inject-1.jar:/etc/hadoop/hive-2.3.7/lib/protobuf-java-2.5.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-io-2.4.jar:/etc/hadoop/hive-2.3.7/lib/jettison-1.1.jar:/etc/hadoop/hive-2.3.7/lib/activation-1.1.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-1.9.13.jar:/etc/hadoop/hive-2.3.7/lib/jackson-xc-1.9.13.jar:/etc/hadoop/hive-2.3.7/lib/jersey-guice-1.19.jar:/etc/hadoop/hive-2.3.7/lib/jersey-server-1.14.jar:/etc/hadoop/hive-2.3.7/lib/asm-3.1.jar:/et
c/hadoop/hive-2.3.7/lib/commons-compress-1.9.jar:/etc/hadoop/hive-2.3.7/lib/jersey-client-1.9.jar:/etc/hadoop/hive-2.3.7/lib/commons-cli-1.2.jar:/etc/hadoop/hive-2.3.7/lib/commons-collections-3.2.2.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-scheduler-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-storage-api-2.4.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-lang3-3.1.jar:/etc/hadoop/hive-2.3.7/lib/orc-core-1.3.4.jar:/etc/hadoop/hive-2.3.7/lib/aircompressor-0.8.jar:/etc/hadoop/hive-2.3.7/lib/slice-0.29.jar:/etc/hadoop/hive-2.3.7/lib/jol-core-0.2.jar:/etc/hadoop/hive-2.3.7/lib/geronimo-jta_1.1_spec-1.1.1.jar:/etc/hadoop/hive-2.3.7/lib/mail-1.4.1.jar:/etc/hadoop/hive-2.3.7/lib/geronimo-jaspic_1.0_spec-1.0.jar:/etc/hadoop/hive-2.3.7/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/etc/hadoop/hive-2.3.7/lib/asm-commons-3.1.jar:/etc/hadoop/hive-2.3.7/lib/asm-tree-3.1.jar:/etc/hadoop/hive-2.3.7/lib/joda-time-2.8.1.jar:/etc/hadoop/hive-2.3.7/lib/log4j-1.2-api-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/log4j-core-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/log4j-web-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/ant-1.9.1.jar:/etc/hadoop/hive-2.3.7/lib/ant-launcher-1.9.1.jar:/etc/hadoop/hive-2.3.7/lib/json-1.8.jar:/etc/hadoop/hive-2.3.7/lib/metrics-core-3.1.0.jar:/etc/hadoop/hive-2.3.7/lib/metrics-jvm-3.1.0.jar:/etc/hadoop/hive-2.3.7/lib/metrics-json-3.1.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-databind-2.6.5.jar:/etc/hadoop/hive-2.3.7/lib/jackson-annotations-2.6.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-core-2.6.5.jar:/etc/hadoop/hive-2.3.7/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:/etc/hadoop/hive-2.3.7/lib/commons-math3-3.6.1.jar:/etc/hadoop/hive-2.3.7/lib/commons-httpclient-3.0.1.jar:/etc/hadoop/hive-2.3.7/lib/avro-1.7.7.jar:/etc/hadoop/hive-2.3.7/lib/paranamer-2.3.jar:/etc/hadoop/hive-2.3.7/lib/snappy-java-1.0.5.jar:/etc/hadoop/hive-2.3.7/lib/gson-2.2.4.jar:/etc/hadoop/hive-2.3.7/lib/curator-recipes-2.7.1.jar:/etc/hadoop/hive-2.3.7/lib/jsr305-3.0.0.jar:/etc/hadoop/hive-2.3.7/lib/htrace-core-3.
1.0-incubating.jar:/etc/hadoop/hive-2.3.7/lib/hive-serde-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-service-rpc-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/jasper-compiler-5.5.23.jar:/etc/hadoop/hive-2.3.7/lib/ant-1.6.5.jar:/etc/hadoop/hive-2.3.7/lib/jasper-runtime-5.5.23.jar:/etc/hadoop/hive-2.3.7/lib/commons-el-1.0.jar:/etc/hadoop/hive-2.3.7/lib/libfb303-0.9.3.jar:/etc/hadoop/hive-2.3.7/lib/opencsv-2.3.jar:/etc/hadoop/hive-2.3.7/lib/parquet-hadoop-bundle-1.8.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-metastore-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/javolution-5.5.1.jar:/etc/hadoop/hive-2.3.7/lib/findbugs-annotations-1.3.9-1.jar:/etc/hadoop/hive-2.3.7/lib/netty-all-4.0.52.Final.jar:/etc/hadoop/hive-2.3.7/lib/jcodings-1.0.8.jar:/etc/hadoop/hive-2.3.7/lib/joni-2.1.2.jar:/etc/hadoop/hive-2.3.7/lib/bonecp-0.8.0.RELEASE.jar:/etc/hadoop/hive-2.3.7/lib/HikariCP-2.5.1.jar:/etc/hadoop/hive-2.3.7/lib/datanucleus-api-jdo-4.2.4.jar:/etc/hadoop/hive-2.3.7/lib/datanucleus-core-4.1.17.jar:/etc/hadoop/hive-2.3.7/lib/datanucleus-rdbms-4.1.19.jar:/etc/hadoop/hive-2.3.7/lib/commons-pool-1.5.4.jar:/etc/hadoop/hive-2.3.7/lib/commons-dbcp-1.4.jar:/etc/hadoop/hive-2.3.7/lib/jdo-api-3.0.1.jar:/etc/hadoop/hive-2.3.7/lib/jta-1.1.jar:/etc/hadoop/hive-2.3.7/lib/jpam-1.1.jar:/etc/hadoop/hive-2.3.7/lib/javax.jdo-3.2.0-m3.jar:/etc/hadoop/hive-2.3.7/lib/transaction-api-1.1.jar:/etc/hadoop/hive-2.3.7/lib/antlr-runtime-3.5.2.jar:/etc/hadoop/hive-2.3.7/lib/hive-testutils-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/tempus-fugit-1.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-exec-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-vector-code-gen-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/velocity-1.5.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-tez-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-client-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-common-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/ST4-4.0.4.jar:/etc/hadoop/hive-2.3.7/lib/ivy-2.4.0.jar:/etc/hadoop/hive-2.3.7/lib/groovy-all-2.4.4.jar:/etc/hadoop/hive-2.3.7/lib/eigenbase-properties-1.1.5.jar
:/etc/hadoop/hive-2.3.7/lib/janino-2.7.6.jar:/etc/hadoop/hive-2.3.7/lib/commons-compiler-2.7.6.jar:/etc/hadoop/hive-2.3.7/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/etc/hadoop/hive-2.3.7/lib/stax-api-1.0.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-service-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-server-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/slider-core-0.90.2-incubating.jar:/etc/hadoop/hive-2.3.7/lib/jcommander-1.32.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-common-2.3.7-tests.jar:/etc/hadoop/hive-2.3.7/lib/commons-math-2.2.jar:/etc/hadoop/hive-2.3.7/lib/metrics-core-2.2.0.jar:/etc/hadoop/hive-2.3.7/lib/jamon-runtime-2.3.1.jar:/etc/hadoop/hive-2.3.7/lib/disruptor-3.3.0.jar:/etc/hadoop/hive-2.3.7/lib/hive-jdbc-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-beeline-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/super-csv-2.2.0.jar:/etc/hadoop/hive-2.3.7/lib/hive-cli-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-contrib-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/java-util-0.27.10.jar:/etc/hadoop/hive-2.3.7/lib/config-magic-0.9.jar:/etc/hadoop/hive-2.3.7/lib/rhino-1.7R5.jar:/etc/hadoop/hive-2.3.7/lib/json-path-2.1.0.jar:/etc/hadoop/hive-2.3.7/lib/guice-multibindings-4.1.0.jar:/etc/hadoop/hive-2.3.7/lib/airline-0.7.jar:/etc/hadoop/hive-2.3.7/lib/jackson-dataformat-smile-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/hibernate-validator-5.1.3.Final.jar:/etc/hadoop/hive-2.3.7/lib/validation-api-1.1.0.Final.jar:/etc/hadoop/hive-2.3.7/lib/jboss-logging-3.1.3.GA.jar:/etc/hadoop/hive-2.3.7/lib/classmate-1.0.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-dbcp2-2.0.1.jar:/etc/hadoop/hive-2.3.7/lib/commons-pool2-2.2.jar:/etc/hadoop/hive-2.3.7/lib/javax.el-api-3.0.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-datatype-guava-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jdbi-2.63.1.jar:/etc/hadoop/hive-2.3.7/lib/log4j-jul-2.5.jar:/etc/hadoop/hive-2.3.7/lib/antlr4-runtime-4.5.jar:/etc/hadoop/hive-2.3.7/lib/bytebuffer-collections-0.2.5.jar:/etc/hadoop/hive-2.3.7/lib/extendedset-1.3.10.jar:/etc/hadoop/hive-2.3.7/lib/RoaringBitmap-0.5.
18.jar:/etc/hadoop/hive-2.3.7/lib/emitter-0.3.6.jar:/etc/hadoop/hive-2.3.7/lib/http-client-1.0.4.jar:/etc/hadoop/hive-2.3.7/lib/server-metrics-0.2.8.jar:/etc/hadoop/hive-2.3.7/lib/compress-lzf-1.0.3.jar:/etc/hadoop/hive-2.3.7/lib/icu4j-4.8.1.jar:/etc/hadoop/hive-2.3.7/lib/lz4-1.3.0.jar:/etc/hadoop/hive-2.3.7/lib/mapdb-1.0.8.jar:/etc/hadoop/hive-2.3.7/lib/javax.el-3.0.0.jar:/etc/hadoop/hive-2.3.7/lib/curator-x-discovery-2.11.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-json-provider-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-base-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jackson-module-jaxb-annotations-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-smile-provider-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/tesla-aether-0.0.5.jar:/etc/hadoop/hive-2.3.7/lib/okhttp-1.0.2.jar:/etc/hadoop/hive-2.3.7/lib/aether-api-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-spi-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-util-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-impl-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-connector-file-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-connector-okhttp-0.0.9.jar:/etc/hadoop/hive-2.3.7/lib/wagon-provider-api-2.4.jar:/etc/hadoop/hive-2.3.7/lib/plexus-utils-3.0.15.jar:/etc/hadoop/hive-2.3.7/lib/maven-aether-provider-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-model-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-model-builder-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/plexus-interpolation-1.19.jar:/etc/hadoop/hive-2.3.7/lib/maven-repository-metadata-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-settings-builder-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-settings-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/spymemcached-2.11.7.jar:/etc/hadoop/hive-2.3.7/lib/irc-api-1.0-0014.jar:/etc/hadoop/hive-2.3.7/lib/geoip2-0.4.0.jar:/etc/hadoop/hive-2.3.7/lib/maxminddb-0.2.0.jar:/etc/hadoop/hive-2.3.7/lib/google-http-client-jackson2-1.15.0-rc.jar:/etc/hadoop/hive-2.3.7/lib/mysql-metadata-storage-0.9.2.jar:/etc/hadoop/hive-2.3.7/lib/postgresql-metadata-storage-0.9.2.jar:/
etc/hadoop/hive-2.3.7/lib/postgresql-9.4.1208.jre7.jar:/etc/hadoop/hive-2.3.7/lib/hive-jdbc-handler-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-accumulo-handler-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-core-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-fate-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-start-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-vfs2-2.0.jar:/etc/hadoop/hive-2.3.7/lib/maven-scm-api-1.4.jar:/etc/hadoop/hive-2.3.7/lib/maven-scm-provider-svnexe-1.4.jar:/etc/hadoop/hive-2.3.7/lib/maven-scm-provider-svn-commons-1.4.jar:/etc/hadoop/hive-2.3.7/lib/regexp-1.3.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-trace-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-ext-client-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-hplsql-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/org.abego.treelayout.core-1.0.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-hcatalog-core-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-hcatalog-server-extensions-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/mysql-connector-java-5.1.47.jar:/etc/hadoop/hive-2.3.7/hcatalog/share/hcatalog/hive-hcatalog-core-2.3.7.jar::/etc/hadoop/hadoop-3.2.0/etc/hadoop:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/accessors-smart-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/asm-5.0.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/audience-annotations-0.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/avro-1.7.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-beanutils-1.9.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-codec-1.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-collections-3.2.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-configuration2-2.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-io-2.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-lang3-3.7.jar:/etc/hadoop/hadoop
-3.2.0/share/hadoop/common/lib/commons-logging-1.1.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-math3-3.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-net-3.6.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-text-1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/curator-client-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/curator-framework-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/curator-recipes-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/dnsjava-2.1.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/gson-2.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/hadoop-annotations-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/hadoop-auth-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/htrace-core4-4.1.0-incubating.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/httpclient-4.5.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/httpcore-4.4.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-annotations-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-core-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-databind-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/javax.servlet-api-3.1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jaxb-api-2.2.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jcip-annotations-1.0-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jersey-core-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common
/lib/jersey-json-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jersey-server-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jersey-servlet-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jettison-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-http-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-io-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-security-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-server-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-servlet-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-util-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-webapp-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-xml-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsch-0.1.54.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/json-smart-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsr305-3.0.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsr311-api-1.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-admin-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-client-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-common-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-core-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-crypto-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-identity-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-server-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-simplekdc-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-asn1-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-config-1.
0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-pkix-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-xdr-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/netty-3.10.5.Final.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/nimbus-jose-jwt-4.41.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/re2j-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/slf4j-api-1.7.25.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/snappy-java-1.0.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/stax2-api-3.1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/token-provider-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/woodstox-core-5.0.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/xz-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/zookeeper-3.4.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jul-to-slf4j-1.7.25.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/metrics-core-3.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/aws-java-sdk-bundle-1.11.375.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/hadoop-aws-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-common-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-common-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-nfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-kms-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-util-ajax-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/ha
doop/hdfs/lib/leveldbjni-all-1.8.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/netty-all-4.0.52.Final.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/okhttp-2.7.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/okio-1.6.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/hadoop-auth-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-codec-1.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/httpclient-4.5.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/httpcore-4.4.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/nimbus-jose-jwt-4.41.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jcip-annotations-1.0-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/json-smart-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/accessors-smart-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/asm-5.0.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/zookeeper-3.4.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/audience-annotations-0.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/netty-3.10.5.Final.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/curator-framework-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/curator-client-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-simplekdc-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-client-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-config-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-core-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-pkix-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-asn1-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/h
adoop/hdfs/lib/kerb-common-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-crypto-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-io-2.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/token-provider-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-admin-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-server-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-identity-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-xdr-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-core-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jsr311-api-1.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-server-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/javax.servlet-api-3.1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/json-simple-1.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-server-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-http-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-util-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-io-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-webapp-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-xml-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-servlet-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-security-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/hadoop-annotations-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-math3-3.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-net-3.6.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-collections-3.2.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-servlet-1.19
.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-json-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jettison-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jaxb-impl-2.2.3-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jaxb-api-2.2.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-jaxrs-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-xc-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-beanutils-1.9.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-configuration2-2.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-lang3-3.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-text-1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/avro-1.7.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/paranamer-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/snappy-java-1.0.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-compress-1.4.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/xz-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/re2j-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/gson-2.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jsch-0.1.54.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/curator-recipes-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/htrace-core4-4.1.0-incubating.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-databind-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-annotations-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-core-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/stax2-api-3.1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/woodstox-core-5.0.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/l
ib/dnsjava-2.1.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-httpfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/lib/junit-4.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-nativetask-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-uploader-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/HikariCP-java7-2.4.12.jar
:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/ehcache-3.3.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/fst-2.50.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/guice-4.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/guice-servlet-4.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jackson-jaxrs-base-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jackson-module-jaxb-annotations-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/java-util-1.9.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jersey-client-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jersey-guice-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/json-io-2.5.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/metrics-core-3.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/mssql-jdbc-6.2.1.jre7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/objenesis-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/snakeyaml-1.16.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/swagger-annotations-1.5.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-api-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-client-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-common-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-registry-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-common-3.
2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-router-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-services-api-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-services-core-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-submarine-3.2.0.jar:
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:java.io.tmpdir=/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:java.compiler=
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:os.name=Linux
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:os.arch=amd64
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:os.version=4.14.248-189.473.amzn2.x86_64
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:user.name=ec2-user
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:user.home=/home/ec2-user
2021-10-31 06:37:40,937 INFO [Thread-5] server.ZooKeeperServer:100 : Server environment:user.dir=/etc/hadoop
2021-10-31 06:37:40,943 INFO [Thread-5] server.ZooKeeperServer:755 : tickTime set to 3000
2021-10-31 06:37:40,943 INFO [Thread-5] server.ZooKeeperServer:764 : minSessionTimeout set to -1
2021-10-31 06:37:40,943 INFO [Thread-5] server.ZooKeeperServer:773 : maxSessionTimeout set to -1
2021-10-31 06:37:40,957 INFO [Thread-5] server.NIOServerCnxnFactory:94 : binding to port 0.0.0.0/0.0.0.0:12181
2021-10-31 06:37:41,985 ERROR [localhost-startStop-1] util.ZKUtil:132 : Started zk testing server.
2021-10-31 06:37:41,991 INFO [localhost-startStop-1] util.ZKUtil:181 : zookeeper connection string: localhost:12181 with namespace /kylin/kylin4
2021-10-31 06:37:42,055 INFO [localhost-startStop-1] imps.CuratorFrameworkImpl:235 : Starting
2021-10-31 06:37:42,061 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2021-10-31 06:37:42,061 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:host.name=ip-192-168-0-29.eu-west-1.compute.internal
2021-10-31 06:37:42,062 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:java.version=1.8.0-292
2021-10-31 06:37:42,062 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:java.vendor=OpenLogic-OpenJDK
2021-10-31 06:37:42,062 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:java.home=/usr/local/java/openlogic-openjdk-8u292-b10-linux-x64/jre
2021-10-31 06:37:42,062 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client 
environment:java.class.path=/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/bin/bootstrap.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/bin/tomcat-juli.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/jasper.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-ru.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/ecj-4.4.2.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-dbcp.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-de.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-es.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/jsp-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/jasper-el.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-util.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-jdbc.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/websocket-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/kylin-spark-classloader-4.0.0.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/el-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina-ha.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/annotations-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-fr.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-ko.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-zh-CN.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina-ant.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-i18n-ja.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/servlet-api.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/.
./tomcat/lib/tomcat7-websocket.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/catalina-tribes.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/lib/tomcat-coyote.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/conf:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/lib/kylin-parquet-job-4.0.0.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/lib/kylin-jdbc-4.0.0.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/lib/mysql-connector-java-5.1.47.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/ext/mysql-connector-java-5.1.47.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/ext/log4j-1.2.17.jar:/etc/hadoop/apache-kylin-4.0.0-bin-spark3/ext/slf4j-log4j12-1.7.30.jar:/etc/hadoop/hive-2.3.7/conf:/etc/hadoop/hive-2.3.7/lib/hive-common-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-common-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/log4j-api-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/guava-14.0.1.jar:/etc/hadoop/hive-2.3.7/lib/commons-lang-2.6.jar:/etc/hadoop/hive-2.3.7/lib/libthrift-0.9.3.jar:/etc/hadoop/hive-2.3.7/lib/httpclient-4.4.jar:/etc/hadoop/hive-2.3.7/lib/httpcore-4.4.jar:/etc/hadoop/hive-2.3.7/lib/commons-logging-1.2.jar:/etc/hadoop/hive-2.3.7/lib/commons-codec-1.4.jar:/etc/hadoop/hive-2.3.7/lib/curator-framework-2.7.1.jar:/etc/hadoop/hive-2.3.7/lib/curator-client-2.7.1.jar:/etc/hadoop/hive-2.3.7/lib/zookeeper-3.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jline-2.12.jar:/etc/hadoop/hive-2.3.7/lib/netty-3.6.2.Final.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-0.23-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/javax.inject-1.jar:/etc/hadoop/hive-2.3.7/lib/protobuf-java-2.5.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-io-2.4.jar:/etc/hadoop/hive-2.3.7/lib/jettison-1.1.jar:/etc/hadoop/hive-2.3.7/lib/activation-1.1.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-1.9.13.jar:/etc/hadoop/hive-2.3.7/lib/jackson-xc-1.9.13.jar:/etc/hadoop/hive-2.3.7/lib/jersey-guice-1.19.jar:/etc/hadoop/hive-2.3.7/lib/jersey-server-1.14.jar:/etc/hadoop/hive-2.3.7/lib/asm-3.1.jar:/et
c/hadoop/hive-2.3.7/lib/commons-compress-1.9.jar:/etc/hadoop/hive-2.3.7/lib/jersey-client-1.9.jar:/etc/hadoop/hive-2.3.7/lib/commons-cli-1.2.jar:/etc/hadoop/hive-2.3.7/lib/commons-collections-3.2.2.jar:/etc/hadoop/hive-2.3.7/lib/hive-shims-scheduler-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-storage-api-2.4.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-lang3-3.1.jar:/etc/hadoop/hive-2.3.7/lib/orc-core-1.3.4.jar:/etc/hadoop/hive-2.3.7/lib/aircompressor-0.8.jar:/etc/hadoop/hive-2.3.7/lib/slice-0.29.jar:/etc/hadoop/hive-2.3.7/lib/jol-core-0.2.jar:/etc/hadoop/hive-2.3.7/lib/geronimo-jta_1.1_spec-1.1.1.jar:/etc/hadoop/hive-2.3.7/lib/mail-1.4.1.jar:/etc/hadoop/hive-2.3.7/lib/geronimo-jaspic_1.0_spec-1.0.jar:/etc/hadoop/hive-2.3.7/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/etc/hadoop/hive-2.3.7/lib/asm-commons-3.1.jar:/etc/hadoop/hive-2.3.7/lib/asm-tree-3.1.jar:/etc/hadoop/hive-2.3.7/lib/joda-time-2.8.1.jar:/etc/hadoop/hive-2.3.7/lib/log4j-1.2-api-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/log4j-core-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/log4j-web-2.6.2.jar:/etc/hadoop/hive-2.3.7/lib/ant-1.9.1.jar:/etc/hadoop/hive-2.3.7/lib/ant-launcher-1.9.1.jar:/etc/hadoop/hive-2.3.7/lib/json-1.8.jar:/etc/hadoop/hive-2.3.7/lib/metrics-core-3.1.0.jar:/etc/hadoop/hive-2.3.7/lib/metrics-jvm-3.1.0.jar:/etc/hadoop/hive-2.3.7/lib/metrics-json-3.1.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-databind-2.6.5.jar:/etc/hadoop/hive-2.3.7/lib/jackson-annotations-2.6.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-core-2.6.5.jar:/etc/hadoop/hive-2.3.7/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:/etc/hadoop/hive-2.3.7/lib/commons-math3-3.6.1.jar:/etc/hadoop/hive-2.3.7/lib/commons-httpclient-3.0.1.jar:/etc/hadoop/hive-2.3.7/lib/avro-1.7.7.jar:/etc/hadoop/hive-2.3.7/lib/paranamer-2.3.jar:/etc/hadoop/hive-2.3.7/lib/snappy-java-1.0.5.jar:/etc/hadoop/hive-2.3.7/lib/gson-2.2.4.jar:/etc/hadoop/hive-2.3.7/lib/curator-recipes-2.7.1.jar:/etc/hadoop/hive-2.3.7/lib/jsr305-3.0.0.jar:/etc/hadoop/hive-2.3.7/lib/htrace-core-3.
1.0-incubating.jar:/etc/hadoop/hive-2.3.7/lib/hive-serde-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-service-rpc-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/jasper-compiler-5.5.23.jar:/etc/hadoop/hive-2.3.7/lib/ant-1.6.5.jar:/etc/hadoop/hive-2.3.7/lib/jasper-runtime-5.5.23.jar:/etc/hadoop/hive-2.3.7/lib/commons-el-1.0.jar:/etc/hadoop/hive-2.3.7/lib/libfb303-0.9.3.jar:/etc/hadoop/hive-2.3.7/lib/opencsv-2.3.jar:/etc/hadoop/hive-2.3.7/lib/parquet-hadoop-bundle-1.8.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-metastore-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/javolution-5.5.1.jar:/etc/hadoop/hive-2.3.7/lib/findbugs-annotations-1.3.9-1.jar:/etc/hadoop/hive-2.3.7/lib/netty-all-4.0.52.Final.jar:/etc/hadoop/hive-2.3.7/lib/jcodings-1.0.8.jar:/etc/hadoop/hive-2.3.7/lib/joni-2.1.2.jar:/etc/hadoop/hive-2.3.7/lib/bonecp-0.8.0.RELEASE.jar:/etc/hadoop/hive-2.3.7/lib/HikariCP-2.5.1.jar:/etc/hadoop/hive-2.3.7/lib/datanucleus-api-jdo-4.2.4.jar:/etc/hadoop/hive-2.3.7/lib/datanucleus-core-4.1.17.jar:/etc/hadoop/hive-2.3.7/lib/datanucleus-rdbms-4.1.19.jar:/etc/hadoop/hive-2.3.7/lib/commons-pool-1.5.4.jar:/etc/hadoop/hive-2.3.7/lib/commons-dbcp-1.4.jar:/etc/hadoop/hive-2.3.7/lib/jdo-api-3.0.1.jar:/etc/hadoop/hive-2.3.7/lib/jta-1.1.jar:/etc/hadoop/hive-2.3.7/lib/jpam-1.1.jar:/etc/hadoop/hive-2.3.7/lib/javax.jdo-3.2.0-m3.jar:/etc/hadoop/hive-2.3.7/lib/transaction-api-1.1.jar:/etc/hadoop/hive-2.3.7/lib/antlr-runtime-3.5.2.jar:/etc/hadoop/hive-2.3.7/lib/hive-testutils-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/tempus-fugit-1.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-exec-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-vector-code-gen-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/velocity-1.5.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-tez-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-client-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-common-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/ST4-4.0.4.jar:/etc/hadoop/hive-2.3.7/lib/ivy-2.4.0.jar:/etc/hadoop/hive-2.3.7/lib/groovy-all-2.4.4.jar:/etc/hadoop/hive-2.3.7/lib/eigenbase-properties-1.1.5.jar
:/etc/hadoop/hive-2.3.7/lib/janino-2.7.6.jar:/etc/hadoop/hive-2.3.7/lib/commons-compiler-2.7.6.jar:/etc/hadoop/hive-2.3.7/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/etc/hadoop/hive-2.3.7/lib/stax-api-1.0.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-service-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-server-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/slider-core-0.90.2-incubating.jar:/etc/hadoop/hive-2.3.7/lib/jcommander-1.32.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-common-2.3.7-tests.jar:/etc/hadoop/hive-2.3.7/lib/commons-math-2.2.jar:/etc/hadoop/hive-2.3.7/lib/metrics-core-2.2.0.jar:/etc/hadoop/hive-2.3.7/lib/jamon-runtime-2.3.1.jar:/etc/hadoop/hive-2.3.7/lib/disruptor-3.3.0.jar:/etc/hadoop/hive-2.3.7/lib/hive-jdbc-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-beeline-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/super-csv-2.2.0.jar:/etc/hadoop/hive-2.3.7/lib/hive-cli-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-contrib-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/java-util-0.27.10.jar:/etc/hadoop/hive-2.3.7/lib/config-magic-0.9.jar:/etc/hadoop/hive-2.3.7/lib/rhino-1.7R5.jar:/etc/hadoop/hive-2.3.7/lib/json-path-2.1.0.jar:/etc/hadoop/hive-2.3.7/lib/guice-multibindings-4.1.0.jar:/etc/hadoop/hive-2.3.7/lib/airline-0.7.jar:/etc/hadoop/hive-2.3.7/lib/jackson-dataformat-smile-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/hibernate-validator-5.1.3.Final.jar:/etc/hadoop/hive-2.3.7/lib/validation-api-1.1.0.Final.jar:/etc/hadoop/hive-2.3.7/lib/jboss-logging-3.1.3.GA.jar:/etc/hadoop/hive-2.3.7/lib/classmate-1.0.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-dbcp2-2.0.1.jar:/etc/hadoop/hive-2.3.7/lib/commons-pool2-2.2.jar:/etc/hadoop/hive-2.3.7/lib/javax.el-api-3.0.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-datatype-guava-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jdbi-2.63.1.jar:/etc/hadoop/hive-2.3.7/lib/log4j-jul-2.5.jar:/etc/hadoop/hive-2.3.7/lib/antlr4-runtime-4.5.jar:/etc/hadoop/hive-2.3.7/lib/bytebuffer-collections-0.2.5.jar:/etc/hadoop/hive-2.3.7/lib/extendedset-1.3.10.jar:/etc/hadoop/hive-2.3.7/lib/RoaringBitmap-0.5.
18.jar:/etc/hadoop/hive-2.3.7/lib/emitter-0.3.6.jar:/etc/hadoop/hive-2.3.7/lib/http-client-1.0.4.jar:/etc/hadoop/hive-2.3.7/lib/server-metrics-0.2.8.jar:/etc/hadoop/hive-2.3.7/lib/compress-lzf-1.0.3.jar:/etc/hadoop/hive-2.3.7/lib/icu4j-4.8.1.jar:/etc/hadoop/hive-2.3.7/lib/lz4-1.3.0.jar:/etc/hadoop/hive-2.3.7/lib/mapdb-1.0.8.jar:/etc/hadoop/hive-2.3.7/lib/javax.el-3.0.0.jar:/etc/hadoop/hive-2.3.7/lib/curator-x-discovery-2.11.0.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-json-provider-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-base-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jackson-module-jaxb-annotations-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-smile-provider-2.4.6.jar:/etc/hadoop/hive-2.3.7/lib/tesla-aether-0.0.5.jar:/etc/hadoop/hive-2.3.7/lib/okhttp-1.0.2.jar:/etc/hadoop/hive-2.3.7/lib/aether-api-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-spi-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-util-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-impl-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-connector-file-0.9.0.M2.jar:/etc/hadoop/hive-2.3.7/lib/aether-connector-okhttp-0.0.9.jar:/etc/hadoop/hive-2.3.7/lib/wagon-provider-api-2.4.jar:/etc/hadoop/hive-2.3.7/lib/plexus-utils-3.0.15.jar:/etc/hadoop/hive-2.3.7/lib/maven-aether-provider-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-model-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-model-builder-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/plexus-interpolation-1.19.jar:/etc/hadoop/hive-2.3.7/lib/maven-repository-metadata-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-settings-builder-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/maven-settings-3.1.1.jar:/etc/hadoop/hive-2.3.7/lib/spymemcached-2.11.7.jar:/etc/hadoop/hive-2.3.7/lib/irc-api-1.0-0014.jar:/etc/hadoop/hive-2.3.7/lib/geoip2-0.4.0.jar:/etc/hadoop/hive-2.3.7/lib/maxminddb-0.2.0.jar:/etc/hadoop/hive-2.3.7/lib/google-http-client-jackson2-1.15.0-rc.jar:/etc/hadoop/hive-2.3.7/lib/mysql-metadata-storage-0.9.2.jar:/etc/hadoop/hive-2.3.7/lib/postgresql-metadata-storage-0.9.2.jar:/
etc/hadoop/hive-2.3.7/lib/postgresql-9.4.1208.jre7.jar:/etc/hadoop/hive-2.3.7/lib/hive-jdbc-handler-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-accumulo-handler-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-core-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-fate-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-start-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/commons-vfs2-2.0.jar:/etc/hadoop/hive-2.3.7/lib/maven-scm-api-1.4.jar:/etc/hadoop/hive-2.3.7/lib/maven-scm-provider-svnexe-1.4.jar:/etc/hadoop/hive-2.3.7/lib/maven-scm-provider-svn-commons-1.4.jar:/etc/hadoop/hive-2.3.7/lib/regexp-1.3.jar:/etc/hadoop/hive-2.3.7/lib/accumulo-trace-1.6.0.jar:/etc/hadoop/hive-2.3.7/lib/hive-llap-ext-client-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-hplsql-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/org.abego.treelayout.core-1.0.1.jar:/etc/hadoop/hive-2.3.7/lib/hive-hcatalog-core-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/hive-hcatalog-server-extensions-2.3.7.jar:/etc/hadoop/hive-2.3.7/lib/mysql-connector-java-5.1.47.jar:/etc/hadoop/hive-2.3.7/hcatalog/share/hcatalog/hive-hcatalog-core-2.3.7.jar::/etc/hadoop/hadoop-3.2.0/etc/hadoop:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/accessors-smart-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/asm-5.0.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/audience-annotations-0.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/avro-1.7.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-beanutils-1.9.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-cli-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-codec-1.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-collections-3.2.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-configuration2-2.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-io-2.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-lang3-3.7.jar:/etc/hadoop/hadoop
-3.2.0/share/hadoop/common/lib/commons-logging-1.1.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-math3-3.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-net-3.6.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/commons-text-1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/curator-client-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/curator-framework-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/curator-recipes-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/dnsjava-2.1.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/gson-2.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/guava-11.0.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/hadoop-annotations-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/hadoop-auth-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/htrace-core4-4.1.0-incubating.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/httpclient-4.5.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/httpcore-4.4.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-annotations-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-core-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-databind-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/javax.servlet-api-3.1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jaxb-api-2.2.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jcip-annotations-1.0-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jersey-core-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common
/lib/jersey-json-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jersey-server-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jersey-servlet-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jettison-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-http-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-io-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-security-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-server-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-servlet-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-util-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-webapp-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jetty-xml-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsch-0.1.54.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/json-smart-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsp-api-2.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsr305-3.0.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jsr311-api-1.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-admin-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-client-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-common-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-core-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-crypto-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-identity-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-server-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-simplekdc-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerb-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-asn1-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-config-1.
0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-pkix-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/kerby-xdr-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/log4j-1.2.17.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/netty-3.10.5.Final.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/nimbus-jose-jwt-4.41.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/paranamer-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/re2j-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/slf4j-api-1.7.25.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/snappy-java-1.0.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/stax2-api-3.1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/token-provider-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/woodstox-core-5.0.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/xz-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/zookeeper-3.4.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/jul-to-slf4j-1.7.25.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/metrics-core-3.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/aws-java-sdk-bundle-1.11.375.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/lib/hadoop-aws-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-common-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-common-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-nfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/common/hadoop-kms-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-util-ajax-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/ha
doop/hdfs/lib/leveldbjni-all-1.8.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/netty-all-4.0.52.Final.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/okhttp-2.7.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/okio-1.6.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/hadoop-auth-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-codec-1.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/httpclient-4.5.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/httpcore-4.4.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/nimbus-jose-jwt-4.41.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jcip-annotations-1.0-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/json-smart-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/accessors-smart-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/asm-5.0.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/zookeeper-3.4.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/audience-annotations-0.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/netty-3.10.5.Final.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/curator-framework-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/curator-client-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-simplekdc-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-client-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-config-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-core-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-pkix-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-asn1-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/h
adoop/hdfs/lib/kerb-common-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-crypto-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-io-2.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-util-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/token-provider-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-admin-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-server-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerb-identity-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/kerby-xdr-1.0.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-core-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jsr311-api-1.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-server-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/javax.servlet-api-3.1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/json-simple-1.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-server-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-http-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-util-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-io-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-webapp-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-xml-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-servlet-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jetty-security-9.3.24.v20180605.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/hadoop-annotations-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-math3-3.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-net-3.6.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-collections-3.2.2.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-servlet-1.19
.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jersey-json-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jettison-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jaxb-impl-2.2.3-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jaxb-api-2.2.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-jaxrs-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-xc-1.9.13.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-beanutils-1.9.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-configuration2-2.1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-lang3-3.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-text-1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/avro-1.7.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/paranamer-2.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/snappy-java-1.0.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/commons-compress-1.4.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/xz-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/re2j-1.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/gson-2.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jsch-0.1.54.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/curator-recipes-2.12.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/htrace-core4-4.1.0-incubating.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-databind-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-annotations-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/jackson-core-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/stax2-api-3.1.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/lib/woodstox-core-5.0.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/l
ib/dnsjava-2.1.7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-nfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-client-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-native-client-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-rbf-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/hdfs/hadoop-hdfs-httpfs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/lib/junit-4.11.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-common-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0-tests.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-nativetask-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-uploader-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/HikariCP-java7-2.4.12.jar
:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/aopalliance-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/ehcache-3.3.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/fst-2.50.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/guice-4.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/guice-servlet-4.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jackson-jaxrs-base-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jackson-module-jaxb-annotations-2.9.5.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/java-util-1.9.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/javax.inject-1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jersey-client-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/jersey-guice-1.19.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/json-io-2.5.1.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/metrics-core-3.2.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/mssql-jdbc-6.2.1.jre7.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/objenesis-1.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/snakeyaml-1.16.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/lib/swagger-annotations-1.5.4.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-api-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-client-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-common-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-registry-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-common-3.
2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-nodemanager-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-router-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-tests-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-server-web-proxy-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-services-api-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-services-core-3.2.0.jar:/etc/hadoop/hadoop-3.2.0/share/hadoop/yarn/hadoop-yarn-submarine-3.2.0.jar: 2021-10-31 06:37:42,062 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib 2021-10-31 06:37:42,063 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:java.io.tmpdir=/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp 2021-10-31 06:37:42,063 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:java.compiler= 2021-10-31 06:37:42,063 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:os.name=Linux 2021-10-31 06:37:42,063 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:os.arch=amd64 2021-10-31 06:37:42,063 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:os.version=4.14.248-189.473.amzn2.x86_64 2021-10-31 06:37:42,063 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:user.name=ec2-user 2021-10-31 06:37:42,063 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client environment:user.home=/home/ec2-user 2021-10-31 06:37:42,064 INFO [localhost-startStop-1] zookeeper.ZooKeeper:100 : Client 
environment:user.dir=/etc/hadoop 2021-10-31 06:37:42,064 INFO [localhost-startStop-1] zookeeper.ZooKeeper:438 : Initiating client connection, connectString=localhost:12181/kylin/kylin4 sessionTimeout=120000 watcher=org.apache.curator.ConnectionState@26f52eba 2021-10-31 06:37:42,095 INFO [localhost-startStop-1-SendThread(localhost:12181)] zookeeper.ClientCnxn:975 : Opening socket connection to server localhost/127.0.0.1:12181. Will not attempt to authenticate using SASL (unknown error) 2021-10-31 06:37:42,100 INFO [localhost-startStop-1] util.ZKUtil:185 : new zookeeper Client start: localhost:12181 2021-10-31 06:37:42,101 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:12181] server.NIOServerCnxnFactory:197 : Accepted socket connection from /127.0.0.1:56314 2021-10-31 06:37:42,101 INFO [localhost-startStop-1-SendThread(localhost:12181)] zookeeper.ClientCnxn:852 : Socket connection established to localhost/127.0.0.1:12181, initiating session 2021-10-31 06:37:42,110 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:12181] server.ZooKeeperServer:868 : Client attempting to establish new session at /127.0.0.1:56314 2021-10-31 06:37:42,113 INFO [SyncThread:0] persistence.FileTxnLog:199 : Creating new log file: log.1 2021-10-31 06:37:42,120 INFO [SyncThread:0] server.ZooKeeperServer:617 : Established session 0x17cd5116aed0000 with negotiated timeout 60000 for client /127.0.0.1:56314 2021-10-31 06:37:42,121 INFO [localhost-startStop-1-SendThread(localhost:12181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:12181, sessionid = 0x17cd5116aed0000, negotiated timeout = 60000 2021-10-31 06:37:42,126 INFO [localhost-startStop-1-EventThread] state.ConnectionStateManager:228 : State change: CONNECTED 2021-10-31 06:37:42,138 INFO [localhost-startStop-1] imps.CuratorFrameworkImpl:235 : Starting 2021-10-31 06:37:42,138 INFO [localhost-startStop-1] zookeeper.ZooKeeper:438 : Initiating client connection, connectString=localhost:12181 sessionTimeout=120000 
watcher=org.apache.curator.ConnectionState@1c22ef08 2021-10-31 06:37:42,142 INFO [localhost-startStop-1-SendThread(localhost:12181)] zookeeper.ClientCnxn:975 : Opening socket connection to server localhost/127.0.0.1:12181. Will not attempt to authenticate using SASL (unknown error) 2021-10-31 06:37:42,142 INFO [localhost-startStop-1-SendThread(localhost:12181)] zookeeper.ClientCnxn:852 : Socket connection established to localhost/127.0.0.1:12181, initiating session 2021-10-31 06:37:42,142 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:12181] server.NIOServerCnxnFactory:197 : Accepted socket connection from /127.0.0.1:56316 2021-10-31 06:37:42,143 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:12181] server.ZooKeeperServer:868 : Client attempting to establish new session at /127.0.0.1:56316 2021-10-31 06:37:42,144 INFO [SyncThread:0] server.ZooKeeperServer:617 : Established session 0x17cd5116aed0001 with negotiated timeout 60000 for client /127.0.0.1:56316 2021-10-31 06:37:42,144 INFO [localhost-startStop-1-SendThread(localhost:12181)] zookeeper.ClientCnxn:1235 : Session establishment complete on server localhost/127.0.0.1:12181, sessionid = 0x17cd5116aed0001, negotiated timeout = 60000 2021-10-31 06:37:42,145 INFO [localhost-startStop-1-EventThread] state.ConnectionStateManager:228 : State change: CONNECTED 2021-10-31 06:37:42,152 INFO [ProcessThread(sid:0 cport:-1):] server.PrepRequestProcessor:645 : Got user-level KeeperException when processing sessionid:0x17cd5116aed0001 type:create cxid:0x1 zxid:0x3 txntype:-1 reqpath:n/a Error Path:/kylin Error:KeeperErrorCode = NoNode for /kylin 2021-10-31 06:37:42,158 INFO [Curator-Framework-0] imps.CuratorFrameworkImpl:821 : backgroundOperationsLoop exiting 2021-10-31 06:37:42,161 INFO [ProcessThread(sid:0 cport:-1):] server.PrepRequestProcessor:494 : Processed session termination for sessionid: 0x17cd5116aed0001 2021-10-31 06:37:42,162 INFO [localhost-startStop-1] zookeeper.ZooKeeper:684 : Session: 0x17cd5116aed0001 closed 
2021-10-31 06:37:42,163 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:12181] server.NIOServerCnxn:1007 : Closed socket connection for client /127.0.0.1:56316 which had sessionid 0x17cd5116aed0001 2021-10-31 06:37:42,163 INFO [localhost-startStop-1-EventThread] zookeeper.ClientCnxn:512 : EventThread shut down 2021-10-31 06:37:42,166 INFO [localhost-startStop-1] threadpool.DefaultScheduler:141 : Initializing Job Engine .... 2021-10-31 06:37:42,166 DEBUG [localhost-startStop-1] zookeeper.ZookeeperDistributedLock:101 : 2385@ip-192-168-0-29.eu-west-1.compute.internal trying to lock /job_engine/global_job_engine_lock 2021-10-31 06:37:42,167 INFO [ProcessThread(sid:0 cport:-1):] server.PrepRequestProcessor:645 : Got user-level KeeperException when processing sessionid:0x17cd5116aed0000 type:create cxid:0x2 zxid:0x7 txntype:-1 reqpath:n/a Error Path:/kylin/kylin4/job_engine Error:KeeperErrorCode = NoNode for /kylin/kylin4/job_engine 2021-10-31 06:37:42,176 INFO [localhost-startStop-1] zookeeper.ZookeeperDistributedLock:114 : 2385@ip-192-168-0-29.eu-west-1.compute.internal acquired lock at /job_engine/global_job_engine_lock 2021-10-31 06:37:42,177 INFO [localhost-startStop-1] threadpool.DefaultScheduler:162 : Starting resume all running jobs. 
2021-10-31 06:37:42,181 INFO [localhost-startStop-1] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.job.execution.ExecutableManager 2021-10-31 06:37:42,182 INFO [localhost-startStop-1] execution.ExecutableManager:92 : Using metadata url: kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:42,185 INFO [localhost-startStop-1] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.job.dao.ExecutableDao 2021-10-31 06:37:42,185 INFO [localhost-startStop-1] dao.ExecutableDao:83 : Using metadata url: kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:42,186 DEBUG [localhost-startStop-1] cachesync.CachedCrudAssist:122 : Reloading ExecutablePO from kylin4(key='/execute')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:42,192 DEBUG [localhost-startStop-1] cachesync.CachedCrudAssist:155 : Loaded 1 ExecutablePO(s) out of 1 resource with 0 errors 2021-10-31 06:37:42,193 DEBUG [localhost-startStop-1] dao.ExecutableDao:126 : Reloading execute_output from /execute_output 2021-10-31 06:37:42,203 DEBUG [localhost-startStop-1] dao.ExecutableDao:137 : Loaded 1 execute_output digest(s) out of 3 resource 2021-10-31 06:37:42,215 INFO [localhost-startStop-1] threadpool.DefaultScheduler:165 : Finishing resume all running jobs. 
2021-10-31 06:37:42,216 INFO [localhost-startStop-1] threadpool.DefaultScheduler:169 : Fetching jobs every 30 seconds 2021-10-31 06:37:42,217 INFO [localhost-startStop-1] threadpool.DefaultScheduler:179 : Creating fetcher pool instance:1752764464 2021-10-31 06:37:42,333 INFO [localhost-startStop-1] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.badquery.BadQueryHistoryManager 2021-10-31 06:37:42,333 INFO [localhost-startStop-1] badquery.BadQueryHistoryManager:51 : Initializing BadQueryHistoryManager with config kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:42,935 INFO [localhost-startStop-1] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.rest.security.KylinUserManager 2021-10-31 06:37:42,935 INFO [localhost-startStop-1] security.KylinUserManager:60 : Initializing KylinUserManager with config kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:42,935 DEBUG [localhost-startStop-1] cachesync.CachedCrudAssist:122 : Reloading ManagedUser from kylin4(key='/user')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:42,950 DEBUG [localhost-startStop-1] cachesync.CachedCrudAssist:155 : Loaded 3 ManagedUser(s) out of 3 resource with 0 errors 2021-10-31 06:37:43,570 INFO [localhost-startStop-1] web.DefaultSecurityFilterChain:43 : Creating filter chain: org.springframework.security.web.util.matcher.AnyRequestMatcher@1, [org.springframework.security.web.context.SecurityContextPersistenceFilter@42f1d493, org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@68e10d8f, 
org.springframework.security.web.header.HeaderWriterFilter@7f7c369, org.springframework.security.web.authentication.logout.LogoutFilter@165c7967, org.springframework.security.web.authentication.UsernamePasswordAuthenticationFilter@5a456273, org.springframework.security.web.authentication.www.BasicAuthenticationFilter@3aa891c1, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@37df176, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@761f273f, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@7389575, org.springframework.security.web.session.SessionManagementFilter@79603aca, org.springframework.security.web.access.ExceptionTranslationFilter@157a92d1, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@1a53699e] 2021-10-31 06:37:43,609 INFO [localhost-startStop-1] http.DefaultFilterChainValidator:154 : Checking whether login URL '/login' is accessible with your configuration 2021-10-31 06:37:43,962 INFO [http-bio-7070-exec-2] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.project.ProjectManager 2021-10-31 06:37:43,963 INFO [http-bio-7070-exec-2] project.ProjectManager:81 : Initializing ProjectManager with metadata url kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:43,965 DEBUG [http-bio-7070-exec-2] cachesync.CachedCrudAssist:122 : Reloading ProjectInstance from kylin4(key='/project')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:43,980 DEBUG [http-bio-7070-exec-2] cachesync.CachedCrudAssist:155 : Loaded 1 ProjectInstance(s) out of 1 resource with 0 errors 2021-10-31 06:37:44,029 INFO [http-bio-7070-exec-3] common.KylinConfig:493 : Creating new manager instance of class 
org.apache.kylin.cube.CubeManager 2021-10-31 06:37:44,033 INFO [http-bio-7070-exec-3] cube.CubeManager:122 : Initializing CubeManager with config kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,034 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:122 : Reloading CubeInstance from kylin4(key='/cube')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,066 INFO [http-bio-7070-exec-3] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.cube.CubeDescManager 2021-10-31 06:37:44,067 INFO [http-bio-7070-exec-3] cube.CubeDescManager:91 : Initializing CubeDescManager with config kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,067 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:122 : Reloading CubeDesc from kylin4(key='/cube_desc')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,102 INFO [http-bio-7070-exec-3] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.model.DataModelManager 2021-10-31 06:37:44,106 INFO [http-bio-7070-exec-3] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.TableMetadataManager 2021-10-31 06:37:44,107 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:122 : Reloading TableDesc from kylin4(key='/table')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,129 DEBUG [http-bio-7070-exec-3] 
measure.MeasureTypeFactory:146 : registering COUNT_DISTINCT(hllc), class org.apache.kylin.measure.hllc.HLLCMeasureType$Factory 2021-10-31 06:37:44,135 DEBUG [http-bio-7070-exec-3] measure.MeasureTypeFactory:146 : registering COUNT_DISTINCT(bitmap), class org.apache.kylin.measure.bitmap.BitmapMeasureType$Factory 2021-10-31 06:37:44,139 DEBUG [http-bio-7070-exec-3] measure.MeasureTypeFactory:146 : registering TOP_N(topn), class org.apache.kylin.measure.topn.TopNMeasureType$Factory 2021-10-31 06:37:44,141 DEBUG [http-bio-7070-exec-3] measure.MeasureTypeFactory:146 : registering RAW(raw), class org.apache.kylin.measure.raw.RawMeasureType$Factory 2021-10-31 06:37:44,143 DEBUG [http-bio-7070-exec-3] measure.MeasureTypeFactory:146 : registering EXTENDED_COLUMN(extendedcolumn), class org.apache.kylin.measure.extendedcolumn.ExtendedColumnMeasureType$Factory 2021-10-31 06:37:44,144 DEBUG [http-bio-7070-exec-3] measure.MeasureTypeFactory:146 : registering PERCENTILE_APPROX(percentile), class org.apache.kylin.measure.percentile.PercentileMeasureType$Factory 2021-10-31 06:37:44,146 DEBUG [http-bio-7070-exec-3] measure.MeasureTypeFactory:146 : registering COUNT_DISTINCT(dim_dc), class org.apache.kylin.measure.dim.DimCountDistinctMeasureType$Factory 2021-10-31 06:37:44,149 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:155 : Loaded 5 TableDesc(s) out of 5 resource with 0 errors 2021-10-31 06:37:44,149 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:122 : Reloading TableExtDesc from kylin4(key='/table_exd')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,159 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:155 : Loaded 5 TableExtDesc(s) out of 5 resource with 0 errors 2021-10-31 06:37:44,159 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:122 : Reloading ExternalFilterDesc from 
kylin4(key='/ext_filter')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,163 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:155 : Loaded 0 ExternalFilterDesc(s) out of 0 resource with 0 errors 2021-10-31 06:37:44,164 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:122 : Reloading DataModelDesc from kylin4(key='/model_desc')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:37:44,184 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:155 : Loaded 1 DataModelDesc(s) out of 1 resource with 0 errors 2021-10-31 06:37:44,199 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:155 : Loaded 2 CubeDesc(s) out of 2 resource with 0 errors 2021-10-31 06:37:44,200 DEBUG [http-bio-7070-exec-3] cachesync.CachedCrudAssist:155 : Loaded 2 CubeInstance(s) out of 2 resource with 0 errors 2021-10-31 06:37:45,219 INFO [FetcherRunner 1752764464-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 0 already succeed, 1 error, 0 discarded, 0 others 2021-10-31 06:37:50,135 INFO [http-bio-7070-exec-8] service.JobService:680 : Cancel job [e85deb60-fab8-4772-9df6-10787934103d] trigger by ADMIN 2021-10-31 06:37:50,312 INFO [http-bio-7070-exec-8] cube.CubeManager:378 : Updating cube instance 'Sample1' 2021-10-31 06:37:50,313 INFO [http-bio-7070-exec-8] cube.CubeManager:482 : Remove segment Sample1[20200401000000_20200530235500] 2021-10-31 06:37:50,313 DEBUG [http-bio-7070-exec-8] cachesync.CachedCrudAssist:227 : Saving CubeInstance at /cube/Sample1.json 2021-10-31 06:37:50,325 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:37:50,485 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to 
all: BroadcastEvent{entity=cube, event=update, cacheKey=Sample1} 2021-10-31 06:37:50,642 DEBUG [http-bio-7070-exec-6] cachesync.Broadcaster:267 : Broadcasting UPDATE, cube, Sample1 2021-10-31 06:37:50,654 WARN [http-bio-7070-exec-8] util.NativeCodeLoader:60 : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 2021-10-31 06:37:50,964 INFO [http-bio-7070-exec-8] impl.MetricsConfig:118 : Loaded properties from hadoop-metrics2.properties 2021-10-31 06:37:50,985 INFO [http-bio-7070-exec-8] impl.MetricsSystemImpl:374 : Scheduled Metric snapshot period at 10 second(s). 2021-10-31 06:37:50,985 INFO [http-bio-7070-exec-8] impl.MetricsSystemImpl:191 : s3a-file-system metrics system started 2021-10-31 06:37:52,096 DEBUG [http-bio-7070-exec-8] persistence.PushdownResourceStore:209 : /fenix/kylin/kylin4/resources-jdbc/cube_statistics/Sample1/9988bc0d-6dac-de07-0b80-7275e38789ed.seq is not exists in the file system. 2021-10-31 06:37:52,102 INFO [http-bio-7070-exec-6] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.cube.cuboid.CuboidManager 2021-10-31 06:37:52,104 DEBUG [http-bio-7070-exec-6] cachesync.Broadcaster:267 : Broadcasting UPDATE, project_data, kylin_s3_first_project 2021-10-31 06:37:52,105 INFO [http-bio-7070-exec-6] service.CacheService:123 : cleaning cache for project kylin_s3_first_project (currently remove all entries) 2021-10-31 06:37:52,107 DEBUG [http-bio-7070-exec-6] cachesync.Broadcaster:301 : Done broadcasting UPDATE, project_data, kylin_s3_first_project 2021-10-31 06:37:52,110 DEBUG [http-bio-7070-exec-6] cachesync.Broadcaster:301 : Done broadcasting UPDATE, cube, Sample1 2021-10-31 06:37:52,111 INFO [http-bio-7070-exec-8] execution.ExecutableManager:676 : job id:e85deb60-fab8-4772-9df6-10787934103d-00 from ERROR to DISCARDED 2021-10-31 06:37:52,128 INFO [http-bio-7070-exec-8] execution.ExecutableManager:676 : job id:e85deb60-fab8-4772-9df6-10787934103d-01 from READY to 
DISCARDED 2021-10-31 06:37:52,140 INFO [http-bio-7070-exec-8] execution.ExecutableManager:676 : job id:e85deb60-fab8-4772-9df6-10787934103d from ERROR to DISCARDED 2021-10-31 06:37:52,151 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:37:52,151 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=e85deb60-fab8-4772-9df6-10787934103d} 2021-10-31 06:37:52,159 DEBUG [http-bio-7070-exec-6] cachesync.Broadcaster:267 : Broadcasting UPDATE, execute_output, e85deb60-fab8-4772-9df6-10787934103d 2021-10-31 06:37:52,162 DEBUG [http-bio-7070-exec-6] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, e85deb60-fab8-4772-9df6-10787934103d 2021-10-31 06:37:52,287 INFO [http-bio-7070-exec-8] cube.PathManager:85 : Deleting job tmp path s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e85deb60-fab8-4772-9df6-10787934103d-00_jobId 2021-10-31 06:37:52,410 INFO [http-bio-7070-exec-8] cube.PathManager:85 : Deleting job tmp path s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e85deb60-fab8-4772-9df6-10787934103d-00 2021-10-31 06:37:52,963 INFO [http-bio-7070-exec-8] cube.PathManager:85 : Deleting job tmp path s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e85deb60-fab8-4772-9df6-10787934103d 2021-10-31 06:37:53,280 INFO [http-bio-7070-exec-8] cube.PathManager:66 : Deleting segment parquet path s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/parquet/Sample1/20200401000000_20200530235500_RFU 2021-10-31 06:37:58,939 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:37:58,940 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute, event=drop, cacheKey=e85deb60-fab8-4772-9df6-10787934103d} 2021-10-31 06:37:58,946 DEBUG 
[http-bio-7070-exec-6] cachesync.Broadcaster:267 : Broadcasting DROP, execute, e85deb60-fab8-4772-9df6-10787934103d 2021-10-31 06:37:58,946 DEBUG [http-bio-7070-exec-6] cachesync.Broadcaster:301 : Done broadcasting DROP, execute, e85deb60-fab8-4772-9df6-10787934103d 2021-10-31 06:37:58,961 INFO [http-bio-7070-exec-8] service.JobService:799 : Delete job [e85deb60-fab8-4772-9df6-10787934103d] trigger by + ADMIN 2021-10-31 06:38:10,006 DEBUG [http-bio-7070-exec-14] project.ProjectL2Cache:198 : Loading L2 project cache for kylin_s3_first_project 2021-10-31 06:38:10,012 INFO [http-bio-7070-exec-14] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.realization.RealizationRegistry 2021-10-31 06:38:10,012 INFO [http-bio-7070-exec-14] realization.RealizationRegistry:54 : Initializing RealizationRegistry with metadata url kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:38:10,014 INFO [http-bio-7070-exec-14] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.storage.hybrid.HybridManager 2021-10-31 06:38:10,014 INFO [http-bio-7070-exec-14] hybrid.HybridManager:71 : Initializing HybridManager with config kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:38:10,015 DEBUG [http-bio-7070-exec-14] cachesync.CachedCrudAssist:122 : Reloading HybridInstance from kylin4(key='/hybrid')@kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10 2021-10-31 06:38:10,026 DEBUG [http-bio-7070-exec-14] cachesync.CachedCrudAssist:155 : Loaded 0 HybridInstance(s) out of 0 resource with 0 errors 2021-10-31 06:38:10,026 INFO [http-bio-7070-exec-14] realization.RealizationRegistry:81 : 
RealizationRegistry is {CUBE=org.apache.kylin.cube.CubeManager@fb95001, HYBRID=org.apache.kylin.storage.hybrid.HybridManager@5d8edd36} 2021-10-31 06:38:15,217 INFO [FetcherRunner 1752764464-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 0 already succeed, 0 error, 0 discarded, 0 others 2021-10-31 06:38:37,130 INFO [http-bio-7070-exec-4] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.source.SourceManager 2021-10-31 06:38:37,135 INFO [http-bio-7070-exec-4] cube.CubeManager:378 : Updating cube instance 'Sample1' 2021-10-31 06:38:37,135 WARN [http-bio-7070-exec-4] model.Segments:463 : NEW segment start does not fit/connect with other segments: Sample1[20200201000000_20200501000000] 2021-10-31 06:38:37,135 WARN [http-bio-7070-exec-4] model.Segments:465 : NEW segment end does not fit/connect with other segments: Sample1[20200201000000_20200501000000] 2021-10-31 06:38:37,136 DEBUG [http-bio-7070-exec-4] cachesync.CachedCrudAssist:227 : Saving CubeInstance at /cube/Sample1.json 2021-10-31 06:38:37,145 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:38:37,145 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=cube, event=update, cacheKey=Sample1} 2021-10-31 06:38:37,149 DEBUG [http-bio-7070-exec-11] cachesync.Broadcaster:267 : Broadcasting UPDATE, cube, Sample1 2021-10-31 06:38:37,155 DEBUG [http-bio-7070-exec-11] cachesync.Broadcaster:267 : Broadcasting UPDATE, project_data, kylin_s3_first_project 2021-10-31 06:38:37,156 INFO [http-bio-7070-exec-11] service.CacheService:123 : cleaning cache for project kylin_s3_first_project (currently remove all entries) 2021-10-31 06:38:37,158 DEBUG [http-bio-7070-exec-11] cachesync.Broadcaster:301 : Done broadcasting UPDATE, project_data, kylin_s3_first_project 2021-10-31 06:38:37,161 DEBUG [http-bio-7070-exec-11] 
cachesync.Broadcaster:301 : Done broadcasting UPDATE, cube, Sample1 2021-10-31 06:38:37,163 INFO [http-bio-7070-exec-4] cube.CubeManager:378 : Updating cube instance 'Sample1' 2021-10-31 06:38:37,163 WARN [http-bio-7070-exec-4] model.Segments:463 : NEW segment start does not fit/connect with other segments: Sample1[20200201000000_20200501000000] 2021-10-31 06:38:37,163 WARN [http-bio-7070-exec-4] model.Segments:465 : NEW segment end does not fit/connect with other segments: Sample1[20200201000000_20200501000000] 2021-10-31 06:38:37,164 DEBUG [http-bio-7070-exec-4] cachesync.CachedCrudAssist:227 : Saving CubeInstance at /cube/Sample1.json 2021-10-31 06:38:37,172 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:38:37,172 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=cube, event=update, cacheKey=Sample1} 2021-10-31 06:38:37,177 DEBUG [http-bio-7070-exec-11] cachesync.Broadcaster:267 : Broadcasting UPDATE, cube, Sample1 2021-10-31 06:38:37,179 DEBUG [http-bio-7070-exec-11] cachesync.Broadcaster:267 : Broadcasting UPDATE, project_data, kylin_s3_first_project 2021-10-31 06:38:37,179 INFO [http-bio-7070-exec-11] service.CacheService:123 : cleaning cache for project kylin_s3_first_project (currently remove all entries) 2021-10-31 06:38:37,180 DEBUG [http-bio-7070-exec-11] cachesync.Broadcaster:301 : Done broadcasting UPDATE, project_data, kylin_s3_first_project 2021-10-31 06:38:37,181 DEBUG [http-bio-7070-exec-11] cachesync.Broadcaster:301 : Done broadcasting UPDATE, cube, Sample1 2021-10-31 06:38:37,195 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:38:37,195 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=create, cacheKey=e5d7e027-7055-4c07-a0b3-f44b54f45b70} 2021-10-31 06:38:37,199 DEBUG [http-bio-7070-exec-7] 
cachesync.Broadcaster:267 : Broadcasting CREATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:37,210 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:301 : Done broadcasting CREATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:37,225 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:38:37,225 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute, event=create, cacheKey=e5d7e027-7055-4c07-a0b3-f44b54f45b70} 2021-10-31 06:38:37,229 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:267 : Broadcasting CREATE, execute, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:37,232 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:301 : Done broadcasting CREATE, execute, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:42,343 DEBUG [BadQueryDetector] service.BadQueryDetector:148 : Detect bad query. 2021-10-31 06:38:45,224 INFO [FetcherRunner 1752764464-37] threadpool.FetcherRunner:65 : NSparkCubingJob{id=e5d7e027-7055-4c07-a0b3-f44b54f45b70, name=BUILD CUBE - Sample1 - 20200201000000_20200501000000 - UTC 2021-10-31 06:38:37, state=READY} prepare to schedule and its priority is 20 2021-10-31 06:38:45,226 INFO [FetcherRunner 1752764464-37] threadpool.FetcherRunner:69 : NSparkCubingJob{id=e5d7e027-7055-4c07-a0b3-f44b54f45b70, name=BUILD CUBE - Sample1 - 20200201000000_20200501000000 - UTC 2021-10-31 06:38:37, state=READY} scheduled 2021-10-31 06:38:45,226 INFO [FetcherRunner 1752764464-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 1 actual running, 0 stopped, 1 ready, 0 already succeed, 0 error, 0 discarded, 0 others 2021-10-31 06:38:45,226 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:188 : Executing AbstractExecutable (BUILD CUBE - Sample1 - 20200201000000_20200501000000 - UTC 2021-10-31 06:38:37) 2021-10-31 06:38:45,231 
INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:676 : job id:e5d7e027-7055-4c07-a0b3-f44b54f45b70 from READY to RUNNING 2021-10-31 06:38:45,238 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:38:45,238 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=e5d7e027-7055-4c07-a0b3-f44b54f45b70} 2021-10-31 06:38:45,242 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:267 : Broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:45,245 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:45,251 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070] 2021-10-31 06:38:45,251 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=e5d7e027-7055-4c07-a0b3-f44b54f45b70} 2021-10-31 06:38:45,254 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:417 : The state of job is:RUNNING 2021-10-31 06:38:45,255 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:267 : Broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:45,257 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70 2021-10-31 06:38:45,262 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:188 : Executing AbstractExecutable (Detect Resource) 2021-10-31 06:38:45,266 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:417 : The state of job is:RUNNING 2021-10-31 06:38:45,273 INFO [Scheduler 140255650 Job 
e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:676 : job id:e5d7e027-7055-4c07-a0b3-f44b54f45b70-00 from READY to RUNNING 2021-10-31 06:38:45,284 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:417 : The state of job is:RUNNING 2021-10-31 06:38:45,288 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfigBase:124 : SPARK_HOME was set to /etc/hadoop/spark 2021-10-31 06:38:45,288 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] job.NSparkExecutable:130 : write hadoop conf is 2021-10-31 06:38:45,390 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:417 : The state of job is:RUNNING 2021-10-31 06:38:45,406 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:721 : Update JobOutput To HDFS for e5d7e027-7055-4c07-a0b3-f44b54f45b70-00 to s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/spark_logs/driver/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/execute_output.json [-1] 2021-10-31 06:38:45,682 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfigBase:263 : Kylin Config was updated with kylin.metadata.url : /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta 2021-10-31 06:38:45,683 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceStore:90 : Using metadata url /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta for resource store 2021-10-31 06:38:45,719 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] utils.MetaDumpUtil:110 : Dump resources to /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta took 37 ms 2021-10-31 06:38:45,720 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] 
common.KylinConfigBase:263 : Kylin Config was updated with kylin.metadata.url.identifier : kylin4 2021-10-31 06:38:45,720 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfigBase:263 : Kylin Config was updated with kylin.log.spark-executor-properties-file : /etc/hadoop/apache-kylin-4.0.0-bin-spark3/conf/spark-executor-log4j.properties 2021-10-31 06:38:45,722 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfigBase:263 : Kylin Config was updated with kylin.metadata.url : /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta 2021-10-31 06:38:45,722 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceStore:90 : Using metadata url /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta for resource store 2021-10-31 06:38:45,722 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceStore:90 : Using metadata url kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta for resource store 2021-10-31 06:38:45,820 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.HDFSResourceStore:69 : Path not exist in HDFS, create it: s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta. 
2021-10-31 06:38:46,097 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.HDFSResourceStore:84 : hdfs meta path created: s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta 2021-10-31 06:38:46,097 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.HDFSResourceStore:74 : hdfs meta path : s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta 2021-10-31 06:38:46,409 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:222 : Copy from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:46,418 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /cube/Sample1.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:46,678 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /cube_desc/Sample1.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:46,949 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /kylin.properties from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:47,146 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /model_desc/audience_model.json from 
/etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:47,394 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /project/kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:47,638 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table/FENIX.DIM_CATEGORIES--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:47,876 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table/FENIX.DIM_CHANNELS--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:48,140 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table/FENIX.DIM_GEOGRAPHIES--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:48,379 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table/FENIX.DIM_TIME--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:48,618 DEBUG 
[Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table/FENIX.FACT_AUDIENCE_CUBE_INPUT--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:48,827 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table_exd/FENIX.DIM_CATEGORIES--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:49,039 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table_exd/FENIX.DIM_CHANNELS--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:49,284 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table_exd/FENIX.DIM_GEOGRAPHIES--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:49,500 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: /table_exd/FENIX.DIM_TIME--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:49,725 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceTool:274 : Copy path: 
/table_exd/FENIX.FACT_AUDIENCE_CUBE_INPUT--kylin_s3_first_project.json from /etc/hadoop/apache-kylin-4.0.0-bin-spark3/bin/../tomcat/temp/kylin_job_meta4129325219411085764/meta to org.apache.kylin.common.persistence.HDFSResourceStore@4c59d749 2021-10-31 06:38:50,172 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] job.NSparkExecutable:192 : Spark job args json is : {"distMetaUrl":"kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta","submitter":"ADMIN","dataRangeEnd":"1588291200000","targetModel":"e5bcf97d-2d30-3172-e043-ffcab6e877d8","dataRangeStart":"1580515200000","project":"kylin_s3_first_project","className":"org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob","segmentName":"20200201000000_20200501000000","parentId":"e5d7e027-7055-4c07-a0b3-f44b54f45b70","jobId":"e5d7e027-7055-4c07-a0b3-f44b54f45b70","outputMetaUrl":"kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10","segmentId":"ec5618e8-194b-29b7-6f88-270f207c62a2","cuboidsNum":"1","cubeName":"Sample1","jobType":"BUILD","cubeId":"e2be8719-8049-c733-56ad-b5f5d7c1606f","segmentIds":"ec5618e8-194b-29b7-6f88-270f207c62a2"}. 
2021-10-31 06:38:50,252 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:92 : Executor task org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob with args : {"distMetaUrl":"kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta","submitter":"ADMIN","dataRangeEnd":"1588291200000","targetModel":"e5bcf97d-2d30-3172-e043-ffcab6e877d8","dataRangeStart":"1580515200000","project":"kylin_s3_first_project","className":"org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob","segmentName":"20200201000000_20200501000000","parentId":"e5d7e027-7055-4c07-a0b3-f44b54f45b70","jobId":"e5d7e027-7055-4c07-a0b3-f44b54f45b70","outputMetaUrl":"kylin4@jdbc,url=jdbc:mysql://sdash3.cf2dm3apdbvj.eu-west-1.rds.amazonaws.com:3306/kylin4,username=sdash,password=3XNlLq&XK2KQPGS!,maxActive=10,maxIdle=10","segmentId":"ec5618e8-194b-29b7-6f88-270f207c62a2","cuboidsNum":"1","cubeName":"Sample1","jobType":"BUILD","cubeId":"e2be8719-8049-c733-56ad-b5f5d7c1606f","segmentIds":"ec5618e8-194b-29b7-6f88-270f207c62a2"} 2021-10-31 06:38:50,252 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] utils.MetaDumpUtil:118 : Ready to load KylinConfig from uri: kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta 2021-10-31 06:38:50,305 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfigBase:263 : Kylin Config was updated with kylin.metadata.url.identifier : kylin4 2021-10-31 06:38:50,306 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfigBase:263 : Kylin Config was updated with kylin.log.spark-executor-properties-file : /etc/hadoop/apache-kylin-4.0.0-bin-spark3/conf/spark-executor-log4j.properties 2021-10-31 06:38:50,306 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] 
common.KylinConfigBase:263 : Kylin Config was updated with kylin.source.provider.0 : org.apache.kylin.engine.spark.source.HiveSource 2021-10-31 06:38:50,813 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:323 : Start set spark conf automatically. 2021-10-31 06:38:52,430 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:239 : Auto set spark conf failed. Load spark conf from system properties java.io.FileNotFoundException: No such file or directory: s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70/share at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:2269) at org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:2163) at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:2102) at org.apache.hadoop.fs.s3a.S3AFileSystem.innerListStatus(S3AFileSystem.java:1903) at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$listStatus$9(S3AFileSystem.java:1882) at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109) at org.apache.hadoop.fs.s3a.S3AFileSystem.listStatus(S3AFileSystem.java:1882) at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1868) at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1910) at org.apache.spark.sql.hive.utils.ResourceDetectUtils$.listSourcePath(ResourceDetectUtils.scala:65) at org.apache.spark.sql.hive.utils.ResourceDetectUtils$.getMaxResourceSize(ResourceDetectUtils.scala:103) at org.apache.spark.sql.hive.utils.ResourceDetectUtils.getMaxResourceSize(ResourceDetectUtils.scala) at org.apache.kylin.engine.spark.application.SparkApplication.chooseContentSize(SparkApplication.java:342) at org.apache.kylin.engine.spark.application.SparkApplication.autoSetSparkConf(SparkApplication.java:327) at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:237) at 
org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:93) at org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob.main(ResourceDetectBeforeCubingJob.java:106) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.kylin.engine.spark.job.NSparkExecutable.runLocalMode(NSparkExecutable.java:451) at org.apache.kylin.engine.spark.job.NSparkExecutable.doWork(NSparkExecutable.java:161) at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206) at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:94) at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206) at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:113) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) 2021-10-31 06:38:52,431 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.sql.hive.metastore.version=2.3.7. 2021-10-31 06:38:52,432 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.yarn.queue=default. 2021-10-31 06:38:52,432 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.history.fs.logDirectory=s3a://dubizzle-data/fenix/kylin_spark/spark-history. 
2021-10-31 06:38:52,432 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.driver.extraJavaOptions=-XX:+CrashOnOutOfMemoryError. 2021-10-31 06:38:52,432 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.master=spark://ip-192-168-0-29.eu-west-1.compute.internal:7077. 2021-10-31 06:38:52,433 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.executor.extraJavaOptions=-Dfile.encoding=UTF-8 -Dhdp.version=current -Dlog4j.configuration=spark-executor-log4j.properties -Dlog4j.debug -Dkylin.hdfs.working.dir=s3a://dubizzle-data/fenix/kylin/kylin4/ -Dkylin.metadata.identifier=kylin4 -Dkylin.spark.category=job -Dkylin.spark.project=kylin_s3_first_project -Dkylin.spark.identifier=e5d7e027-7055-4c07-a0b3-f44b54f45b70 -Dkylin.spark.jobName=e5d7e027-7055-4c07-a0b3-f44b54f45b70-00 -Duser.timezone=UTC. 2021-10-31 06:38:52,433 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.hadoop.yarn.timeline-service.enabled=false. 2021-10-31 06:38:52,433 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.eventLog.enabled=true. 2021-10-31 06:38:52,433 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.eventLog.dir=s3a://dubizzle-data/fenix/kylin_spark/spark-history. 2021-10-31 06:38:52,433 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.sql.hive.metastore.jars=/etc/hadoop/hive-2.3.7/lib/*. 
2021-10-31 06:38:52,433 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:243 : Override user-defined spark conf, set spark.submit.deployMode=client.
2021-10-31 06:38:52,441 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] util.TimeZoneUtils:44 : System timezone set to UTC, TimeZoneId: UTC.
2021-10-31 06:38:52,442 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:270 : Sleep for random seconds to avoid submitting too many spark job at the same time.
2021-10-31 06:38:57,964 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:280 : Error occurred when check resource. Ignore it and try to submit this job.
java.util.NoSuchElementException: spark.driver.memory
    at org.apache.spark.SparkConf.$anonfun$get$1(SparkConf.scala:245)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.SparkConf.get(SparkConf.scala:245)
    at org.apache.spark.utils.ResourceUtils$.checkResource(ResourceUtils.scala:70)
    at org.apache.spark.utils.ResourceUtils.checkResource(ResourceUtils.scala)
    at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:273)
    at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:93)
    at org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob.main(ResourceDetectBeforeCubingJob.java:106)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.kylin.engine.spark.job.NSparkExecutable.runLocalMode(NSparkExecutable.java:451)
    at org.apache.kylin.engine.spark.job.NSparkExecutable.doWork(NSparkExecutable.java:161)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
    at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:94)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:113)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
2021-10-31 06:38:58,012 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] conf.HiveConf:181 : Found configuration file file:/etc/hadoop/hive-2.3.7/conf/hive-site.xml
2021-10-31 06:38:58,188 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SparkContext:57 : Running Spark version 3.1.1
2021-10-31 06:38:58,233 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] resource.ResourceUtils:57 : ==============================================================
2021-10-31 06:38:58,233 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] resource.ResourceUtils:57 : No custom resources configured for spark.driver.
2021-10-31 06:38:58,234 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] resource.ResourceUtils:57 : ==============================================================
2021-10-31 06:38:58,234 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SparkContext:57 : Submitted application: bfa2370d-aac1-4bfb-b4f9-0442d63433cc
2021-10-31 06:38:58,260 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] resource.ResourceProfile:57 : Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
2021-10-31 06:38:58,274 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] resource.ResourceProfile:57 : Limiting resource is cpu
2021-10-31 06:38:58,274 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] resource.ResourceProfileManager:57 : Added ResourceProfile id: 0
2021-10-31 06:38:58,341 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SecurityManager:57 : Changing view acls to: ec2-user
2021-10-31 06:38:58,341 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SecurityManager:57 : Changing modify acls to: ec2-user
2021-10-31 06:38:58,342 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SecurityManager:57 : Changing view acls groups to:
2021-10-31 06:38:58,342 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SecurityManager:57 : Changing modify acls groups to:
2021-10-31 06:38:58,343 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SecurityManager:57 : SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ec2-user); groups with view permissions: Set(); users with modify permissions: Set(ec2-user); groups with modify permissions: Set()
2021-10-31 06:38:58,656 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] util.Utils:57 : Successfully started service 'sparkDriver' on port 44017.
2021-10-31 06:38:58,698 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SparkEnv:57 : Registering MapOutputTracker
2021-10-31 06:38:58,733 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SparkEnv:57 : Registering BlockManagerMaster
2021-10-31 06:38:58,758 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] storage.BlockManagerMasterEndpoint:57 : Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2021-10-31 06:38:58,759 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] storage.BlockManagerMasterEndpoint:57 : BlockManagerMasterEndpoint up
2021-10-31 06:38:58,763 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SparkEnv:57 : Registering BlockManagerMasterHeartbeat
2021-10-31 06:38:58,778 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] storage.DiskBlockManager:57 : Created local directory at /etc/hadoop/apache-kylin-4.0.0-bin-spark3/tomcat/temp/blockmgr-c44d2f88-5e0d-4f07-8fac-132d235eaedf
2021-10-31 06:38:58,815 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] memory.MemoryStore:57 : MemoryStore started with capacity 2004.6 MiB
2021-10-31 06:38:58,834 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] spark.SparkEnv:57 : Registering OutputCommitCoordinator
2021-10-31 06:38:58,929 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] util.log:169 : Logging initialized @84208ms to org.sparkproject.jetty.util.log.Slf4jLog
2021-10-31 06:38:59,003 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] server.Server:375 : jetty-9.4.36.v20210114; built: 2021-01-14T16:44:28.689Z; git:
238ec6997c7806b055319a6d11f8ae7564adc0de; jvm 1.8.0-292-b10
2021-10-31 06:38:59,027 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] server.Server:415 : Started @84306ms
2021-10-31 06:38:59,065 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] server.AbstractConnector:331 : Started ServerConnector@7de44d16{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2021-10-31 06:38:59,065 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] util.Utils:57 : Successfully started service 'SparkUI' on port 4040.
2021-10-31 06:38:59,093 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@786a514a{/jobs,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,096 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@2226f01{/jobs/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,097 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@74e6b443{/jobs/job,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,097 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@f51ca75{/jobs/job/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,098 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@712e05d1{/stages,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,098 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@79b43b13{/stages/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,099 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@17a8fa24{/stages/stage,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,100 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@b94f7d{/stages/stage/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,101 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@7e5bf9dc{/stages/pool,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,101 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@63a0cc87{/stages/pool/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,102 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@79c425df{/storage,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,103 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@70423fbc{/storage/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,103 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@754d719f{/storage/rdd,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,104 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@19968d3a{/storage/rdd/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,105 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@50f8ea2b{/environment,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,105 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@424ff782{/environment/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,106 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@26fa79f0{/executors,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,107 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@1eecc8d0{/executors/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,107 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@79d88caa{/executors/threadDump,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,109 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@34dbdc09{/executors/threadDump/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,120 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@47047455{/static,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,121 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@4c46af25{/,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,122 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@71d83896{/api,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,123 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@3c403263{/jobs/job/kill,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,124 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@61847539{/stages/stage/kill,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,126 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] ui.SparkUI:57 : Bound SparkUI to 0.0.0.0, and started at http://ip-192-168-0-29.eu-west-1.compute.internal:4040
2021-10-31 06:38:59,240 INFO [appclient-register-master-threadpool-0] client.StandaloneAppClient$ClientEndpoint:57 : Connecting to master spark://ip-192-168-0-29.eu-west-1.compute.internal:7077...
2021-10-31 06:38:59,298 INFO [netty-rpc-connection-0] client.TransportClientFactory:309 : Successfully created connection to ip-192-168-0-29.eu-west-1.compute.internal/192.168.0.29:7077 after 43 ms (0 ms spent in bootstraps)
2021-10-31 06:38:59,389 INFO [dispatcher-event-loop-3] cluster.StandaloneSchedulerBackend:57 : Connected to Spark cluster with app ID app-20211031063859-0005
2021-10-31 06:38:59,390 INFO [dispatcher-event-loop-3] client.StandaloneAppClient$ClientEndpoint:57 : Executor added: app-20211031063859-0005/0 on worker-20211029054423-192.168.0.29-45465 (192.168.0.29:45465) with 4 core(s)
2021-10-31 06:38:59,406 INFO [dispatcher-event-loop-3] cluster.StandaloneSchedulerBackend:57 : Granted executor ID app-20211031063859-0005/0 on hostPort 192.168.0.29:45465 with 4 core(s), 1024.0 MiB RAM
2021-10-31 06:38:59,409 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] util.Utils:57 : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38671.
2021-10-31 06:38:59,409 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] netty.NettyBlockTransferService:81 : Server created on ip-192-168-0-29.eu-west-1.compute.internal:38671
2021-10-31 06:38:59,411 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] storage.BlockManager:57 : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2021-10-31 06:38:59,422 INFO [dispatcher-event-loop-0] client.StandaloneAppClient$ClientEndpoint:57 : Executor updated: app-20211031063859-0005/0 is now RUNNING
2021-10-31 06:38:59,423 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] storage.BlockManagerMaster:57 : Registering BlockManager BlockManagerId(driver, ip-192-168-0-29.eu-west-1.compute.internal, 38671, None)
2021-10-31 06:38:59,426 INFO [dispatcher-BlockManagerMaster] storage.BlockManagerMasterEndpoint:57 : Registering block manager ip-192-168-0-29.eu-west-1.compute.internal:38671 with 2004.6 MiB RAM, BlockManagerId(driver, ip-192-168-0-29.eu-west-1.compute.internal, 38671, None)
2021-10-31 06:38:59,429 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] storage.BlockManagerMaster:57 : Registered BlockManager BlockManagerId(driver, ip-192-168-0-29.eu-west-1.compute.internal, 38671, None)
2021-10-31 06:38:59,431 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] storage.BlockManager:57 : Initialized BlockManager: BlockManagerId(driver, ip-192-168-0-29.eu-west-1.compute.internal, 38671, None)
2021-10-31 06:38:59,620 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@59d82896{/metrics/json,null,AVAILABLE,@Spark}
2021-10-31 06:38:59,766 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] history.SingleEventLogFileWriter:57 : Logging events to s3a://dubizzle-data/fenix/kylin_spark/spark-history/app-20211031063859-0005.inprogress
2021-10-31 06:38:59,974 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cluster.StandaloneSchedulerBackend:57 : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
2021-10-31 06:39:00,068 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] client.RMProxy:133 : Connecting to ResourceManager at /0.0.0.0:8032
2021-10-31 06:39:00,419 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:206 : Get tracking url of application app-20211031063859-0005, but empty url found.
2021-10-31 06:39:00,445 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070]
2021-10-31 06:39:00,445 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=e5d7e027-7055-4c07-a0b3-f44b54f45b70}
2021-10-31 06:39:00,457 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:267 : Broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70
2021-10-31 06:39:00,460 DEBUG [http-bio-7070-exec-7] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70
2021-10-31 06:39:00,522 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] internal.SharedState:69 : URL.setURLStreamHandlerFactory failed to set FsUrlStreamHandlerFactory
2021-10-31 06:39:00,523 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] internal.SharedState:57 : Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/etc/hadoop/spark-warehouse').
2021-10-31 06:39:00,524 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] internal.SharedState:57 : Warehouse path is 'file:/etc/hadoop/spark-warehouse'.
2021-10-31 06:39:00,541 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@3d92b237{/SQL,null,AVAILABLE,@Spark}
2021-10-31 06:39:00,541 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@2ab5363b{/SQL/json,null,AVAILABLE,@Spark}
2021-10-31 06:39:00,542 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@369231a1{/SQL/execution,null,AVAILABLE,@Spark}
2021-10-31 06:39:00,543 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@251e2cc7{/SQL/execution/json,null,AVAILABLE,@Spark}
2021-10-31 06:39:00,545 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] handler.ContextHandler:916 : Started o.s.j.s.ServletContextHandler@16345fd1{/static/sql,null,AVAILABLE,@Spark}
2021-10-31 06:39:01,246 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] job.ResourceDetectBeforeCubingJob:56 : Start detect resource before cube.
2021-10-31 06:39:01,246 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.cube.CubeManager
2021-10-31 06:39:01,246 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cube.CubeManager:122 : Initializing CubeManager with config kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta
2021-10-31 06:39:01,246 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.ResourceStore:90 : Using metadata url kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta for resource store
2021-10-31 06:39:01,305 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] persistence.HDFSResourceStore:74 : hdfs meta path : s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta
2021-10-31 06:39:01,334 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:122 : Reloading CubeInstance from s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta/cube
2021-10-31 06:39:01,576 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.cube.CubeDescManager
2021-10-31 06:39:01,576 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cube.CubeDescManager:91 : Initializing CubeDescManager with config kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta
2021-10-31 06:39:01,577 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:122 : Reloading CubeDesc from s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta/cube_desc
2021-10-31 06:39:01,732 INFO [dispatcher-CoarseGrainedScheduler] cluster.CoarseGrainedSchedulerBackend$DriverEndpoint:57 : Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.0.29:57780) with ID 0, ResourceProfileId 0
2021-10-31 06:39:01,778 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.project.ProjectManager
2021-10-31 06:39:01,778 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] project.ProjectManager:81 : Initializing ProjectManager with metadata url kylin4@hdfs,path=s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta
2021-10-31 06:39:01,778 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:122 : Reloading ProjectInstance from s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta/project
2021-10-31 06:39:01,861 INFO [dispatcher-BlockManagerMaster] storage.BlockManagerMasterEndpoint:57 : Registering block manager 192.168.0.29:33575 with 366.3 MiB RAM, BlockManagerId(0, 192.168.0.29, 33575, None)
2021-10-31 06:39:01,965 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:155 : Loaded 1 ProjectInstance(s) out of 1 resource with 0 errors
2021-10-31 06:39:01,966 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.cachesync.Broadcaster
2021-10-31 06:39:01,966 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.Broadcaster:102 : 1 nodes in the cluster: [localhost:7070]
2021-10-31 06:39:01,966 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.model.DataModelManager
2021-10-31 06:39:01,966 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] common.KylinConfig:493 : Creating new manager instance of class org.apache.kylin.metadata.TableMetadataManager
2021-10-31 06:39:01,967 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:122 : Reloading TableDesc from s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta/table
2021-10-31 06:39:02,259 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:155 : Loaded 5 TableDesc(s) out of 5 resource with 0 errors
2021-10-31 06:39:02,259 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:122 : Reloading TableExtDesc from s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta/table_exd
2021-10-31 06:39:02,553 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:155 : Loaded 5 TableExtDesc(s) out of 5 resource with 0 errors
2021-10-31 06:39:02,554 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:122 : Reloading ExternalFilterDesc from s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta/ext_filter
2021-10-31 06:39:02,591 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:155 : Loaded 0 ExternalFilterDesc(s) out of 0 resource with 0 errors
2021-10-31 06:39:02,591 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:122 : Reloading DataModelDesc from s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/job_tmp/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/meta/model_desc
2021-10-31 06:39:02,787 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:155 : Loaded 1 DataModelDesc(s) out of 1 resource with 0 errors
2021-10-31 06:39:02,787 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:155 : Loaded 1 CubeDesc(s) out of 1 resource with 0 errors
2021-10-31 06:39:02,787 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] cachesync.CachedCrudAssist:155 : Loaded 1 CubeInstance(s) out of 1 resource with 0 errors
2021-10-31 06:39:03,516 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] conf.HiveConf:181 : Found configuration file file:/etc/hadoop/hive-2.3.7/conf/hive-site.xml
2021-10-31 06:39:03,535 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] hive.HiveUtils:57 : Initializing HiveMetastoreConnection version 2.3.7 using
file:/etc/hadoop/hive-2.3.7/lib/hive-common-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-shims-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-shims-common-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/log4j-slf4j-impl-2.6.2.jar:file:/etc/hadoop/hive-2.3.7/lib/log4j-api-2.6.2.jar:file:/etc/hadoop/hive-2.3.7/lib/guava-14.0.1.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-lang-2.6.jar:file:/etc/hadoop/hive-2.3.7/lib/libthrift-0.9.3.jar:file:/etc/hadoop/hive-2.3.7/lib/httpclient-4.4.jar:file:/etc/hadoop/hive-2.3.7/lib/httpcore-4.4.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-logging-1.2.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-codec-1.4.jar:file:/etc/hadoop/hive-2.3.7/lib/curator-framework-2.7.1.jar:file:/etc/hadoop/hive-2.3.7/lib/curator-client-2.7.1.jar:file:/etc/hadoop/hive-2.3.7/lib/zookeeper-3.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/jline-2.12.jar:file:/etc/hadoop/hive-2.3.7/lib/netty-3.6.2.Final.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-shims-0.23-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/guice-servlet-4.1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/javax.inject-1.jar:file:/etc/hadoop/hive-2.3.7/lib/protobuf-java-2.5.0.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-io-2.4.jar:file:/etc/hadoop/hive-2.3.7/lib/jettison-1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/activation-1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-1.9.13.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-xc-1.9.13.jar:file:/etc/hadoop/hive-2.3.7/lib/jersey-guice-1.19.jar:file:/etc/hadoop/hive-2.3.7/lib/jersey-server-1.14.jar:file:/etc/hadoop/hive-2.3.7/lib/asm-3.1.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-compress-1.9.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-util-6.1.26.jar:file:/etc/hadoop/hive-2.3.7/lib/jersey-client-1.9.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-cli-1.2.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-collections-3.2.2.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-6.1.26.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-shims-scheduler-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-storage-api-2.4.0.jar:fi
le:/etc/hadoop/hive-2.3.7/lib/commons-lang3-3.1.jar:file:/etc/hadoop/hive-2.3.7/lib/orc-core-1.3.4.jar:file:/etc/hadoop/hive-2.3.7/lib/aircompressor-0.8.jar:file:/etc/hadoop/hive-2.3.7/lib/slice-0.29.jar:file:/etc/hadoop/hive-2.3.7/lib/jol-core-0.2.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-all-7.6.0.v20120127.jar:file:/etc/hadoop/hive-2.3.7/lib/geronimo-jta_1.1_spec-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/mail-1.4.1.jar:file:/etc/hadoop/hive-2.3.7/lib/geronimo-jaspic_1.0_spec-1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/geronimo-annotation_1.0_spec-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/asm-commons-3.1.jar:file:/etc/hadoop/hive-2.3.7/lib/asm-tree-3.1.jar:file:/etc/hadoop/hive-2.3.7/lib/javax.servlet-3.0.0.v201112011016.jar:file:/etc/hadoop/hive-2.3.7/lib/joda-time-2.8.1.jar:file:/etc/hadoop/hive-2.3.7/lib/log4j-1.2-api-2.6.2.jar:file:/etc/hadoop/hive-2.3.7/lib/log4j-core-2.6.2.jar:file:/etc/hadoop/hive-2.3.7/lib/log4j-web-2.6.2.jar:file:/etc/hadoop/hive-2.3.7/lib/ant-1.9.1.jar:file:/etc/hadoop/hive-2.3.7/lib/ant-launcher-1.9.1.jar:file:/etc/hadoop/hive-2.3.7/lib/json-1.8.jar:file:/etc/hadoop/hive-2.3.7/lib/metrics-core-3.1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/metrics-jvm-3.1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/metrics-json-3.1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-databind-2.6.5.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-annotations-2.6.0.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-core-2.6.5.jar:file:/etc/hadoop/hive-2.3.7/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-math3-3.6.1.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-httpclient-3.0.1.jar:file:/etc/hadoop/hive-2.3.7/lib/servlet-api-2.4.jar:file:/etc/hadoop/hive-2.3.7/lib/jsp-api-2.1.jar:file:/etc/hadoop/hive-2.3.7/lib/avro-1.7.7.jar:file:/etc/hadoop/hive-2.3.7/lib/paranamer-2.3.jar:file:/etc/hadoop/hive-2.3.7/lib/snappy-java-1.0.5.jar:file:/etc/hadoop/hive-2.3.7/lib/gson-2.2.4.jar:file:/etc/hadoop/hive-2.3.7/lib/curator-recipes-2.7.1.jar:file:/etc
/hadoop/hive-2.3.7/lib/jsr305-3.0.0.jar:file:/etc/hadoop/hive-2.3.7/lib/htrace-core-3.1.0-incubating.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-serde-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-service-rpc-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/jasper-compiler-5.5.23.jar:file:/etc/hadoop/hive-2.3.7/lib/jsp-api-2.0.jar:file:/etc/hadoop/hive-2.3.7/lib/ant-1.6.5.jar:file:/etc/hadoop/hive-2.3.7/lib/jasper-runtime-5.5.23.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-el-1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/libfb303-0.9.3.jar:file:/etc/hadoop/hive-2.3.7/lib/opencsv-2.3.jar:file:/etc/hadoop/hive-2.3.7/lib/parquet-hadoop-bundle-1.8.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-metastore-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/javolution-5.5.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-client-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-annotations-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/findbugs-annotations-1.3.9-1.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-common-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-protocol-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/netty-all-4.0.52.Final.jar:file:/etc/hadoop/hive-2.3.7/lib/jcodings-1.0.8.jar:file:/etc/hadoop/hive-2.3.7/lib/joni-2.1.2.jar:file:/etc/hadoop/hive-2.3.7/lib/bonecp-0.8.0.RELEASE.jar:file:/etc/hadoop/hive-2.3.7/lib/HikariCP-2.5.1.jar:file:/etc/hadoop/hive-2.3.7/lib/derby-10.10.2.0.jar:file:/etc/hadoop/hive-2.3.7/lib/datanucleus-api-jdo-4.2.4.jar:file:/etc/hadoop/hive-2.3.7/lib/datanucleus-core-4.1.17.jar:file:/etc/hadoop/hive-2.3.7/lib/datanucleus-rdbms-4.1.19.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-pool-1.5.4.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-dbcp-1.4.jar:file:/etc/hadoop/hive-2.3.7/lib/jdo-api-3.0.1.jar:file:/etc/hadoop/hive-2.3.7/lib/jta-1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/jpam-1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/javax.jdo-3.2.0-m3.jar:file:/etc/hadoop/hive-2.3.7/lib/transaction-api-1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/antlr-runtime-3.5.2.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-testu
tils-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/tempus-fugit-1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-exec-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-vector-code-gen-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/velocity-1.5.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-llap-tez-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-llap-client-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-llap-common-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/ST4-4.0.4.jar:file:/etc/hadoop/hive-2.3.7/lib/ivy-2.4.0.jar:file:/etc/hadoop/hive-2.3.7/lib/groovy-all-2.4.4.jar:file:/etc/hadoop/hive-2.3.7/lib/calcite-core-1.10.0.jar:file:/etc/hadoop/hive-2.3.7/lib/avatica-1.8.0.jar:file:/etc/hadoop/hive-2.3.7/lib/avatica-metrics-1.8.0.jar:file:/etc/hadoop/hive-2.3.7/lib/calcite-linq4j-1.10.0.jar:file:/etc/hadoop/hive-2.3.7/lib/eigenbase-properties-1.1.5.jar:file:/etc/hadoop/hive-2.3.7/lib/janino-2.7.6.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-compiler-2.7.6.jar:file:/etc/hadoop/hive-2.3.7/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:file:/etc/hadoop/hive-2.3.7/lib/calcite-druid-1.10.0.jar:file:/etc/hadoop/hive-2.3.7/lib/stax-api-1.0.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-service-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-llap-server-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/slider-core-0.90.2-incubating.jar:file:/etc/hadoop/hive-2.3.7/lib/jcommander-1.32.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-llap-common-2.3.7-tests.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-hadoop2-compat-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-hadoop-compat-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-math-2.2.jar:file:/etc/hadoop/hive-2.3.7/lib/metrics-core-2.2.0.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-server-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-procedure-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-common-1.1.1-tests.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-prefix-tree-1.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-sslengine-6.1.26.jar:file:/etc/hadoop/hive-2.3.7/lib/jsp-2.1-6.1.14.
jar:file:/etc/hadoop/hive-2.3.7/lib/jsp-api-2.1-6.1.14.jar:file:/etc/hadoop/hive-2.3.7/lib/servlet-api-2.5-6.1.14.jar:file:/etc/hadoop/hive-2.3.7/lib/jamon-runtime-2.3.1.jar:file:/etc/hadoop/hive-2.3.7/lib/disruptor-3.3.0.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-jdbc-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-beeline-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/super-csv-2.2.0.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-cli-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-contrib-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-hbase-handler-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hbase-hadoop2-compat-1.1.1-tests.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-druid-handler-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/java-util-0.27.10.jar:file:/etc/hadoop/hive-2.3.7/lib/config-magic-0.9.jar:file:/etc/hadoop/hive-2.3.7/lib/rhino-1.7R5.jar:file:/etc/hadoop/hive-2.3.7/lib/json-path-2.1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/druid-server-0.9.2.jar:file:/etc/hadoop/hive-2.3.7/lib/druid-processing-0.9.2.jar:file:/etc/hadoop/hive-2.3.7/lib/druid-common-0.9.2.jar:file:/etc/hadoop/hive-2.3.7/lib/druid-api-0.9.2.jar:file:/etc/hadoop/hive-2.3.7/lib/guice-multibindings-4.1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/airline-0.7.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-dataformat-smile-2.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/hibernate-validator-5.1.3.Final.jar:file:/etc/hadoop/hive-2.3.7/lib/validation-api-1.1.0.Final.jar:file:/etc/hadoop/hive-2.3.7/lib/jboss-logging-3.1.3.GA.jar:file:/etc/hadoop/hive-2.3.7/lib/classmate-1.0.0.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-dbcp2-2.0.1.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-pool2-2.2.jar:file:/etc/hadoop/hive-2.3.7/lib/javax.el-api-3.0.0.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-datatype-guava-2.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-datatype-joda-2.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/jdbi-2.63.1.jar:file:/etc/hadoop/hive-2.3.7/lib/log4j-jul-2.5.jar:file:/etc/hadoop/hive-2.3.7/lib/antlr4-runtime-4.5.jar:file:/etc/hadoop/hive-2.
3.7/lib/bytebuffer-collections-0.2.5.jar:file:/etc/hadoop/hive-2.3.7/lib/extendedset-1.3.10.jar:file:/etc/hadoop/hive-2.3.7/lib/RoaringBitmap-0.5.18.jar:file:/etc/hadoop/hive-2.3.7/lib/emitter-0.3.6.jar:file:/etc/hadoop/hive-2.3.7/lib/http-client-1.0.4.jar:file:/etc/hadoop/hive-2.3.7/lib/server-metrics-0.2.8.jar:file:/etc/hadoop/hive-2.3.7/lib/compress-lzf-1.0.3.jar:file:/etc/hadoop/hive-2.3.7/lib/icu4j-4.8.1.jar:file:/etc/hadoop/hive-2.3.7/lib/lz4-1.3.0.jar:file:/etc/hadoop/hive-2.3.7/lib/mapdb-1.0.8.jar:file:/etc/hadoop/hive-2.3.7/lib/druid-console-0.0.2.jar:file:/etc/hadoop/hive-2.3.7/lib/javax.el-3.0.0.jar:file:/etc/hadoop/hive-2.3.7/lib/curator-x-discovery-2.11.0.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-json-provider-2.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-base-2.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-module-jaxb-annotations-2.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/jackson-jaxrs-smile-provider-2.4.6.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-server-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/javax.servlet-api-3.1.0.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-http-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-util-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-io-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-proxy-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-client-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/tesla-aether-0.0.5.jar:file:/etc/hadoop/hive-2.3.7/lib/okhttp-1.0.2.jar:file:/etc/hadoop/hive-2.3.7/lib/aether-api-0.9.0.M2.jar:file:/etc/hadoop/hive-2.3.7/lib/aether-spi-0.9.0.M2.jar:file:/etc/hadoop/hive-2.3.7/lib/aether-util-0.9.0.M2.jar:file:/etc/hadoop/hive-2.3.7/lib/aether-impl-0.9.0.M2.jar:file:/etc/hadoop/hive-2.3.7/lib/aether-connector-file-0.9.0.M2.jar:file:/etc/hadoop/hive-2.3.7/lib/aether-connector-okhttp-0.0.9.jar:file:/etc/hadoop/hive-2.3.7/lib/wagon-provider-api-2.4.jar:file:/etc/hadoop/hive-2.3.7/lib/plexus-utils-3.0.15.jar:file:/etc/hadoop/hive-2.
3.7/lib/maven-aether-provider-3.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-model-3.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-model-builder-3.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/plexus-interpolation-1.19.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-repository-metadata-3.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-settings-builder-3.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-settings-3.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/spymemcached-2.11.7.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-servlet-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-security-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-servlets-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/jetty-continuation-9.2.5.v20141112.jar:file:/etc/hadoop/hive-2.3.7/lib/irc-api-1.0-0014.jar:file:/etc/hadoop/hive-2.3.7/lib/geoip2-0.4.0.jar:file:/etc/hadoop/hive-2.3.7/lib/maxminddb-0.2.0.jar:file:/etc/hadoop/hive-2.3.7/lib/google-http-client-jackson2-1.15.0-rc.jar:file:/etc/hadoop/hive-2.3.7/lib/derbynet-10.11.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/derbyclient-10.11.1.1.jar:file:/etc/hadoop/hive-2.3.7/lib/druid-hdfs-storage-0.9.2.jar:file:/etc/hadoop/hive-2.3.7/lib/mysql-metadata-storage-0.9.2.jar:file:/etc/hadoop/hive-2.3.7/lib/postgresql-metadata-storage-0.9.2.jar:file:/etc/hadoop/hive-2.3.7/lib/postgresql-9.4.1208.jre7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-jdbc-handler-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-accumulo-handler-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/accumulo-core-1.6.0.jar:file:/etc/hadoop/hive-2.3.7/lib/accumulo-fate-1.6.0.jar:file:/etc/hadoop/hive-2.3.7/lib/accumulo-start-1.6.0.jar:file:/etc/hadoop/hive-2.3.7/lib/commons-vfs2-2.0.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-scm-api-1.4.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-scm-provider-svnexe-1.4.jar:file:/etc/hadoop/hive-2.3.7/lib/maven-scm-provider-svn-commons-1.4.jar:file:/etc/hadoop/hive-2.3.7/lib/regexp-1.3.jar:file:/etc/hadoop/hive-2.3.7/lib/accumulo-trace-1.6.0.jar:file:/etc/hadoop/hive-2.3.
7/lib/hive-llap-ext-client-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-hplsql-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/org.abego.treelayout.core-1.0.1.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-hcatalog-core-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/hive-hcatalog-server-extensions-2.3.7.jar:file:/etc/hadoop/hive-2.3.7/lib/mysql-connector-java-5.1.47.jar
2021-10-31 06:39:03,618 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] conf.HiveConf:181 : Found configuration file null
2021-10-31 06:39:03,764 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] Configuration.deprecation:1394 : yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled
2021-10-31 06:39:03,841 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] session.SessionState:753 : Created HDFS directory: /tmp/hive/ec2-user/fcf50c1e-572b-4d08-a337-88117fc7d7d3
2021-10-31 06:39:03,845 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] session.SessionState:753 : Created local directory: /etc/hadoop/apache-kylin-4.0.0-bin-spark3/tomcat/temp/ec2-user/fcf50c1e-572b-4d08-a337-88117fc7d7d3
2021-10-31 06:39:03,848 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] session.SessionState:753 : Created HDFS directory: /tmp/hive/ec2-user/fcf50c1e-572b-4d08-a337-88117fc7d7d3/_tmp_space.db
2021-10-31 06:39:03,852 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] client.HiveClientImpl:57 : Warehouse location for Hive client (version 2.3.7) is file:/etc/hadoop/spark-warehouse
2021-10-31 06:39:04,655 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] conf.HiveConf:4116 : HiveConf of name hive.stats.jdbc.timeout does not exist
2021-10-31 06:39:04,656 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] conf.HiveConf:4116 : HiveConf of name hive.stats.retries.wait does not exist
2021-10-31 06:39:04,656 INFO
[Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:614 : 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2021-10-31 06:39:04,683 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.ObjectStore:403 : ObjectStore, initialize called
2021-10-31 06:39:04,782 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] DataNucleus.Persistence:77 : Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2021-10-31 06:39:04,784 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] DataNucleus.Persistence:77 : Property datanucleus.cache.level2 unknown - will be ignored
2021-10-31 06:39:05,081 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.ObjectStore:526 : Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2021-10-31 06:39:06,162 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.MetaStoreDirectSql:146 : Using direct SQL, underlying DB is MYSQL
2021-10-31 06:39:06,165 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.ObjectStore:317 : Initialized ObjectStore
2021-10-31 06:39:06,299 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:698 : Added admin role in metastore
2021-10-31 06:39:06,306 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:707 : Added public role in metastore
2021-10-31 06:39:06,331 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:747 : No user is added in admin role, since config is empty
2021-10-31 06:39:06,425 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:781 : 0: get_all_functions
2021-10-31 06:39:06,426 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] HiveMetaStore.audit:309 : ugi=ec2-user ip=unknown-ip-addr cmd=get_all_functions
2021-10-31 06:39:06,449 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:781 : 0: get_database: default
2021-10-31 06:39:06,449 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] HiveMetaStore.audit:309 : ugi=ec2-user ip=unknown-ip-addr cmd=get_database: default
2021-10-31 06:39:06,460 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:781 : 0: get_database: global_temp
2021-10-31 06:39:06,460 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] HiveMetaStore.audit:309 : ugi=ec2-user ip=unknown-ip-addr cmd=get_database: global_temp
2021-10-31 06:39:06,467 WARN [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.ObjectStore:723 : Failed to get database global_temp, returning NoSuchObjectException
2021-10-31 06:39:06,484 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:781 : 0: get_database: fenix
2021-10-31 06:39:06,484 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] HiveMetaStore.audit:309 : ugi=ec2-user ip=unknown-ip-addr cmd=get_database: fenix
2021-10-31 06:39:06,495 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:781 : 0: get_table : db=fenix tbl=fact_audience_cube_input
2021-10-31 06:39:06,496 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] HiveMetaStore.audit:309 : ugi=ec2-user ip=unknown-ip-addr cmd=get_table : db=fenix tbl=fact_audience_cube_input
2021-10-31 06:39:06,585 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] metastore.HiveMetaStore:781 : 0: get_table : db=fenix tbl=fact_audience_cube_input
2021-10-31 06:39:06,585 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] HiveMetaStore.audit:309
: ugi=ec2-user ip=unknown-ip-addr cmd=get_table : db=fenix tbl=fact_audience_cube_input
2021-10-31 06:39:06,890 ERROR [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] application.SparkApplication:95 : The spark job execute failed!
java.lang.ClassNotFoundException: Failed to find data source: parquet. Please find packages at http://spark.apache.org/third-party-projects.html
	at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:689)
	at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:98)
	at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:97)
	at org.apache.spark.sql.execution.datasources.DataSource.providingInstance(DataSource.scala:111)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:349)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.$anonfun$convertToLogicalRelation$5(HiveMetastoreCatalog.scala:248)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.$anonfun$convertToLogicalRelation$4(HiveMetastoreCatalog.scala:238)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.withTableCreationLock(HiveMetastoreCatalog.scala:58)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.convertToLogicalRelation(HiveMetastoreCatalog.scala:231)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.convert(HiveMetastoreCatalog.scala:137)
	at org.apache.spark.sql.hive.RelationConversions$$anonfun$apply$4.applyOrElse(HiveStrategies.scala:221)
	at org.apache.spark.sql.hive.RelationConversions$$anonfun$apply$4.applyOrElse(HiveStrategies.scala:208)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$2(AnalysisHelper.scala:108)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:73)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:108)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:221)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:106)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:104)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$4(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:407)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:243)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:405)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:358)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:221)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:106)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:104)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$4(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:407)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:243)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:405)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:358)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:221)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:106)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:104)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:73)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:72)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:29)
	at org.apache.spark.sql.hive.RelationConversions.apply(HiveStrategies.scala:208)
	at org.apache.spark.sql.hive.RelationConversions.apply(HiveStrategies.scala:193)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:216)
	at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
	at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
	at scala.collection.immutable.List.foldLeft(List.scala:89)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:213)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:205)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:205)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:196)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:190)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:155)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:183)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:183)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:174)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:228)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:173)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:615)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:610)
	at org.apache.kylin.engine.spark.source.HiveSource$$anon$1.getSourceData(HiveSource.scala:42)
	at org.apache.kylin.engine.spark.utils.SparkDataSource$SparkSource.table(SparkDataSource.scala:33)
	at org.apache.kylin.engine.spark.builder.CreateFlatTable$.org$apache$kylin$engine$spark$builder$CreateFlatTable$$generateTableDataset(CreateFlatTable.scala:131)
	at org.apache.kylin.engine.spark.builder.CreateFlatTable.generateDataset(CreateFlatTable.scala:47)
	at org.apache.kylin.engine.spark.job.ParentSourceChooser.getFlatTable(ParentSourceChooser.scala:189)
	at org.apache.kylin.engine.spark.job.ParentSourceChooser.decideFlatTableSource(ParentSourceChooser.scala:86)
	at org.apache.kylin.engine.spark.job.ParentSourceChooser.$anonfun$decideSources$1(ParentSourceChooser.scala:71)
	at org.apache.kylin.engine.spark.job.ParentSourceChooser.$anonfun$decideSources$1$adapted(ParentSourceChooser.scala:66)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
	at org.apache.kylin.engine.spark.job.ParentSourceChooser.decideSources(ParentSourceChooser.scala:66)
	at org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob.doExecute(ResourceDetectBeforeCubingJob.java:67)
	at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:304)
	at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:93)
	at org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob.main(ResourceDetectBeforeCubingJob.java:106)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.kylin.engine.spark.job.NSparkExecutable.runLocalMode(NSparkExecutable.java:451)
	at org.apache.kylin.engine.spark.job.NSparkExecutable.doWork(NSparkExecutable.java:161)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
	at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:94)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
	at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:113)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: parquet.DefaultSource
	at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1951)
	at org.apache.kylin.spark.classloader.TomcatClassLoader.loadClass(TomcatClassLoader.java:114)
	at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1794)
	at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$5(DataSource.scala:663)
	at scala.util.Try$.apply(Try.scala:213)
	at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$4(DataSource.scala:663)
	at scala.util.Failure.orElse(Try.scala:224)
	at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:663)
	... 107 more
2021-10-31 06:39:06,902 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:417 : The state of job is:RUNNING
2021-10-31 06:39:06,926 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:676 : job id:e5d7e027-7055-4c07-a0b3-f44b54f45b70-00 from RUNNING to ERROR
2021-10-31 06:39:06,938 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:698 : need kill e5d7e027-7055-4c07-a0b3-f44b54f45b70-00, from RUNNING to ERROR
2021-10-31 06:39:06,942 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:721 : Update JobOutput To HDFS for e5d7e027-7055-4c07-a0b3-f44b54f45b70-00 to s3a://dubizzle-data/fenix/kylin/kylin4/kylin_s3_first_project/spark_logs/driver/e5d7e027-7055-4c07-a0b3-f44b54f45b70-00/execute_output.json [11377]
2021-10-31 06:39:07,227 ERROR [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:210 : error running Executable: NSparkCubingJob{id=e5d7e027-7055-4c07-a0b3-f44b54f45b70, name=BUILD CUBE - Sample1 - 20200201000000_20200501000000 - UTC 2021-10-31 06:38:37, state=RUNNING}
2021-10-31 06:39:07,229 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:417 : The state of job is:RUNNING
2021-10-31 06:39:07,244 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070]
2021-10-31 06:39:07,245 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=e5d7e027-7055-4c07-a0b3-f44b54f45b70}
2021-10-31 06:39:07,247 INFO [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:676 : job id:e5d7e027-7055-4c07-a0b3-f44b54f45b70 from RUNNING to ERROR
2021-10-31 06:39:07,250 DEBUG [http-bio-7070-exec-3] cachesync.Broadcaster:267 :
Broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70
2021-10-31 06:39:07,252 DEBUG [http-bio-7070-exec-3] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70
2021-10-31 06:39:07,260 DEBUG [pool-3-thread-1] cachesync.Broadcaster:119 : Servers in the cluster: [localhost:7070]
2021-10-31 06:39:07,260 DEBUG [pool-3-thread-1] cachesync.Broadcaster:129 : Announcing new broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=e5d7e027-7055-4c07-a0b3-f44b54f45b70}
2021-10-31 06:39:07,261 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.ExecutableManager:698 : need kill e5d7e027-7055-4c07-a0b3-f44b54f45b70, from RUNNING to ERROR
2021-10-31 06:39:07,261 DEBUG [Scheduler 140255650 Job e5d7e027-7055-4c07-a0b3-f44b54f45b70-80] execution.AbstractExecutable:365 : no need to send email, user list is empty
2021-10-31 06:39:07,264 DEBUG [http-bio-7070-exec-8] cachesync.Broadcaster:267 : Broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70
2021-10-31 06:39:07,267 DEBUG [http-bio-7070-exec-8] cachesync.Broadcaster:301 : Done broadcasting UPDATE, execute_output, e5d7e027-7055-4c07-a0b3-f44b54f45b70
2021-10-31 06:39:07,270 ERROR [pool-9-thread-1] threadpool.DefaultScheduler:115 : ExecuteException job:e5d7e027-7055-4c07-a0b3-f44b54f45b70
org.apache.kylin.job.exception.ExecuteException: org.apache.kylin.job.exception.ExecuteException: java.lang.reflect.InvocationTargetException
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:225)
	at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:113)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kylin.job.exception.ExecuteException: java.lang.reflect.InvocationTargetException
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:225)
	at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:94)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
	... 4 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.kylin.engine.spark.job.NSparkExecutable.runLocalMode(NSparkExecutable.java:451)
	at org.apache.kylin.engine.spark.job.NSparkExecutable.doWork(NSparkExecutable.java:161)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:206)
	... 6 more
Caused by: java.lang.RuntimeException: Error execute org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob
	at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:96)
	at org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob.main(ResourceDetectBeforeCubingJob.java:106)
	... 13 more
Caused by: java.lang.ClassNotFoundException: Failed to find data source: parquet. Please find packages at http://spark.apache.org/third-party-projects.html
	at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:689)
	at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:98)
	at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:97)
	at org.apache.spark.sql.execution.datasources.DataSource.providingInstance(DataSource.scala:111)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:349)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.$anonfun$convertToLogicalRelation$5(HiveMetastoreCatalog.scala:248)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.$anonfun$convertToLogicalRelation$4(HiveMetastoreCatalog.scala:238)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.withTableCreationLock(HiveMetastoreCatalog.scala:58)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.convertToLogicalRelation(HiveMetastoreCatalog.scala:231)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.convert(HiveMetastoreCatalog.scala:137)
	at org.apache.spark.sql.hive.RelationConversions$$anonfun$apply$4.applyOrElse(HiveStrategies.scala:221)
	at org.apache.spark.sql.hive.RelationConversions$$anonfun$apply$4.applyOrElse(HiveStrategies.scala:208)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$2(AnalysisHelper.scala:108)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:73)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:108)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:221)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:106)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:104)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$4(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:407)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:243)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:405)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:358)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:221)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:106)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:104)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$4(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:407)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:243)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:405)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:358)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:113)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:221)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:106)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:104)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:73)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:72)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:29)
	at org.apache.spark.sql.hive.RelationConversions.apply(HiveStrategies.scala:208)
	at org.apache.spark.sql.hive.RelationConversions.apply(HiveStrategies.scala:193)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:216)
	at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
	at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
	at scala.collection.immutable.List.foldLeft(List.scala:89)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:213)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:205)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:205)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:196)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:190)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:155)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:183)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:183)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:174)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:228)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:173)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:615)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:610)
	at org.apache.kylin.engine.spark.source.HiveSource$$anon$1.getSourceData(HiveSource.scala:42)
	at org.apache.kylin.engine.spark.utils.SparkDataSource$SparkSource.table(SparkDataSource.scala:33)
	at org.apache.kylin.engine.spark.builder.CreateFlatTable$.org$apache$kylin$engine$spark$builder$CreateFlatTable$$generateTableDataset(CreateFlatTable.scala:131)
	at org.apache.kylin.engine.spark.builder.CreateFlatTable.generateDataset(CreateFlatTable.scala:47)
	at
org.apache.kylin.engine.spark.job.ParentSourceChooser.getFlatTable(ParentSourceChooser.scala:189) at org.apache.kylin.engine.spark.job.ParentSourceChooser.decideFlatTableSource(ParentSourceChooser.scala:86) at org.apache.kylin.engine.spark.job.ParentSourceChooser.$anonfun$decideSources$1(ParentSourceChooser.scala:71) at org.apache.kylin.engine.spark.job.ParentSourceChooser.$anonfun$decideSources$1$adapted(ParentSourceChooser.scala:66) at scala.collection.Iterator.foreach(Iterator.scala:941) at scala.collection.Iterator.foreach$(Iterator.scala:941) at scala.collection.AbstractIterator.foreach(Iterator.scala:1429) at scala.collection.IterableLike.foreach(IterableLike.scala:74) at scala.collection.IterableLike.foreach$(IterableLike.scala:73) at scala.collection.AbstractIterable.foreach(Iterable.scala:56) at org.apache.kylin.engine.spark.job.ParentSourceChooser.decideSources(ParentSourceChooser.scala:66) at org.apache.kylin.engine.spark.job.ResourceDetectBeforeCubingJob.doExecute(ResourceDetectBeforeCubingJob.java:67) at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:304) at org.apache.kylin.engine.spark.application.SparkApplication.execute(SparkApplication.java:93) ... 14 more Caused by: java.lang.ClassNotFoundException: parquet.DefaultSource at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1951) at org.apache.kylin.spark.classloader.TomcatClassLoader.loadClass(TomcatClassLoader.java:114) at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1794) at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$5(DataSource.scala:663) at scala.util.Try$.apply(Try.scala:213) at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$lookupDataSource$4(DataSource.scala:663) at scala.util.Failure.orElse(Try.scala:224) at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:663) ... 
107 more 2021-10-31 06:39:07,271 INFO [FetcherRunner 1752764464-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 0 already succeed, 1 error, 0 discarded, 0 others 2021-10-31 06:39:15,217 INFO [FetcherRunner 1752764464-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 0 already succeed, 1 error, 0 discarded, 0 others 2021-10-31 06:39:42,343 DEBUG [BadQueryDetector] service.BadQueryDetector:148 : Detect bad query. 2021-10-31 06:39:45,217 INFO [FetcherRunner 1752764464-37] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 0 already succeed, 1 error, 0 discarded, 0 others