Got an error while running code through an Airflow DAG.
The exception occurs while running an ETL job against an external Hive table, stored as Parquet in S3, with AWS Glue as the metastore. Here's the error message:
java.lang.RuntimeException: Caught Hive MetaException attempting to get partition metadata by filter from Hive. You can set the Spark configuration setting spark.sql.hive.manageFilesourcePartitions to false to work around this problem, however this will result in degraded performance. Please report a bug: https://issues.apache.org/jira/browse/SPARK
Caused by: MetaException(message:Unknown exception occurred. (Service: AWSGlue; Status Code: 500; Error Code: InternalServiceException; Request ID: 73267997-1795-45a3-965f-8bb2a6b7b3ac))
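For reference, the workaround named in the error message itself can be applied by disabling Hive filesource partition management when the job is submitted. A minimal sketch, assuming the job is launched via spark-submit (`your_etl_job.py` is a placeholder for the actual job script); note the error text warns this can degrade performance, since Spark will no longer use the metastore for partition pruning:

```shell
# Workaround from the error message: stop Spark from fetching partition
# metadata by filter from the Glue metastore. Spark will instead list
# partitions from the file system, which is slower on large tables.
spark-submit \
  --conf spark.sql.hive.manageFilesourcePartitions=false \
  your_etl_job.py
```

The same setting can be supplied wherever Spark configuration is passed for the environment in question (for example, in the cluster's Spark config on Databricks), but since the underlying Glue error is a 500 InternalServiceException, this only sidesteps the metastore call rather than fixing the Glue-side failure.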
The exact same issue occurred when running from a Databricks notebook as well. Screenshots attached for both cases.