The current business project we are enabling is built entirely on GCP components, with Cloud Composer (Airflow) as one of the key pieces. We have built various Airflow data pipelines for multiple workstreams in which data is ingested from GCS buckets into BigQuery.
Based on recent changes on the Google BigQuery infrastructure side, validation of UTF-8 characters appears to have been tightened, which has resulted in multiple failures of our existing business processes.
On further analysis we found that, when ingesting data into BigQuery from a GCS bucket, the encoding now needs to be specified explicitly, but the operator below currently does not expose any parameter for setting an explicit encoding.
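Until the operator exposes an encoding parameter, one stopgap we have been considering is sanitizing file contents before the load, so that BigQuery's stricter UTF-8 validation no longer rejects them. The sketch below is only illustrative: `ensure_utf8` is a hypothetical helper, and the assumption that non-UTF-8 input is Latin-1 would need to be replaced with whatever encoding the upstream system actually emits.

```python
def ensure_utf8(raw: bytes, source_encoding: str = "latin-1") -> bytes:
    """Return bytes guaranteed to be valid UTF-8.

    If `raw` already decodes cleanly as UTF-8 it is returned unchanged.
    Otherwise it is decoded using `source_encoding` (an assumption --
    adjust to the encoding the upstream system really produces) and
    re-encoded as UTF-8.
    """
    try:
        raw.decode("utf-8")
        return raw
    except UnicodeDecodeError:
        return raw.decode(source_encoding).encode("utf-8")


# 0xE9 is 'é' in Latin-1 but is not a valid standalone UTF-8 byte,
# so this input would fail BigQuery's stricter UTF-8 validation as-is.
fixed = ensure_utf8(b"caf\xe9")
```

Running this over each object before (or during) staging would keep loads working while the operator-level fix is pending.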
Could someone please treat this as a priority and help us with a fix so we can return to BAU?
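For completeness, an interim workaround we could run with is issuing the load through `BigQueryInsertJobOperator`, whose `configuration` argument maps directly onto the BigQuery Jobs API and therefore does accept an explicit `encoding` field. A minimal sketch of such a configuration, with placeholder bucket, project, dataset, and table names:

```python
# Load-job configuration for BigQueryInsertJobOperator (Jobs API shape).
# All resource names below are placeholders, not our real project values.
load_configuration = {
    "load": {
        "sourceUris": ["gs://example-bucket/data.csv"],
        "destinationTable": {
            "projectId": "example-project",
            "datasetId": "example_dataset",
            "tableId": "example_table",
        },
        "sourceFormat": "CSV",
        "encoding": "UTF-8",  # the explicit encoding the current operator cannot set
        "writeDisposition": "WRITE_APPEND",
    }
}
```

This would let us state the encoding explicitly per load while we wait for the operator itself to gain the parameter.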