
[HUDI-3781] Spark DELETE SQL can't delete records



    Description

      Create a table with hoodie.datasource.write.operation set to upsert in its table properties.

      When a SQL DELETE is then run against the table, the delete operation key set by the command is overwritten by the hoodie.datasource.write.operation value from the table properties or the environment, so the write executes as an upsert and the records are never deleted.

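      A minimal reproduction sketch (table name, schema, and values are illustrative), assuming a Spark session with the Hudi Spark bundle and Hudi's SQL extensions enabled:

      // spark is an active SparkSession.
      spark.sql(
        """create table h0 (id int, name string, ts long)
          |using hudi
          |tblproperties (
          |  primaryKey = 'id',
          |  preCombineField = 'ts',
          |  'hoodie.datasource.write.operation' = 'upsert'
          |)""".stripMargin)
      spark.sql("insert into h0 values (1, 'a1', 1000)")
      spark.sql("delete from h0 where id = 1")
      // Expected: empty result. Actual: the row is still returned,
      // because the delete ran as an upsert.
      spark.sql("select * from h0").show()

      The DELETE command builds its write options like this:
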
      withSparkConf(sparkSession, hoodieCatalogTable.catalogProperties) {
        Map(
          "path" -> path,
          RECORDKEY_FIELD.key -> hoodieCatalogTable.primaryKeys.mkString(","),
          TBL_NAME.key -> tableConfig.getTableName,
          HIVE_STYLE_PARTITIONING.key -> tableConfig.getHiveStylePartitioningEnable,
          URL_ENCODE_PARTITIONING.key -> tableConfig.getUrlEncodePartitioning,
          KEYGENERATOR_CLASS_NAME.key -> classOf[SqlKeyGenerator].getCanonicalName,
          SqlKeyGenerator.ORIGIN_KEYGEN_CLASS_NAME -> tableConfig.getKeyGeneratorClassName,
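          // Intended operation for this DELETE command; per this report it is
          // overwritten by hoodie.datasource.write.operation from the table or env: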
          OPERATION.key -> DataSourceWriteOptions.DELETE_OPERATION_OPT_VAL,
          PARTITIONPATH_FIELD.key -> tableConfig.getPartitionFieldProp,
          HiveSyncConfig.HIVE_SYNC_MODE.key -> HiveSyncMode.HMS.name(),
          HiveSyncConfig.HIVE_SUPPORT_TIMESTAMP_TYPE.key -> "true",
          HoodieWriteConfig.DELETE_PARALLELISM_VALUE.key -> "200",
          SqlKeyGenerator.PARTITION_SCHEMA -> partitionSchema.toDDL
        )
      } 
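
      These per-command options are merged with hoodieCatalogTable.catalogProperties by withSparkConf, and on a key conflict the table/env value wins over the operation the command just set. A minimal sketch of the same Map-merge semantics in plain Scala (variable names are illustrative, not Hudi's API):

      // Options the DELETE command builds for this write.
      val commandOpts = Map("hoodie.datasource.write.operation" -> "delete")
      // Properties carried by the table (or the session), applied afterwards.
      val tableProps = Map("hoodie.datasource.write.operation" -> "upsert")

      // With Scala's ++, the right-hand map wins on duplicate keys, so merging
      // the table properties last clobbers the command's DELETE operation:
      val effective = commandOpts ++ tableProps
      println(effective("hoodie.datasource.write.operation")) // prints "upsert"

      // Merging the command options last would preserve the intended DELETE:
      val fixed = tableProps ++ commandOpts
      println(fixed("hoodie.datasource.write.operation")) // prints "delete"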

            People

              Assignee: KnightChess
              Reporter: KnightChess