Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.3.0
    • Component/s: Table API & SQL
    • Labels: None

Description

      The upcoming Calcite release 1.11 brings a lot of stability fixes and new features. We should upgrade the Table API to it.

      E.g., we can then hopefully merge FLINK-4864.

Activity

          jark Jark Wu added a comment -

          Unfortunately, FLINK-4864 is not only blocked by CALCITE-1461, which is fixed in Calcite 1.11, but also by the Scala shade problem.

          I will work on this issue. The Calcite 1.11 RC is still under vote.

          jark Jark Wu added a comment -

          Calcite 1.11 has been released. I will start working on this issue in the next few days.

          wheat9 Haohui Mai added a comment -

          Jark Wu, is there any progress on this JIRA? We are interested in it as well. If you're busy, I'm happy to help with it.

          jark Jark Wu added a comment -

          Hi Haohui Mai, sorry for the delay. I ran into some problems while working on this issue.

          I will try to solve them today and update my progress here.

          githubbot ASF GitHub Bot added a comment -

          GitHub user wuchong opened a pull request:

          https://github.com/apache/flink/pull/3338

          FLINK-5414 [table] Bump up Calcite version to 1.11

          Thanks for contributing to Apache Flink. Before you open your pull request, please take the following check list into consideration.
          If your changes take all of the items into account, feel free to open your pull request. For more information and/or questions please refer to the [How To Contribute guide](http://flink.apache.org/how-to-contribute.html).
          In addition to going through the list, please provide a meaningful description of your changes.

          • [ ] General
          • The pull request references the related JIRA issue ("[FLINK-XXX] Jira title text")
          • The pull request addresses only one issue
          • Each commit in the PR has a meaningful commit message (including the JIRA id)
          • [ ] Documentation
          • Documentation has been added for new functionality
          • Old documentation affected by the pull request has been updated
          • JavaDoc for public methods has been added
          • [ ] Tests & Build
          • Functionality added by the pull request is covered by tests
          • `mvn clean verify` has been executed successfully locally or a Travis build has passed

          This PR upgrades Calcite to version 1.11. There were a lot of compatibility issues, which I have fixed. Please correct me if any of the fixes are wrong.

          You can merge this pull request into a Git repository by running:

          $ git pull https://github.com/wuchong/flink calcite-FLINK-5414

          Alternatively you can review and apply these changes as the patch at:

          https://github.com/apache/flink/pull/3338.patch

          To close this pull request, make a commit to your master/trunk branch
          with (at least) the following in the commit message:

          This closes #3338


          commit d3d4c914325c1591c1bd5b0e5081df020cda0507
          Author: Jark Wu <wuchong.wc@alibaba-inc.com>
          Date: 2017-02-17T05:17:43Z

          FLINK-5414 [table] Bump up Calcite version to 1.11


          githubbot ASF GitHub Bot added a comment -

          Github user fhueske commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r101728056

          — Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala —
          @@ -106,8 +106,16 @@ class ExpressionReducer(config: TableConfig)
          case SqlTypeName.ANY | SqlTypeName.ROW | SqlTypeName.ARRAY =>
          reducedValues.add(unreduced)
          case _ =>
          + val reducedValue = reduced.getField(reducedIdx)
          + // RexBuilder handle boolean literal incorrectly, convert it into BigDecimal manually
          — End diff –

          Can you check the comment? Shouldn't "boolean" be "double"?

          githubbot ASF GitHub Bot added a comment -

          Github user fhueske commented on the issue:

          https://github.com/apache/flink/pull/3338

          @twalthr, you might want to have a look at the `FlinkTypeFactory` changes

          githubbot ASF GitHub Bot added a comment -

          Github user wuchong commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r101750708

          — Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala —
          @@ -106,8 +106,16 @@ class ExpressionReducer(config: TableConfig)
          case SqlTypeName.ANY | SqlTypeName.ROW | SqlTypeName.ARRAY =>
          reducedValues.add(unreduced)
          case _ =>
          + val reducedValue = reduced.getField(reducedIdx)
          + // RexBuilder handle boolean literal incorrectly, convert it into BigDecimal manually
          — End diff –

          Yes, it should be `double`
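
          For illustration, a minimal sketch of such a manual conversion, assuming the reduced field arrives as a boxed `java.lang.Double`; the helper name is hypothetical and this is not the PR's actual code:

          ```scala
          // Hypothetical helper: Calcite's RexBuilder expects numeric literal values
          // as java.math.BigDecimal, so a reduced double field is wrapped manually
          // before it is handed back to the planner.
          def toRexLiteralValue(reducedValue: AnyRef): AnyRef = reducedValue match {
            case d: java.lang.Double => java.math.BigDecimal.valueOf(d.doubleValue())
            case other => other // all other values are passed through unchanged
          }
          ```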

          githubbot ASF GitHub Bot added a comment -

          Github user haohui commented on the issue:

          https://github.com/apache/flink/pull/3338

          Looks good to me overall. One question: does this mean that all array types become nullable after this change?

          githubbot ASF GitHub Bot added a comment -

          Github user wuchong commented on the issue:

          https://github.com/apache/flink/pull/3338

          Hi @fhueske, yes, Calcite forces a Calc after each aggregate that only renames fields, because we rename every aggregate in the Table API, which is not necessary. I changed the logic for getting projections on aggregates so that only duplicate aggregates are renamed. That works well; no extra Calc is appended anymore.

          Hi @haohui, the ArrayRelDataType is still NOT NULL. I reverted [that line](https://github.com/apache/flink/blob/master/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala#L121), which does not need to be changed in this PR.

          Cheers,
          Jark Wu
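
          As a rough, hypothetical sketch of the "rename only the duplicates" idea (illustrative names, not the actual ProjectionTranslator change):

          ```scala
          import scala.collection.mutable

          // Keep the first occurrence of each field name as-is and alias only the
          // later duplicates, so the planner does not need an extra renaming Calc.
          def renameOnlyDuplicates(names: Seq[String]): Seq[String] = {
            val seen = mutable.HashSet[String]()
            names.map { name =>
              var candidate = name
              var suffix = 0
              while (!seen.add(candidate)) { // add() returns false if the name is taken
                suffix += 1
                candidate = s"${name}_$suffix"
              }
              candidate
            }
          }
          ```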

          githubbot ASF GitHub Bot added a comment -

          Github user fhueske commented on the issue:

          https://github.com/apache/flink/pull/3338

          Thanks for the update @wuchong.
          The PR looks good to me. Feel free to merge.

          githubbot ASF GitHub Bot added a comment -

          Github user twalthr commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r101977305

          — Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/ProjectionTranslator.scala —
          @@ -108,62 +109,73 @@ object ProjectionTranslator {
          tableEnv: TableEnvironment,
          aggNames: Map[Expression, String],
          propNames: Map[Expression, String]): Seq[NamedExpression] = {
          - exprs.map(replaceAggregationsAndProperties(_, tableEnv, aggNames, propNames))
          -   .map(UnresolvedAlias)
          - }
          - private def replaceAggregationsAndProperties(
          + val projectNames: mutable.HashSet[String] = new mutable.HashSet[String]
          +
          + def replaceAggregationsAndProperties(
          — End diff –

          Can you rename this or the outer function? Having two functions with the same name is confusing.

          githubbot ASF GitHub Bot added a comment -

          Github user twalthr commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r101977849

          — Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala —
          @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
          "DataSetCalc",
          batchTableNode(0),
          term("select",

          - "13 AS _c0",
          + "CAST(13) AS _c0",
          — End diff –

          Do you know why there are so many unnecessary casts? Is it because of the different nullability?

          githubbot ASF GitHub Bot added a comment -

          Github user wuchong commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r102040448

          — Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala —
          @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
          "DataSetCalc",
          batchTableNode(0),
          term("select",

          - "13 AS _c0",
          + "CAST(13) AS _c0",
          — End diff –

          Yes, it is because I changed the default nullability to `true`, while the reduced constant is `NOT NULL`, so a `CAST` is inserted. Do you have any ideas how to fix this?

          The default nullability was changed to `true` because `UserDefinedScalarFunctionTest.testResults` and `ArrayTypeTest.testArrayLiterals` fail otherwise.
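
          For context, a small sketch of why the `CAST` appears: in Calcite, nullability is part of the canonized type, so a nullable INTEGER and a NOT NULL INTEGER are two different types. This uses plain Calcite APIs and is only an illustration, not Flink code:

          ```scala
          import org.apache.calcite.jdbc.JavaTypeFactoryImpl
          import org.apache.calcite.sql.type.SqlTypeName

          val typeFactory = new JavaTypeFactoryImpl()
          // types created via createSqlType are NOT NULL by default
          val notNullInt = typeFactory.createSqlType(SqlTypeName.INTEGER)
          val nullableInt = typeFactory.createTypeWithNullability(notNullInt, true)
          // The two canonized types differ, so when a NOT NULL reduced literal replaces
          // an expression whose declared type is nullable, the planner inserts a CAST.
          println(notNullInt == nullableInt) // false
          ```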

          githubbot ASF GitHub Bot added a comment -

          Github user twalthr commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r102046284

          — Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala —
          @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
          "DataSetCalc",
          batchTableNode(0),
          term("select",

          - "13 AS _c0",
          + "CAST(13) AS _c0",
          — End diff –

          I will have a look at it again. In general, the only real solution is to finally fix FLINK-5177.

          githubbot ASF GitHub Bot added a comment -

          Github user haohui commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r102117236

          — Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala —
          @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
          "DataSetCalc",
          batchTableNode(0),
          term("select",

          - "13 AS _c0",
          + "CAST(13) AS _c0",
          — End diff –

          Is it possible not to change the default nullability while adopting Calcite 1.11? Let me try it out as well.

          githubbot ASF GitHub Bot added a comment -

          Github user haohui commented on a diff in the pull request:

          https://github.com/apache/flink/pull/3338#discussion_r102125031

          — Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala —
          @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
          "DataSetCalc",
          batchTableNode(0),
          term("select",

          - "13 AS _c0",
          + "CAST(13) AS _c0",
          — End diff –

          I just played around a little bit. I think the problem is that the advanced types are not properly canonized. With the following diff, all tests in `ArrayTypeTest` pass:

          ```
          — a/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala
          +++ b/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala
          @@ -133,12 +133,18 @@ class FlinkTypeFactory(typeSystem: RelDataTypeSystem) extends JavaTypeFactoryImp
             override def createTypeWithNullability(
                 relDataType: RelDataType,
                 nullable: Boolean)
          -    : RelDataType = relDataType match {
          -      case composite: CompositeRelDataType =>
          -        // at the moment we do not care about nullability
          -        composite
          -      case _ =>
          -        super.createTypeWithNullability(relDataType, nullable)
          +    : RelDataType = {
          +      val t = relDataType match {
          +        case composite: CompositeRelDataType =>
          +          // at the moment we do not care about nullability
          +          composite
          +        case array: ArrayRelDataType =>
          +          val elementType = canonize(createTypeWithNullability(array.getComponentType, nullable))
          +          new ArrayRelDataType(array.typeInfo, elementType, nullable)
          +        case _ =>
          +          super.createTypeWithNullability(relDataType, nullable)
          +      }
          +      canonize(t)
             }
          ```

          GroupWindowTest is still failing because it is missing an identity projection. I'm wondering why `ProjectRemoveRule.INSTANCE` did not kick in...

          githubbot ASF GitHub Bot added a comment -

          GitHub user haohui opened a pull request:

          https://github.com/apache/flink/pull/3426

          FLINK-5414 [table] Bump up Calcite version to 1.11

          This PR resembles #3338 except that it canonizes the nullable types.

          @wuchong can you please take a look?

          You can merge this pull request into a Git repository by running:

          $ git pull https://github.com/haohui/flink FLINK-5414

          Alternatively you can review and apply these changes as the patch at:

          https://github.com/apache/flink/pull/3426.patch

          To close this pull request, make a commit to your master/trunk branch
          with (at least) the following in the commit message:

          This closes #3426


          commit 339f310dec67d40cd4ed4ecbabf09a0b0dcba518
          Author: Haohui Mai <wheat9@apache.org>
          Date: 2017-02-27T22:24:08Z

          FLINK-5414 [table] Bump up Calcite version to 1.11. (Jark Wu and Haohui Mai)


          githubbot ASF GitHub Bot added a comment -

          Github user wuchong commented on the issue:

          https://github.com/apache/flink/pull/3426

          Hi @haohui, the code looks very good to me. Thanks for investigating this; now waiting for CI to pass.

          @twalthr, you might want to have a look as well.

          githubbot ASF GitHub Bot added a comment -

          Github user twalthr commented on the issue:

          https://github.com/apache/flink/pull/3426

          Thanks @wuchong and @haohui. I will look at it today.

          githubbot ASF GitHub Bot added a comment -

          Github user twalthr commented on the issue:

          https://github.com/apache/flink/pull/3426

          @haohui your build is failing. Can you have a look at it again?

          githubbot ASF GitHub Bot added a comment -

          Github user haohui commented on the issue:

          https://github.com/apache/flink/pull/3426

          Sorry, I got stuck with something. I will update the PR in a day or two.

          githubbot ASF GitHub Bot added a comment -

          Github user haohui commented on the issue:

          https://github.com/apache/flink/pull/3426

          Fixed the unit tests.

          There are two additional changes:

          1. There are precision differences when converting `double` to `BigDecimal`; the affected unit tests were fixed accordingly (a small illustration follows below).
          2. When registering UDFs, Flink needs to distinguish between nullable and non-nullable types, so `UserDefinedFunctionUtils` was patched. We need a solution like FLINK-5177 to handle these cases systematically.
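
          A small, self-contained illustration of the precision difference mentioned in point 1 (illustrative only, not the PR's test code):

          ```scala
          val d = 0.1
          // constructs the exact binary value of the double
          val exact = new java.math.BigDecimal(d)          // 0.1000000000000000055511151231...
          // uses the shortest decimal string representation of the double
          val viaValueOf = java.math.BigDecimal.valueOf(d) // 0.1
          println(exact == viaValueOf) // false: different unscaled value and scale
          ```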

          githubbot ASF GitHub Bot added a comment -

          Github user haohui commented on the issue:

          https://github.com/apache/flink/pull/3426

          @twalthr can you please take another look? Thanks.

          githubbot ASF GitHub Bot added a comment -

          Github user twalthr commented on the issue:

          https://github.com/apache/flink/pull/3426

          Thanks for the update @haohui. I fixed the decimal issues by applying changes from @wuchong's PR. I think for now we should not do the primitive checking for scalar functions; this should be part of FLINK-5177. I will merge this now, but we should definitely solve FLINK-5177 soon. I will assign it to myself.

          githubbot ASF GitHub Bot added a comment -

          Github user asfgit closed the pull request at:

          https://github.com/apache/flink/pull/3338

          githubbot ASF GitHub Bot added a comment -

          Github user asfgit closed the pull request at:

          https://github.com/apache/flink/pull/3426

          twalthr Timo Walther added a comment -

          Fixed in 1.3.0: bec818d84a65a812290d49bca9cfd62de7379b1e

          githubbot ASF GitHub Bot added a comment -

          Github user docete commented on the issue:

          https://github.com/apache/flink/pull/3338

          @wuchong @haohui Could you help check FLINK-6173?


People

    • Assignee: jark Jark Wu
    • Reporter: twalthr Timo Walther
    • Votes: 0
    • Watchers: 4
