Details
Type: Bug
Status: Closed
Priority: Major
Resolution: Fixed
Affects Version/s: 1.12.1
Description
I wrote the following program according to the example code provided in the documentation under Table API / Row-based operations:
import static org.apache.flink.table.api.Expressions.$;
import static org.apache.flink.table.api.Expressions.call;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.functions.ScalarFunction;
import org.apache.flink.types.Row;

public class TableUDF {

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        Table input = tEnv.fromValues(
                DataTypes.of("ROW<c STRING>"),
                Row.of("name"));

        ScalarFunction func = new MyMapFunction();
        tEnv.registerFunction("func", func);

        Table table = input
                .map(call("func", $("c")).as("a", "b")); // exception occurs here

        table.execute().print();
    }

    public static class MyMapFunction extends ScalarFunction {

        public Row eval(String a) {
            return Row.of(a, "pre-" + a);
        }

        @Override
        public TypeInformation<?> getResultType(Class<?>[] signature) {
            return Types.ROW(Types.STRING, Types.STRING);
        }
    }
}
The code above would throw an exception like this:
Exception in thread "main" org.apache.flink.table.api.ValidationException: Only a scalar function can be used in the map operator.
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.map(OperationTreeBuilder.java:480)
	at org.apache.flink.table.api.internal.TableImpl.map(TableImpl.java:519)
	at org.apache.flink.ml.common.function.TableUDFBug.main(TableUDF.java:29)
The core of the program above is identical to the example in the Flink documentation, yet it does not work. This may affect users who want to use custom functions with the Table API.
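For reference, below is a minimal sketch of a possible workaround, assuming the cause is the legacy type inference used by the deprecated registerFunction/getResultType path: the function declares its row result type with @FunctionHint/@DataTypeHint and is registered via createTemporarySystemFunction. The class name TableUDFWorkaround and the field names are illustrative and not taken from the original report, and this is not necessarily the actual resolution of this ticket.

import static org.apache.flink.table.api.Expressions.$;
import static org.apache.flink.table.api.Expressions.call;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.annotation.FunctionHint;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.functions.ScalarFunction;
import org.apache.flink.types.Row;

public class TableUDFWorkaround {

    // Declares the structured result type through the new type inference
    // instead of overriding the legacy getResultType(...) method.
    @FunctionHint(output = @DataTypeHint("ROW<a STRING, b STRING>"))
    public static class MyMapFunction extends ScalarFunction {
        public Row eval(String a) {
            return Row.of(a, "pre-" + a);
        }
    }

    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        Table input = tEnv.fromValues(
                DataTypes.ROW(DataTypes.FIELD("c", DataTypes.STRING())),
                Row.of("name"));

        // createTemporarySystemFunction uses the new type inference stack,
        // unlike the deprecated registerFunction used in the report above.
        tEnv.createTemporarySystemFunction("func", MyMapFunction.class);

        // The output column names "a" and "b" come from the @FunctionHint above.
        Table table = input.map(call("func", $("c")));

        table.execute().print();
    }
}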
Attachments
Issue Links