Details
Type: New Feature
Status: Closed
Priority: Major
Resolution: Abandoned
Fix Version/s: 0.7.0-incubating
Component/s: None
Description
Flink should offer an easy way to group datasets and compute the sizes of the resulting groups. This is one of the most essential operations in distributed processing, yet it is very hard to implement in Flink.
I suspect this could be a show-stopper for people trying out Flink, because at the moment users have to perform the grouping themselves and then write a groupReduce function that simultaneously counts the tuples in the group and extracts the group key.
Here is what I would semantically expect to happen:
def groupCount[T, K](data: DataSet[T], extractKey: (T) => K): DataSet[(K, Long)] = { data.groupBy { extractKey } .reduceGroup { group => countBy(extractKey, group) } } private[this] def countBy[T, K](extractKey: T => K, group: Iterator[T]): (K, Long) = { val key = extractKey(group.next()) var count = 1L while (group.hasNext) { group.next() count += 1 } key -> count }