Pig / PIG-4059 Pig on Spark / PIG-4193

Make collected group work with Spark


    Details

    • Type: Sub-task
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: spark-branch
    • Component/s: spark
    • Labels: None

      Description

      Related e2e tests: CollectedGroup_1 - CollectedGroup_6

      Sample script:
      a = load '/user/pig/tests/data/singlefile/studenttab10k';
      b = order a by $0;
      store b into '/user/pig/out/praveenr-1411383735-nightly.conf/CollectedGroup_1.out.intermediate';
      exec;
      register ./lib/java/testudf.jar;
      c = load '/user/pig/out/praveenr-1411383735-nightly.conf/CollectedGroup_1.out.intermediate' using org.apache.pig.test.udf.storefunc.SimpleCollectableLoader();
      d = group c by $0 using 'collected';
      e = foreach d generate group, COUNT(c);
      store e into '/user/pig/out/praveenr-1411383735-nightly.conf/CollectedGroup_1.out';
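For context on what the script exercises: `group ... using 'collected'` is Pig's map-side group, which is legal only when all tuples for a key are already contiguous on each mapper (here guaranteed by the ORDER BY plus `SimpleCollectableLoader`), so grouping needs a single pass per partition and no shuffle. A minimal sketch of that semantics (illustration only, not code from Pig or from the patch):

```python
# Sketch of 'collected' group semantics: count tuples per key in one
# pass over a partition whose rows are already contiguous by key.
from itertools import groupby
from operator import itemgetter

def collected_group_count(partition):
    """Count rows per key within one key-contiguous partition (no shuffle)."""
    return [(key, sum(1 for _ in rows))
            for key, rows in groupby(partition, key=itemgetter(0))]

# Example partition sorted by $0, mirroring the sample script above.
partition = [("alice", 20), ("alice", 21), ("bob", 19)]
print(collected_group_count(partition))  # → [('alice', 2), ('bob', 1)]
```

If the contiguity assumption is violated, the same key would be counted more than once, which is why Pig requires a CollectableLoadFunc for this grouping mode.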

      Pig Stack Trace
      ---------------
      ERROR 0: java.lang.IllegalArgumentException: Spork unsupported PhysicalOperator: (Name: d: Map side group [tuple]{bytearray} - scope-28 Operator Key: scope-28)

      org.apache.pig.backend.executionengine.ExecException: ERROR 0: java.lang.IllegalArgumentException: Spork unsupported PhysicalOperator: (Name: d: Map side group [tuple]{bytearray} - scope-28 Operator Key: scope-28)
      at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:285)
      at org.apache.pig.PigServer.launchPlan(PigServer.java:1378)
      at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1363)
      at org.apache.pig.PigServer.execute(PigServer.java:1352)
      at org.apache.pig.PigServer.executeBatch(PigServer.java:403)
      at org.apache.pig.PigServer.executeBatch(PigServer.java:386)
      at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:170)
      at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:233)
      at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:204)
      at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
      at org.apache.pig.Main.run(Main.java:611)
      at org.apache.pig.Main.main(Main.java:164)
      Caused by: java.lang.IllegalArgumentException: Spork unsupported PhysicalOperator: (Name: d: Map side group [tuple]{bytearray} - scope-28 Operator Key: scope-28)
      at org.apache.pig.backend.hadoop.executionengine.spark.SparkLauncher.physicalToRDD(SparkLauncher.java:239)
      at org.apache.pig.backend.hadoop.executionengine.spark.SparkLauncher.physicalToRDD(SparkLauncher.java:232)
      at org.apache.pig.backend.hadoop.executionengine.spark.SparkLauncher.physicalToRDD(SparkLauncher.java:232)
      at org.apache.pig.backend.hadoop.executionengine.spark.SparkLauncher.launchPig(SparkLauncher.java:140)
      at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:279)
      ... 11 more
      ================================================================================
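      The trace shows `SparkLauncher.physicalToRDD` rejecting the map-side group operator outright, i.e. the Spark backend had no translation registered for it. A hypothetical illustration of that failure mode (names and structure are assumptions, not the actual SparkLauncher code): a dispatch table from physical operator type to converter, where an unregistered type raises exactly this kind of IllegalArgumentException, and "making collected group work" amounts to registering a converter for it.

```python
# Hypothetical dispatch-table sketch of the failure in the trace above.
# Operator and converter names are illustrative, not Pig's actual classes.
CONVERTERS = {
    "POLoad": "LoadConverter",
    "POForEach": "ForEachConverter",
    "POStore": "StoreConverter",
}

def physical_to_rdd(op_type):
    """Look up the Spark converter for a physical operator type."""
    try:
        return CONVERTERS[op_type]
    except KeyError:
        raise ValueError(f"Spork unsupported PhysicalOperator: {op_type}")

# Unregistered operator -> the error seen in the stack trace:
try:
    physical_to_rdd("POCollectedGroup")
except ValueError as e:
    print(e)  # → Spork unsupported PhysicalOperator: POCollectedGroup

# Registering a converter removes the failure:
CONVERTERS["POCollectedGroup"] = "CollectedGroupConverter"
print(physical_to_rdd("POCollectedGroup"))  # → CollectedGroupConverter
```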

        Attachments

        1. PIG-4193-1.patch
          10 kB
          Praveen Rachabattuni
        2. PIG-4193-2.patch
          23 kB
          Praveen Rachabattuni
        3. PIG-4193-3.patch
          36 kB
          Praveen Rachabattuni
        4. Screenshot 2015-03-12 13.39.23.png
          34 kB
          Mohit Sabharwal
        5. Screenshot 2015-03-12 13.39.23.png
          34 kB
          Praveen Rachabattuni

          Issue Links

            Activity

              People

              • Assignee: praveenr019 Praveen Rachabattuni
              • Reporter: praveenr019 Praveen Rachabattuni
              • Votes: 0
              • Watchers: 5

                Dates

                • Created:
                  Updated:
                  Resolved: