[BEAM-7057] EmbeddedMetastoreService fails executing inserts


    Details

    • Type: Bug
    • Status: Open
    • Priority: P3
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: io-java-hcatalog
    • Labels: None

      Description

      I am reusing the `EmbeddedMetastoreService` from the hcatalog tests artifact for my own unit testing of Hive-dependent code. When I try to insert values into a table via `executeQuery(..)`, I see the following exception in the logs:

      java.lang.ClassCastException: org.apache.hadoop.hive.ql.optimizer.calcite.HiveTypeSystemImpl cannot be cast to org.apache.beam.repackaged.beam_sdks_java_extensions_sql.org.apache.calcite.rel.type.RelDataTypeSystem
      at org.apache.beam.repackaged.beam_sdks_java_extensions_sql.org.apache.calcite.jdbc.CalciteConnectionImpl.<init>(CalciteConnectionImpl.java:125)
      at org.apache.beam.repackaged.beam_sdks_java_extensions_sql.org.apache.calcite.jdbc.CalciteJdbc41Factory$CalciteJdbc41Connection.<init>(CalciteJdbc41Factory.java:115)
      at org.apache.beam.repackaged.beam_sdks_java_extensions_sql.org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:59)
      at org.apache.beam.repackaged.beam_sdks_java_extensions_sql.org.apache.calcite.jdbc.CalciteJdbc41Factory.newConnection(CalciteJdbc41Factory.java:44)
      at org.apache.beam.repackaged.beam_sdks_java_extensions_sql.org.apache.calcite.jdbc.CalciteFactory.newConnection(CalciteFactory.java:53)
      at org.apache.beam.repackaged.beam_sdks_java_extensions_sql.org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
      at java.sql.DriverManager.getConnection(DriverManager.java:664)
      at java.sql.DriverManager.getConnection(DriverManager.java:208)
      at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:140)
      at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:105)
      at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:609)
      at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:246)
      at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10133)
      at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:209)
      at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
      at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:424)
      ...
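
      For context, here is a minimal sketch of the usage that triggers this. The class and its `executeQuery(String)` helper come from the hcatalog tests artifact; the package name, constructor argument, table layout and paths below are my assumptions for illustration only:

      import org.apache.beam.sdk.io.hcatalog.test.EmbeddedMetastoreService;

      public class EmbeddedMetastoreInsertRepro {
        public static void main(String[] args) throws Exception {
          // Assumed constructor: a scratch directory for the embedded metastore/warehouse.
          try (EmbeddedMetastoreService service = new EmbeddedMetastoreService("/tmp/hive-test")) {
            // DDL statements go through fine.
            service.executeQuery("create table test_table (name string, age int) stored as orc");

            // This statement fails: Hive's CalcitePlanner ends up loading Beam's repackaged
            // Calcite classes, producing the ClassCastException shown above.
            service.executeQuery("insert into test_table values ('alice', 30)");
          }
        }
      }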

      I realise this component is not a public API, but I wanted to ask about / point out this behaviour. Internally, the hcatalog test code inserts data via DataTransferFactory rather than via executeQuery().
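
      For comparison, a rough sketch of that DataTransferFactory write path (HCatalog's data transfer API); the table name, record layout and the HiveConf-to-map conversion are illustrative rather than copied from the Beam sources:

      import java.util.Collections;
      import java.util.HashMap;
      import java.util.Iterator;
      import java.util.Map;

      import org.apache.hadoop.hive.conf.HiveConf;
      import org.apache.hive.hcatalog.data.DefaultHCatRecord;
      import org.apache.hive.hcatalog.data.HCatRecord;
      import org.apache.hive.hcatalog.data.transfer.DataTransferFactory;
      import org.apache.hive.hcatalog.data.transfer.HCatWriter;
      import org.apache.hive.hcatalog.data.transfer.WriteEntity;
      import org.apache.hive.hcatalog.data.transfer.WriterContext;

      public class DataTransferInsert {

        // Flatten the HiveConf into the String map that DataTransferFactory expects.
        static Map<String, String> asMap(HiveConf conf) {
          Map<String, String> map = new HashMap<>();
          for (Map.Entry<String, String> entry : conf) {
            map.put(entry.getKey(), entry.getValue());
          }
          return map;
        }

        static void writeOneRecord(HiveConf hiveConf) throws Exception {
          Map<String, String> config = asMap(hiveConf);
          WriteEntity entity =
              new WriteEntity.Builder().withDatabase("default").withTable("test_table").build();

          // The "master" writer prepares the write and hands out a context...
          HCatWriter master = DataTransferFactory.getHCatWriter(entity, config);
          WriterContext context = master.prepareWrite();

          // ...which is then used to push records directly, bypassing the SQL/Calcite planner.
          HCatRecord record = new DefaultHCatRecord(2);
          record.set(0, "alice");
          record.set(1, 30);
          Iterator<HCatRecord> records = Collections.singletonList(record).iterator();
          DataTransferFactory.getHCatWriter(context).write(records);

          master.commit(context);
        }
      }

      Because this path never goes through Hive's SQL compiler, it never touches the Calcite classes involved in the clash.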

      Could this exception be fixed somehow?

            People

            • Assignee: Unassigned
            • Reporter: Jozef Vilcek (JozoVilcek)
            • Votes: 0
            • Watchers: 2
