Apache Hop (Retired) / HOP-4188

BigQuery: Channel ManagedChannelImpl not shutdown properly


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Information Provided
    • Affects Version/s: None
    • Fix Version/s: Not applicable
    • Component/s: Beam, BigQuery, Dataflow
    • Labels: None

    Description

      Some clean-up is apparently missing here: the gRPC channel opened for the BigQuery Storage API is never shut down, so gRPC logs an orphaned ManagedChannel warning. This happens on both the Beam Direct runner and Dataflow (stack traces below; sketches of the expected clean-up follow the traces).

      Beam-Direct:

      SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=11, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
          Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
      java.lang.RuntimeException: ManagedChannel allocation site
          at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
          at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
          at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
          at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:630)
          at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:297)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:404)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1900(InstantiatingGrpcChannelProvider.java:85)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:246)
          at com.google.api.gax.grpc.ChannelPool.<init>(ChannelPool.java:149)
          at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:96)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:256)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:234)
          at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:201)
          at com.google.cloud.bigquery.storage.v1.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:132)
          at com.google.cloud.bigquery.storage.v1.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:144)
          at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.<init>(BigQueryWriteClient.java:130)
          at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.create(BigQueryWriteClient.java:111)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1374)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:145)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:522)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:467)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:184)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase.extractFiles(BigQuerySourceBase.java:117)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase.split(BigQuerySourceBase.java:155)
          at org.apache.beam.runners.direct.BoundedReadEvaluatorFactory$InputProvider.getInitialInputs(BoundedReadEvaluatorFactory.java:216)
          at org.apache.beam.runners.direct.ReadEvaluatorFactory$InputProvider.getInitialInputs(ReadEvaluatorFactory.java:88)
          at org.apache.beam.runners.direct.RootProviderRegistry.getInitialInputs(RootProviderRegistry.java:80)
          at org.apache.beam.runners.direct.ExecutorServiceParallelExecutor.start(ExecutorServiceParallelExecutor.java:160)
          at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:213)
          at org.apache.hop.beam.engines.BeamPipelineEngine.executePipeline(BeamPipelineEngine.java:246)
          at org.apache.hop.beam.engines.BeamPipelineEngine.lambda$startThreads$0(BeamPipelineEngine.java:305)
          at java.base/java.lang.Thread.run(Thread.java:829)
      

      Dataflow:

      ERROR: *~*~*~ Channel ManagedChannelImpl{logId=3, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
          Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
      java.lang.RuntimeException: ManagedChannel allocation site
          at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
          at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
          at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
          at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:630)
          at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:297)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:404)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1900(InstantiatingGrpcChannelProvider.java:85)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:246)
          at com.google.api.gax.grpc.ChannelPool.<init>(ChannelPool.java:149)
          at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:96)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:256)
          at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:234)
          at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:201)
          at com.google.cloud.bigquery.storage.v1.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:132)
          at com.google.cloud.bigquery.storage.v1.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:144)
          at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.<init>(BigQueryWriteClient.java:130)
          at com.google.cloud.bigquery.storage.v1.BigQueryWriteClient.create(BigQueryWriteClient.java:111)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1374)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:145)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:522)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:467)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:184)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase.extractFiles(BigQuerySourceBase.java:117)
          at org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase.split(BigQuerySourceBase.java:155)
          at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:295)
          at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:217)
          at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:201)
          at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:180)
          at org.apache.beam.runners.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:82)
          at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:420)
          at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:389)
          at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:314)
          at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
          at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
          at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
          at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
          at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
          at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
          at java.base/java.lang.Thread.run(Thread.java:834)
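
      For context, the clean-up the warning asks for is the standard gRPC pattern sketched below. This is a minimal, self-contained illustration, not Hop or Beam code; the channel variable is a hypothetical stand-in for the one left open in the traces above.

      import io.grpc.ManagedChannel;
      import io.grpc.ManagedChannelBuilder;
      import java.util.concurrent.TimeUnit;

      public class ChannelShutdownSketch {
        public static void main(String[] args) throws InterruptedException {
          // Hypothetical channel, standing in for the one created by the
          // BigQuery Storage client in the traces above.
          ManagedChannel channel =
              ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
          try {
            // ... use the channel ...
          } finally {
            // The clean-up the orphan warning asks for: shut down, wait, force if needed.
            channel.shutdown();
            if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
              channel.shutdownNow();
            }
          }
        }
      }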
      
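
      Judging from the allocation site in both traces, the channel belongs to the BigQueryWriteClient that Beam's BigQueryServicesImpl creates in newBigQueryWriteClient(). For illustration only, client-side clean-up with the google-cloud-bigquerystorage API would look roughly like this; it is a standalone sketch under that assumption, not a patch for Beam or Hop, and the sketch class and variable names are made up.

      import com.google.cloud.bigquery.storage.v1.BigQueryWriteClient;
      import java.util.concurrent.TimeUnit;

      public class WriteClientCleanupSketch {
        public static void main(String[] args) throws Exception {
          // BigQueryWriteClient is a BackgroundResource (AutoCloseable), so
          // try-with-resources releases its transport channel(s) when done.
          try (BigQueryWriteClient writeClient = BigQueryWriteClient.create()) {
            // ... use writeClient ...
          }

          // Equivalent explicit variant, mirroring the advice in the warning.
          BigQueryWriteClient client = BigQueryWriteClient.create();
          try {
            // ... use client ...
          } finally {
            client.shutdown();
            if (!client.awaitTermination(30, TimeUnit.SECONDS)) {
              client.shutdownNow();
            }
          }
        }
      }

      Since the client is created internally by Beam's DatasetServiceImpl rather than by Hop or user code, the clean-up presumably has to happen in that class's close path; the sketch only shows the pattern, not where to apply it.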


          People

            Assignee: Unassigned
            Reporter: Fabian Peters (fbarthez)
