BEAM-13777: Confluent Schema Registry cache capacity

Details

    • Type: Bug
    • Status: Resolved
    • Priority: P2
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.37.0
    • Component/s: sdk-java-core
    • Labels: None

    Description

      The cache capacity should be specified as an input parameter instead of defaulting to the maximum integer value. Usage varies considerably from case to case, and a default of Integer.MAX_VALUE can, depending on the setup, lead to errors such as:
      Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

      Documentation on the parameter: https://docs.confluent.io/5.4.2/clients/confluent-kafka-dotnet/api/Confluent.SchemaRegistry.CachedSchemaRegistryClient.html#Confluent_SchemaRegistry_CachedSchemaRegistryClient_DefaultMaxCachedSchemas
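      To illustrate the idea behind the request (this is a minimal sketch, not the actual Beam or Confluent API; the class and method names are hypothetical), a size-bounded LRU cache in Java where the capacity is a constructor argument instead of a hard-coded Integer.MAX_VALUE:

      ```java
      import java.util.LinkedHashMap;
      import java.util.Map;

      // Hypothetical sketch: a schema cache whose capacity is supplied by the
      // caller. Once the capacity is exceeded, the least-recently-used entry
      // is evicted, so the cache cannot grow until the heap is exhausted.
      public class BoundedSchemaCache<K, V> {
          private final Map<K, V> cache;

          public BoundedSchemaCache(final int maxCapacity) {
              // Access-ordered LinkedHashMap; removeEldestEntry evicts the
              // least-recently-used entry when the size exceeds maxCapacity.
              this.cache = new LinkedHashMap<K, V>(16, 0.75f, true) {
                  @Override
                  protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                      return size() > maxCapacity;
                  }
              };
          }

          public void put(K key, V value) {
              cache.put(key, value);
          }

          public V get(K key) {
              return cache.get(key);
          }

          public int size() {
              return cache.size();
          }
      }
      ```

      With a capacity of 2, inserting a third schema evicts the least-recently-used one, keeping memory use bounded regardless of how many distinct schemas pass through.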

People

      Assignee: Mostafa Aghajani
      Reporter: Mostafa Aghajani
      Votes: 0
      Watchers: 1


Time Tracking

      Estimated: Not Specified
      Remaining: 0h
      Logged: 0.5h