Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Won't Fix
Description
Hello,
I am experiencing a performance issue with CouchDB.
Use Case: I am working on a process that retrieves data from an RDBMS, transforms it into JSON documents, and POSTs them to CouchDB.
I am trying to POST around half a million documents, most of them in batches (_bulk_docs) of 10,000; I have also tried batches of 5,000, 15,000, and 20,000.
The whole process takes around 90-100 minutes.
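For reference, the batching logic looks roughly like this (a minimal sketch; the database URL and document shapes are placeholders, not the actual process):

```python
import json
from itertools import islice
from urllib import request

# Placeholder URL; the real process points at its own database.
COUCH_URL = "http://localhost:5984/mydb/_bulk_docs"

def batches(docs, size=10000):
    """Yield successive lists of at most `size` documents."""
    it = iter(docs)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def post_batch(batch):
    """POST one batch to the _bulk_docs endpoint as {"docs": [...]}."""
    payload = json.dumps({"docs": batch}).encode("utf-8")
    req = request.Request(COUCH_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return resp.status

# Example: rows already converted from the RDBMS into dicts.
docs = [{"_id": str(i), "value": i} for i in range(25000)]
n_batches = sum(1 for _ in batches(docs, 10000))  # → 3 (10,000 + 10,000 + 5,000)
# In the real run, each batch would be sent with post_batch(batch).
```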
Over the life of the process, CouchDB's memory consumption keeps growing, and the memory is not released when CouchDB has finished working.
So if CouchDB's memory consumption was at 60% when the process finished, it remains at 60% and does not drop.
When the process starts running again, memory consumption maxes out and CouchDB restarts itself, which fails the process I am running. In the syslogs I see an Out Of Memory error for the CouchDB process and the corresponding kill statement.
The CouchDB process with the issue is Erlang's "beam.smp".
I have tried upgrading the server's memory to see if that resolves the issue, but unfortunately the issue persists: the leak is still there, and usage keeps growing until CouchDB restarts/crashes.
I also tried running garbage collection from the Erlang shell (erlang:garbage_collect().), but it didn't do anything.
At this point I am out of ideas and not sure what is going on here. Any input/suggestions are highly appreciated!
Environment:
- Platform: Linux (Red Hat release 6.4 (Santiago))
- CouchDB: 1.3 (also tried 1.5)
- RAM: tried with 2G, 4G, and 8G
- CPU: 2 cores
- Process: /usr/lib64/erlang/erts-5.8.5/bin/beam.smp -Bd -K true -A 4 -- -root /usr/lib64/erlang