
Neo4j getting disconnected again and again

12kunal34
Graph Fellow

In Neo4j, I am facing a strange problem. When I am performing a long-running operation like a data import, the server gets disconnected after some time, and nothing helps except a restart; only then can I log in to Neo4j again.
Here I am just running a batch import using a jar, running the same query again and again with different data.

7 REPLIES

This could be a sign that you aren't actually batching the data; you might see out-of-memory errors or frequent GC pauses in the debug.log.

Can you provide the Cypher or code or pseudocode you're using for this?

Also please take a look at these batching tips and tricks.

Hi Andrew, I am using batch import.
Please find the query below:

UNWIND $sss AS row
MERGE (c:Entity {ID: row.id})
ON CREATE SET c.PROJECT_IDS = row.project_ids, c.HAS = row.has

Here $sss is a list of data.
I am passing 2000 records at once and running the above query in a loop, so the data keeps getting inserted batch after batch.

I am getting the error below after some time:

Exception in run: org.neo4j.driver.v1.exceptions.TransientException: There is not enough memory to perform the current task. Please try increasing 'dbms.memory.heap.max_size' in the neo4j configuration (normally in 'conf/neo4j.conf' or, if you are using Neo4j Desktop, found through the user interface) or, if you are running an embedded installation, increase the heap by using the '-Xmx' command line flag, and then restart the database.

I am giving 3 GB as heap memory right now.

2k per batch seems reasonable. Are you committing the transaction and creating a new transaction with each batch?
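For reference, here is roughly the pattern I mean, sketched against the 1.x Java driver that your stack trace suggests you're using. The bolt URI, the credentials, and the loadRows() helper are placeholders for your own setup:

import org.neo4j.driver.v1.*;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BatchImport {

    public static void main(String[] args) {
        Driver driver = GraphDatabase.driver("bolt://localhost:7687",
                AuthTokens.basic("neo4j", "password"));

        List<Map<String, Object>> allRows = loadRows(); // placeholder: however you read your source data
        int batchSize = 2000;

        try (Session session = driver.session()) {
            for (int i = 0; i < allRows.size(); i += batchSize) {
                List<Map<String, Object>> batch =
                        allRows.subList(i, Math.min(i + batchSize, allRows.size()));

                Map<String, Object> params = new HashMap<>();
                params.put("sss", batch);

                // One short-lived transaction per batch: it is committed and
                // closed before the next batch starts, so the server can
                // release the memory held by the previous transaction.
                try (Transaction tx = session.beginTransaction()) {
                    tx.run("UNWIND $sss AS row "
                            + "MERGE (c:Entity {ID: row.id}) "
                            + "ON CREATE SET c.PROJECT_IDS = row.project_ids, c.HAS = row.has",
                            params);
                    tx.success(); // mark for commit; the commit happens on close
                }
            }
        }
        driver.close();
    }

    // Placeholder for the caller's own data loading.
    private static List<Map<String, Object>> loadRows() {
        throw new UnsupportedOperationException("replace with your data source");
    }
}

The key point is that each batch gets its own transaction. If everything runs inside one long transaction, the server has to hold the state for all 200k updates in the heap at once, which would match the error you're seeing.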

Also, do you have an index on :Entity(ID)?
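On a 3.x server that index would be created with:

CREATE INDEX ON :Entity(ID)

and CALL db.indexes() will show whether it exists and is ONLINE.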

Yes, I do have an index on the entity ID, and I am putting the above query in a for loop; each iteration passes 2k records to the query. Since I have 200k records, it must run 100 times.

Just to double-check, can you run an EXPLAIN of the query and add the query plan (after expanding all elements)?
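That is, prefix your exact query with EXPLAIN:

EXPLAIN UNWIND $sss AS row
MERGE (c:Entity {ID: row.id})
ON CREATE SET c.PROJECT_IDS = row.project_ids, c.HAS = row.has

(If the browser complains about the missing parameter, bind a small sample list first.) If the index is being picked up, you should see an index seek on :Entity(ID) in the plan rather than a NodeByLabelScan.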

3GB of memory seems very low here. We typically recommend 8-16GB heap.
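For example, in conf/neo4j.conf (the exact values depend on how much RAM the machine has; a restart is required):

dbms.memory.heap.initial_size=8g
dbms.memory.heap.max_size=8g

Leave room for dbms.memory.pagecache.size and the operating system as well.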

Have you tried reducing the batch size to 1k at a time?

Also, what version of Neo4j are you using?

I am using Neo4j Browser version: 3.2.10
Neo4j Server version: 3.4.9

And yes, I tried with 1k at a time, but the same problem is there.

Thanks, please add the EXPLAINed query plan.

Can you confirm again that you are explicitly committing and closing each transaction separately, and opening a new transaction with each new batch?