08-06-2020 12:16 AM
Neo4j Server version: 4.0.0 (community)
Neo4j Browser version: 4.0.1
Operating System: macOS Mojave 10.14.6
settings:
- NEO4J_dbms_memory_pagecache_size=2G
Hi Community,
I'm using the Docker version of the Neo4j server. Everything is fine, except that the server crashes and restarts when a long-running query is executed.
An example of such a query is: MATCH (n) DETACH DELETE n
when there are many nodes, or when a complicated search is run.
Has anyone faced this kind of problem? Or is something wrong with my settings?
Solved!
08-06-2020 04:27 AM
Welcome to the community.
When you run a query like that, it runs as a single transaction, so you need enough heap memory to complete the whole operation.
How much system memory do you have available?
If the server crashes on a complex query, it could be for the same reason: not enough heap memory available.
There are a few options here.
For stability purposes you can add a memory guard for queries.
By default, the heap a query may use is unlimited.
Please take a look at this configuration:
By setting dbms.memory.transaction.max_size you limit how much memory a single transaction can use; if a transaction exceeds that size, its query is killed.
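A minimal sketch of how that setting could be applied to the Docker setup described above; the 1G limit here is an illustrative value, not taken from the thread:

```
# neo4j.conf
dbms.memory.transaction.max_size=1G

# Equivalent environment variable for the official Neo4j Docker image
# (dots in the setting name become underscores, underscores become
# double underscores):
NEO4J_dbms_memory_transaction_max__size=1G
```

With this cap in place, a transaction that grows past the limit is aborted instead of exhausting the heap and taking the whole server down.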
08-08-2020 12:46 AM
Thanks for the answer!
I'm trying this on my own laptop with many other processes running, so only about 2 GB of memory is free. Setting dbms.memory.transaction.max_size helps a lot.
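As a related note: the crash-prone MATCH (n) DETACH DELETE n can also be split into many small transactions, so that no single transaction needs the full heap. A hedged sketch using the APOC library's apoc.periodic.iterate (this assumes APOC is installed, which the thread does not mention; the batch size is illustrative):

```cypher
// Delete all nodes in batches of 10,000, each batch in its own transaction.
CALL apoc.periodic.iterate(
  "MATCH (n) RETURN n",   // outer query: stream the nodes to delete
  "DETACH DELETE n",      // inner query: executed once per batch
  {batchSize: 10000}
);
```

Each committed batch frees its memory before the next one starts, which keeps the peak heap usage bounded regardless of how many nodes are in the store.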