
Neo4j Thread Pool Configuration Recommendations

shweta
Node Link

Hi,

We are using Neo4j 3.4.1 Enterprise edition.

We have recently been facing > 90% CPU usage during high load. This is very problematic as all our queries are taking longer to execute.

Configuration

Our DB size is ~2 TB. Our machine is an r4.4xlarge (16 cores, 122 GB RAM).

We have < 100 active users, but we run heavy batch write operations.

Our thread pool configuration is:
min size: 600
max size: 3000
keep-alive: 10 min
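
For reference, assuming these map to the Bolt connector thread pool settings in neo4j.conf, the configuration looks roughly like this:

# Bolt connector thread pool (setting names as in the Neo4j 3.4 docs;
# values as described above)
dbms.connector.bolt.thread_pool_min_size=600
dbms.connector.bolt.thread_pool_max_size=3000
dbms.connector.bolt.thread_pool_keep_alive=10m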

Sysinfo

It is hard for us to conclude what is causing this high CPU usage and how we can fix it.

We have already optimized our queries, and we can't find hardware recommendations for a graph of our size using the calculator.

These are our questions:

  1. Will upgrading the machine help? Does adding more cores and RAM improve Neo4j performance? (Specifically, changing to an r4.8xlarge machine with 32 cores and 256 GB RAM; see the sketch after this list.)
  2. Is there a limit to the maximum pool size? What are the cons if I increase the pool size from 3000 to 6000?
  3. Are there any other recommendations to tackle the high CPU usage?
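
Regarding question 1, my understanding is that extra RAM mostly helps through the heap and the page cache, so a bigger machine only pays off if those are resized too. A rough sketch of how a 256 GB machine might be split in neo4j.conf (illustrative values only, not our current settings; I believe neo4j-admin memrec can also suggest values for a given machine):

# Illustrative memory split for a 256 GB machine
dbms.memory.heap.initial_size=31g
dbms.memory.heap.max_size=31g
dbms.memory.pagecache.size=160g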

Cheers
Shweta

4 REPLIES

shweta
Node Link

Hi,

Any recommendations on the above?

We're quite stuck at the moment figuring out the next steps.

Cheers
Shweta

Hi Shweta,

It depends a bit on the queries you run. Can you check how the server's storage I/O looks during the CPU peaks?
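
If you don't already have monitoring for that, something like iostat on the host (assuming Linux) will show it while the CPU is spiking:

# Extended device stats every 5 seconds; watch %util and await
# on the volume holding the Neo4j store files
iostat -x 5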

Hi Tim,

This is the graph of read and write transactions during the time frame where we saw high CPU.
[screenshot: transaction read/write graph]

And this is the graph of the page faults.

[screenshot: page fault graph]

What are your thoughts?

The data that we fetch is not repetitive in nature; we don't fetch the same data again and again, so increasing the page cache size might not be the best solution for us.
Although I am not too sure.
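
One way we could sanity-check that (assuming the JMX beans are exposed as in a standard 3.x install) is to compare page cache faults against hits over time with dbms.queryJmx:

// List the page cache JMX bean and its attributes (faults,
// evictions, bytes read, etc.) to see how often reads miss the cache
CALL dbms.queryJmx("org.neo4j:*")
YIELD name, attributes
WHERE name CONTAINS "Page cache"
RETURN name, attributes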

Thanks
Shweta

Hey guys,

Any inputs on the above?

Cheers
Shweta