
memory config for loading data

Hello,

I need some help, please.

I am loading the SSB (Star Schema Benchmark) data into a Neo4j database. For this I am using a "USING PERIODIC COMMIT ...... LOAD CSV ...." script over CSV files I generated from the SSB database.
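The statement has roughly this shape (the file name, label, and properties below are placeholders, not my real schema):

USING PERIODIC COMMIT 10000
LOAD CSV WITH HEADERS FROM 'file:///lineorder.csv' AS row
// MERGE on a key property; one commit every 10,000 rows
MERGE (o:LineOrder {orderKey: toInteger(row.LO_ORDERKEY)})
SET o.quantity = toInteger(row.LO_QUANTITY);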

However, one of the files contains 23 million lines, and loading it takes far too long.

I have a virtual machine with 64 GB of RAM running Ubuntu, and I have configured my neo4j.conf as follows:

dbms.memory.heap.initial_size=40960m
dbms.memory.heap.max_size=40960m

dbms.memory.pagecache.size=20480m

Could you suggest an optimal configuration to speed up the loading?

Thank you in advance,

redha

Translated with DeepL

1 ACCEPTED SOLUTION

Hello @redha_benhisse1 😀

For large CSV files, you should use neo4j-admin import.
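It performs an offline bulk load, which is far faster than LOAD CSV, but it only works when populating a new, empty database. A minimal sketch (the file names and labels are placeholders; your CSV headers must provide :ID, :START_ID, and :END_ID columns, and SSB's pipe-delimited files would need --delimiter):

# run while the database is stopped; imports into a new, empty database
neo4j-admin import \
    --delimiter="|" \
    --nodes=Customer=import/customer.csv \
    --nodes=Part=import/part.csv \
    --relationships=ORDERED=import/lineorder.csv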

Moreover, do you have unique constraints on your nodes? Without an index or unique constraint, every MERGE has to scan all existing nodes, which is usually what makes a 23-million-row load crawl.
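A unique constraint also gives you the index that makes each MERGE lookup fast. For example (the label and property are placeholders, and the exact syntax depends on your Neo4j version):

CREATE CONSTRAINT ON (o:LineOrder) ASSERT o.orderKey IS UNIQUE;

On Neo4j 4.4 and later this becomes CREATE CONSTRAINT FOR (o:LineOrder) REQUIRE o.orderKey IS UNIQUE.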

Finally, you can use neo4j-admin memrec to help you configure the database memory.
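For example (--memory tells it how much of the machine's RAM to budget):

neo4j-admin memrec --memory=64g

It prints recommended values for dbms.memory.heap.initial_size, dbms.memory.heap.max_size, and dbms.memory.pagecache.size that you can copy into neo4j.conf. A 40 GB heap is likely too large anyway: above roughly 31 GB the JVM loses compressed pointers, and for data loading the page cache usually matters more than the heap.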

Regards,
Cobra


2 REPLIES


Hi,

neo4j-admin import solved my problem.

Thank you very much!