10-14-2021 03:46 AM
Hi All,
I am using the Neo4j Desktop version. The hardware configuration of the system is as follows:
RAM: 8 GB
Storage: 128 GB
Tried loading 5.3 billion nodes into the system, but it was unsuccessful: the error thrown was "heap size exceeded". I tried again after increasing the maximum heap size to 6 GB, but it failed with the same error.
I also tried loading the same 5.3 billion records as relationships between the nodes. The load command had been running for one and a half hours, so I terminated it; no relationships were created.
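For reference, the heap increase corresponds to the following neo4j.conf entry (the key name assumes Neo4j 4.x, as bundled with Neo4j Desktop):

# maximum JVM heap available to the database (the value I set)
dbms.memory.heap.max_size=6G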
It would be great if I could get answers to the following questions regarding performance:
Thanks & Regards,
Vinayak
10-14-2021 08:38 AM
Hi, @vinayak.bali!
I would recommend using apoc.periodic.iterate() to load your data in transactional batches, optionally in parallel. Because each batch is committed in its own transaction, heap memory is released after every batch.
An example of usage is:
CALL apoc.periodic.iterate(
  'CALL apoc.load.jdbc("jdbc:mysql://localhost:3306/northwind?user=root","company")',
  'CREATE (p:Person) SET p += row',
  { batchSize:10000, parallel:true })
YIELD batches, total
RETURN batches, total
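For the relationship load mentioned above, a similar batched approach could look like the sketch below. This is only a sketch: it assumes the relationships come from a CSV of node-id pairs, and the :Person label, the id property, the CONNECTED_TO relationship type, and the start_id/end_id column names are all placeholders for whatever your data actually uses.

CALL apoc.periodic.iterate(
  // outer statement: stream the CSV rows
  'CALL apoc.load.csv("file:///relationships.csv") YIELD map RETURN map',
  // inner statement: executed once per row, committed in batches
  'MATCH (a:Person {id: map.start_id})
   MATCH (b:Person {id: map.end_id})
   CREATE (a)-[:CONNECTED_TO]->(b)',
  { batchSize:10000, parallel:false })
YIELD batches, total
RETURN batches, total

parallel:false is deliberate here: concurrent batches writing relationships on the same nodes can deadlock on node locks. An index on :Person(id) is also essential so the MATCH lookups stay fast at this scale.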