04-25-2019 11:00 PM
USING PERIODIC COMMIT
LOAD CSV WITH HEADERS FROM "url" AS line
MERGE (p:t1 {pid: line.pid})
MERGE (a:t2 {aid: line.aid})
CREATE (p)-[:ASSIGNED_TO]->(a)
Note: nodes are indexed on pid and aid; relationships are not indexed.
File I am importing:
a 20 MB CSV with one pid column and one aid column (1 million rows)
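Since the MERGE clauses do lookups on pid and aid, unique constraints (which are backed by indexes) are generally preferred over plain indexes for this pattern. A sketch in Neo4j 3.x Cypher, assuming the labels and properties from the query above:

```cypher
CREATE CONSTRAINT ON (p:t1) ASSERT p.pid IS UNIQUE;
CREATE CONSTRAINT ON (a:t2) ASSERT a.aid IS UNIQUE;
```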
config
dbms.security.procedures.unrestricted=apoc.trigger.*,apoc.meta.*,algo.*
dbms.security.procedures.whitelist=apoc.coll.*,apoc.load.*,algo.*
dbms.directories.plugins=/plugins
dbms.directories.import=import
dbms.memory.heap.initial_size=512M
dbms.memory.heap.max_size=512M
dbms.memory.pagecache.size=512M
dbms.connectors.default_listen_address=0.0.0.0
dbms.connectors.default_advertised_address=9.0.21.130
dbms.connector.bolt.enabled=true
#dbms.connector.bolt.tls_level=OPTIONAL
dbms.connector.bolt.listen_address=0.0.0.0:7687
dbms.connector.http.enabled=true
dbms.connector.http.listen_address=0.0.0.0:7474
dbms.connector.https.enabled=true
dbms.connector.https.listen_address=0.0.0.0:7473
dbms.mode=SINGLE
causal_clustering.expected_core_cluster_size=1
causal_clustering.initial_discovery_members=9.0.21.130:5000
causal_clustering.raft_advertised_address=9.0.21.130:7000
causal_clustering.transaction_advertised_address=9.0.21.130:6000
dbms.tx_log.rotation.retention_policy=100M size
dbms.jvm.additional=-XX:+UseG1GC
dbms.jvm.additional=-XX:-OmitStackTraceInFastThrow
dbms.jvm.additional=-XX:+AlwaysPreTouch
dbms.jvm.additional=-XX:+UnlockExperimentalVMOptions
dbms.jvm.additional=-XX:+TrustFinalNonStaticFields
dbms.jvm.additional=-XX:+DisableExplicitGC
dbms.jvm.additional=-Djdk.tls.ephemeralDHKeySize=2048
dbms.windows_service_name=neo4j
dbms.jvm.additional=-Dunsupported.dbms.udc.source=tarball
wrapper.java.additional=-Dneo4j.ext.udc.source=docker
causal_clustering.discovery_advertised_address=9.0.21.130:5000
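A 512M heap is small for importing a million rows in a single statement; if the machine has the RAM, larger heap and page cache settings may avoid the heap error. These are illustrative values only, not a recommendation for this specific host:

```
dbms.memory.heap.initial_size=2G
dbms.memory.heap.max_size=2G
dbms.memory.pagecache.size=1G
```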
The problem I am facing is that my server crashes (and apparently doesn't recover) when I try to import 200K records; it works when I import 100K rows. The error I get is a Java heap space error.
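One alternative to USING PERIODIC COMMIT is apoc.periodic.iterate, which runs the same import in separate smaller transactions (note the whitelist above covers apoc.coll.* and apoc.load.*, so apoc.periodic.* may need to be added). A sketch with a hypothetical batch size:

```cypher
CALL apoc.periodic.iterate(
  "LOAD CSV WITH HEADERS FROM 'url' AS line RETURN line",
  "MERGE (p:t1 {pid: line.pid})
   MERGE (a:t2 {aid: line.aid})
   CREATE (p)-[:ASSIGNED_TO]->(a)",
  {batchSize: 10000, iterateList: true}
)
```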
04-26-2019 12:15 AM
Can you add the query plan from an EXPLAIN of your query (please expand all elements of the plan)?