
"java.lang.OutOfMemoryError: Java heap space" when running "ga.nlp.annotate" using GraphAware NLP

doug
Node Clone

Windows 10
32GB RAM
8-core Xeon processor at 3.4GHz
Neo4j 3.4.7
Neo4j Browser 3.2.13
apoc-3.4.0.3.jar
graphaware-nlp-3.4.7.52.13.jar
graphaware-server-community-all-3.4.7.52.jar
nlp-stanfordnlp-3.4.7.52.13.jar
stanford-english-corenlp-2018-10-05-models.jar

Hi. I am trying to annotate all the text fields in my database. There are 25,532 nodes with text values.

I'm using the following query to do this:

    CALL apoc.periodic.iterate(
      "MATCH (n:FreeTextResponse) WHERE NOT (n)-[:HAS_ANNOTATED_TEXT]->() RETURN n",
      "CALL ga.nlp.annotate({text: n.fullSentenceString, id: id(n), checkLanguage: false})
       YIELD result MERGE (n)-[:HAS_ANNOTATED_TEXT]->(result)",
      {batchSize: 1, iterateList: false})

...and am getting the following error:

java.lang.OutOfMemoryError: Java heap space
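
For context, a count over the same pattern as the driving query shows how many nodes are still waiting to be annotated (the `remaining` alias is just for readability):

    MATCH (n:FreeTextResponse)
    WHERE NOT (n)-[:HAS_ANNOTATED_TEXT]->()
    RETURN count(n) AS remaining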

I'm sure this is just a settings change somewhere, but I'm not sure what or where. Sorry if this is a bit of a newbie question!

Any ideas please?

Thank you!

3 REPLIES

Have a look at this page; it'll tell you what you need to fix this:

https://neo4j.com/docs/operations-manual/current/performance/memory-configuration/

Essentially, the heap is the amount of working memory you allocate to Neo4j. If you're running inside Neo4j Desktop, you probably have a fairly low amount of memory allocated, and the error just means you're trying to do something too big to fit into that memory. So, in general, you should increase the size of the heap you give to Neo4j. You can do that with the configuration settings described on the page above.
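
If you want to double-check what the running instance is actually using, dbms.listConfig accepts a filter string, so something along these lines should show the current heap settings:

    CALL dbms.listConfig('dbms.memory.heap')
    YIELD name, value
    RETURN name, value

If the values still show the defaults after you edit neo4j.conf, remember that the database has to be restarted for heap changes to take effect.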

The default configuration is 512MB, which is not enough for the models used by Stanford NLP. As suggested above, change your neo4j.conf file in the following way:

dbms.memory.heap.initial_size=3000m
dbms.memory.heap.max_size=5000m

Although, considering your RAM availability, I would suggest 5GB for both values.
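
In neo4j.conf that would be:

    dbms.memory.heap.initial_size=5000m
    dbms.memory.heap.max_size=5000m

Keeping the initial and max sizes equal is generally recommended anyway, as it stops the JVM from resizing the heap at runtime.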

doug
Node Clone

Thank you both very much. Strangely, I had tried this before, but the settings had reverted to the defaults for some reason, so I assumed it was the wrong approach. However, I upped dbms.memory.heap.max_size to 5GB this morning and it seems to have worked! I'm now going to run the enrichment and see if that also works. Thanks very much!
