
Neo4j Tried to read a field larger than buffer size 2097152

skmn_88
Node Link

neo4j Tried to read a field larger than buffer size 2097152. A common cause of this is that a field has an unterminated quote and so will try to seek until the next quote, which ever line it may be on. This should not happen if multi-line fields are disabled, given that the fields contains no new-line characters

4 REPLIES

Hi @skmn.88 ,

Could you provide some more context about what you are trying to do?

Best,
ABK

skmn_88
Node Link

I am trying to import data from a CSV file and I'm facing the above error.

I increased dbms.import.csv.buffer_size to 209715200, but the error still keeps coming.
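For reference, that setting lives in neo4j.conf and only takes effect after a restart (the value below is just the one I tried, as mentioned above):

# neo4j.conf
dbms.import.csv.buffer_size=209715200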

Hi,

I got the same error as the following when I tried to import a CSV as nodes through py2neo.

py2neo.errors.DatabaseError: [Statement.ExecutionFailed] Tried to read a field larger than buffer size 2097152. A common cause of this is that a field has an unterminated quote and so will try to seek until the next quote, which ever line it may be on. This should not happen if multi-line fields are disabled, given that the fields contains no new-line characters. 

The Cypher that I used is:

USING PERIODIC COMMIT
LOAD CSV WITH HEADERS FROM 'file:///file.csv' AS row
CREATE (:labelName {id: toInteger(row.id), ...});
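In case it helps, this is roughly how I run that statement through py2neo (the URI, credentials and label are placeholders, not my real setup, and I've left out the other columns):

from py2neo import Graph

graph = Graph("bolt://localhost:7687", auth=("neo4j", "password"))

query = """
USING PERIODIC COMMIT
LOAD CSV WITH HEADERS FROM 'file:///file.csv' AS row
CREATE (:labelName {id: toInteger(row.id)})  // other columns omitted
"""

graph.run(query)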

Regards,
yhchen

skmn_88
Node Link

The CSV file was too huge, and that is why I was getting the above error. I split the CSV file into 3 and the import was successful without any issues... Hope that will work for you as well.
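If it helps, this is the kind of split I mean — a small Python sketch that cuts a CSV into 3 parts and repeats the header row in each part, so every piece can be loaded with its own LOAD CSV (the file name and part count are just examples, not exactly what I used):

import csv

INPUT = "file.csv"   # example name, use your own file
PARTS = 3

with open(INPUT, newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)      # keep the header so each part stays loadable on its own
    rows = list(reader)        # fine as a sketch; very large files may need streaming instead

chunk = (len(rows) + PARTS - 1) // PARTS
for i in range(PARTS):
    with open(f"part{i + 1}.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows[i * chunk:(i + 1) * chunk])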
