
Spark Connector Run Cypher statement

learner
Node Link

I am able to perform spark.read and save the result into a DataFrame. I am also able to write a DataFrame as nodes or relationships from Spark to Neo4j. I would like to build a pipeline where most of the creation and deletion work runs from Spark (without using Neo4j Browser). However, I cannot use:

spark.read....option("query",  "CREATE CONSTRAINT IF NOT EXISTS ON (a: APP) ASSERT (a.app_name) IS NODE KEY")

for example. Is there a way to run the above constraint creation through the Spark connector, without having to run it in Neo4j Browser?
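
For reference, the fuller form of what I'm attempting looks roughly like the sketch below (the connection URL and credentials are placeholders for this post):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("neo4j-constraint-attempt")
  .getOrCreate()

// Trying to push a schema statement through the read path -- this is the call that fails.
val df = spark.read
  .format("org.neo4j.spark.DataSource")
  .option("url", "bolt://localhost:7687")            // placeholder connection details
  .option("authentication.basic.username", "neo4j")  // placeholder credentials
  .option("authentication.basic.password", "password")
  .option("query",
    "CREATE CONSTRAINT IF NOT EXISTS ON (a:APP) ASSERT (a.app_name) IS NODE KEY")
  .load()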

1 ACCEPTED SOLUTION

conker84
Graph Voyager

Hi @learner, with the Spark connector you can create indexes/constraints only while you're ingesting data:

https://neo4j.com/docs/spark/current/writing/#_schema_optimization_operations

Trying to do it before a read is an error; the constraint should always be created upfront, as part of the ingestion, since that is what guarantees consistency of the data you write.
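
For example, something along these lines creates the constraint as part of the write (an untested sketch: the connection details, DataFrame and column names are placeholders, and the option names follow the schema-optimization section of the docs linked above, so double-check them against your connector version):

import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder()
  .appName("neo4j-ingest-with-constraints")
  .getOrCreate()
import spark.implicits._

// Hypothetical DataFrame of apps; the column name matches the node key we want to constrain.
val apps = Seq("app-1", "app-2").toDF("app_name")

apps.write
  .format("org.neo4j.spark.DataSource")
  .mode(SaveMode.Overwrite)                                // Overwrite mode requires node.keys
  .option("url", "bolt://localhost:7687")                  // placeholder connection details
  .option("authentication.basic.username", "neo4j")
  .option("authentication.basic.password", "password")
  .option("labels", ":APP")
  .option("node.keys", "app_name")
  .option("schema.optimization.type", "NODE_CONSTRAINTS")  // create the constraint during the ingestion
  .save()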

Cheers

Andrea
