
Neo4j Kafka Source Plugin: Re-publish everything without losing data

Hi. I am using Neo4j 4.2.3 together with the neo4j-streams 4.0.8 plugin to stream Neo4j CDC events to a Kafka topic. It works well so far.

Here is the question: it is quite possible that in some cases I lose everything that's in the topic (due to events unrelated to Kafka or Neo4j). In that case I would need to get all of the data from Neo4j back into the now-empty topic. Is it possible to do that without altering all the nodes/relationships?

Thank you!

1 REPLY

neo4j-streams has a stored procedure CALL streams.publish(topic, message). You can see the docs here:

You could combine this with APOC to re-publish everything in your database if you wished, something like:

CALL apoc.periodic.iterate(
  'MATCH (p:Person) RETURN p',
  'CALL streams.publish("my-topic", { name: p.name })',
  { parallel: false });
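You could re-publish relationships the same way. A sketch (the :KNOWS relationship type, the topic name, and the payload shape are placeholders for illustration, not anything from your schema):

CALL apoc.periodic.iterate(
  'MATCH (a:Person)-[r:KNOWS]->(b:Person) RETURN a, r, b',
  'CALL streams.publish("my-topic", { start: a.name, type: type(r), end: b.name })',
  { parallel: false });

The first statement streams batches of matches; the second publishes one message per row, so the whole graph can be pushed back out without writing to any node or relationship.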

This answers your question, but honestly, it is also worth configuring durability on the Kafka topic itself. If Kafka is configured correctly, this situation should not arise in the first place.
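For example, you could disable time- and size-based retention on the topic so records are kept indefinitely. A sketch using the standard kafka-configs tool (broker address and topic name are assumptions):

kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name my-topic \
  --alter --add-config retention.ms=-1,retention.bytes=-1

Combined with a sufficient replication factor, that protects the topic against ordinary broker failures, though it won't help if the topic itself is deleted.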
