
FastRP is giving different embeddings for the same graph

ed1

We have a GDS graph built from our key nodes and relationships that we want to use for a deep learning algorithm. We found that using a FastRP graph embedding as a feature vector produced predictive power.

However, after running it again, we realised that the Neo4j FastRP implementation was giving us different embedding values for the same graph, without our having altered anything. This lack of consistency is worrying. Is there any way to ensure that we get the same embeddings when we run it on the same data twice?
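
For context, a minimal sketch of the kind of call that shows the behaviour; the graph name 'myGraph' and the embedding dimension are placeholders rather than our actual configuration. Running this twice returns different vectors for the same nodes:

// Stream FastRP embeddings for each node in a projected graph
// (graph name and dimension are placeholders).
CALL gds.fastRP.stream('myGraph', { embeddingDimension: 128 })
YIELD nodeId, embedding
RETURN nodeId, embedding
LIMIT 5;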

1 REPLY

FastRP has some stochastic elements in its internal calculations (such as the calculation of the similarity matrix). What you can do right now is write your embeddings back to your database using the .write mode, so that downstream steps all reuse the same stored values. In our next release (1.6) we have added a randomSeed parameter that will let you generate the same results across multiple runs. The pre-release is here, and the docs are here.
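
A rough sketch of both options, assuming a projected graph named 'myGraph'; the property name, dimension, and seed value are placeholders, not anything specific to this thread:

// Option 1 (available now): persist one run's embeddings with write mode,
// so every downstream step reads the same stored vectors.
CALL gds.fastRP.write('myGraph', {
  embeddingDimension: 128,
  writeProperty: 'fastrp_embedding'
})
YIELD nodePropertiesWritten;

// Option 2 (GDS 1.6 and later): fix the randomSeed so repeated runs
// produce identical embeddings for the same graph.
CALL gds.fastRP.stream('myGraph', {
  embeddingDimension: 128,
  randomSeed: 42
})
YIELD nodeId, embedding
RETURN nodeId, embedding;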