Apply a pretrained fastRP for new samples

ybm11

Hello,

I am new to Neo4j and have gone through the documentation on applying node classification to some data of mine, much like in the following tutorial.

In my case, there will be an ongoing stream of new samples, which will be represented as new nodes in the graph, and I'll need to classify them. It is therefore infeasible for me to train new FastRP embeddings for each new sample.

My question is whether it's possible to apply pre-trained FastRP weights to a new sample.

According to the FastRP documentation, this seems legitimate (under the assumption that new samples originate from the same distribution, which holds in my case), but I couldn't find an example of how to do this, either in the Neo4j documentation or in external articles. Below is the relevant citation from the FastRP documentation:

In order for a machine learning model to be able to make useful predictions, it is important that features produced during prediction are of a similar distribution to the features produced during training of the model. Moreover, node property steps (whether FastRP or not) added to a pipeline are executed both during training, and during the prediction by the trained model. It is therefore problematic when a pipeline contains an embedding step which yields all too dissimilar embeddings during training and prediction.
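For reference, this is roughly the setup I currently have after following the tutorial (the graph, pipeline, model, label and property names below are just placeholders, and I'm assuming GDS 2.x syntax). FastRP is added as a node property step of the classification pipeline, with a fixed randomSeed so the random projections are at least reproducible:

// Project the training graph (placeholder label, relationship type and target property)
CALL gds.graph.project('myGraph', { Sample: { properties: 'class' } }, 'RELATED_TO');

// Create the node classification pipeline
CALL gds.beta.pipeline.nodeClassification.create('pipe');

// Add FastRP as a node property step; fixing randomSeed keeps the
// random projections reproducible between runs
CALL gds.beta.pipeline.nodeClassification.addNodeProperty('pipe', 'fastRP', {
  mutateProperty: 'embedding',
  embeddingDimension: 128,
  randomSeed: 42
});

CALL gds.beta.pipeline.nodeClassification.selectFeatures('pipe', ['embedding']);
CALL gds.beta.pipeline.nodeClassification.addLogisticRegression('pipe');

// Train the model; 'class' is the placeholder target property
CALL gds.beta.pipeline.nodeClassification.train('myGraph', {
  pipeline: 'pipe',
  modelName: 'nc-model',
  targetProperty: 'class',
  metrics: ['ACCURACY'],
  randomSeed: 42
});

My reading of the cited paragraph is that this same FastRP step is re-executed whenever the trained model predicts, which is exactly the computation I was hoping to avoid repeating from scratch for every new sample.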

1 REPLY

Hello,

Is there any feedback regarding this matter? I plan to use FastRP in production, yet it seems rather infeasible to have to retrain the FastRP embeddings from scratch for every new sample.
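To make the concern concrete, this is roughly what prediction looks like on my side whenever new samples arrive (reusing the placeholder names from the original post, and again assuming GDS 2.x): the projection is rebuilt and the pipeline's FastRP step runs over the entire graph again before the trained model scores the nodes, rather than only embedding the new nodes with already-learned projections.

// Rebuild the in-memory graph so it includes the newly arrived sample nodes
CALL gds.graph.drop('myGraph', false);
CALL gds.graph.project('myGraph', 'Sample', 'RELATED_TO');

// Predict; the FastRP node property step is executed again over the
// full projection before the trained model classifies the nodes
CALL gds.beta.pipeline.nodeClassification.predict.stream('myGraph', {
  modelName: 'nc-model'
})
YIELD nodeId, predictedClass
RETURN gds.util.asNode(nodeId) AS sample, predictedClass;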
