01-21-2022 02:14 AM
Hi,
I am training a GraphSAGE model to produce 64-dimensional embeddings, as shown below:
CALL gds.beta.graphSage.train(
  'products',
  {
    modelName: 'productsGraphSage',
    nodeLabels: ['Group', 'Group1'],
    relationshipTypes: ['HAS_GROUP1'],
    featureProperties: ['group1_count', 'length'],
    embeddingDimension: 64,
    projectedFeatureDimension: 2,
    epochs: 10
  }
)
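For context, the 'products' in-memory graph referenced above would have been projected beforehand. A minimal sketch of such a projection using the GDS 1.x procedure (the label/property layout here simply mirrors the train call; the actual projection used is an assumption):

```cypher
// Hypothetical projection matching the train call (GDS 1.x uses gds.graph.create)
CALL gds.graph.create(
  'products',
  {
    Group:  { properties: ['group1_count', 'length'] },
    Group1: { properties: ['group1_count', 'length'] }
  },
  ['HAS_GROUP1']
)
```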
The model converged, but the generated embeddings are all very close to each other; they always fall in a range like
[0.0125671e-16, 0.0125672e-16, .......]
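Embeddings like the ones above would typically be produced by streaming from the trained model; a minimal sketch, assuming the graph and model names from the train call:

```cypher
// Stream 64-dim embeddings from the trained GraphSAGE model (GDS 1.8 procedure name)
CALL gds.beta.graphSage.stream(
  'products',
  { modelName: 'productsGraphSage' }
)
YIELD nodeId, embedding
RETURN nodeId, embedding
LIMIT 5
```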
I am doing binary node classification using the embeddings as input, and the classifier only achieves accuracy near 50%.
However, if I generate embeddings with the FastRP algorithm instead, the embeddings are quite good: they vary from -1 to +1, and my node classification model reaches 95% accuracy.
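For comparison, the FastRP run that produced the well-spread embeddings would look roughly like this (a sketch at the same dimensionality; the exact parameters used are an assumption):

```cypher
// FastRP on the same projected graph, same embedding dimension (parameters are assumptions)
CALL gds.fastRP.stream(
  'products',
  {
    embeddingDimension: 64,
    relationshipTypes: ['HAS_GROUP1']
  }
)
YIELD nodeId, embedding
RETURN nodeId, embedding
LIMIT 5
```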
It seems something is going wrong in the GraphSAGE run. Can someone help me identify the actual cause of this issue?
DB Version: 4.4.3
GDS Version: 1.8.2
Neo4j Desktop: 1.4.12
OS: Windows 11
03-10-2022 09:19 PM
Currently, GraphSAGE is not that fast or accurate. Maybe Neo4j will improve it in the coming years.