

Not able to insert Neo4j Map data type using the neo4j-spark connector

Spark 3.0.0
Scala 2.12
Neo4j 4.2.4

We have enabled APOC procedures in Neo4j and get the error below when saving data with the Neo4j Spark connector. We have listed the schema and a data example at the end of the stack trace.

org.neo4j.driver.exceptions.ClientException: Unable to convert org.apache.spark.unsafe.types.UTF8String to Neo4j Value.
at org.neo4j.driver.Values.value(Values.java:134)
at org.neo4j.driver.Values.value(Values.java:297)
at org.neo4j.driver.Values.value(Values.java:119)
at org.neo4j.spark.util.Neo4jUtil$.convertFromSpark(Neo4jUtil.scala:213)
at org.neo4j.spark.service.Neo4jWriteMappingStrategy.$anonfun$query$1(MappingService.scala:123)
at org.neo4j.spark.service.Neo4jWriteMappingStrategy.$anonfun$query$1$adapted(MappingService.scala:121)
at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
at scala.collection.immutable.Range.foreach(Range.scala:158)
at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
at org.neo4j.spark.service.Neo4jWriteMappingStrategy.query(MappingService.scala:121)
at org.neo4j.spark.service.Neo4jWriteMappingStrategy.node(MappingService.scala:30)
at org.neo4j.spark.service.Neo4jWriteMappingStrategy.node(MappingService.scala:18)
at org.neo4j.spark.service.MappingService.convert(MappingService.scala:230)
at org.neo4j.spark.writer.BaseDataWriter.write(BaseDataWriter.scala:37)
at org.neo4j.spark.writer.Neo4jDataWriter.write(Neo4jDataWriter.scala:9)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$7(WriteToDataSourceV2Exec.scala:441)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1411)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:477)
at org.apache.spark.sql.execution.datasources.v2.V2TableWriteExec.$anonfun$writeWithV2$2(WriteToDataSourceV2Exec.scala:385)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Wor
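
For reference, a minimal sketch of the kind of write that hits this code path, assuming a DataFrame with a MapType column (all names, the label, and the connection details are placeholders, not taken from the original post). The `to_json` step is only a possible workaround, not a confirmed fix: serializing the map to a JSON string means the connector only has to convert plain strings, sidestepping the UTF8String-inside-a-map conversion.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.to_json

// Hypothetical reproduction: credentials, label, and column names are placeholders.
val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// A DataFrame with a MapType column, similar to what fails above.
val df = Seq(
  ("n1", Map("key" -> "value"))
).toDF("id", "props")

// Possible workaround (an assumption): flatten the map to a JSON string
// before handing the rows to the connector.
df.withColumn("props", to_json($"props"))
  .write
  .format("org.neo4j.spark.DataSource")
  .mode("append")
  .option("url", "bolt://localhost:7687")
  .option("labels", ":Item")
  .save()
```

The option names (`url`, `labels`) follow the Neo4j Spark connector 4.x documentation; the JSON string can be expanded back into node properties on the Neo4j side if needed.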


Can someone please help us with this?