java.lang.IllegalStateException: The id field must be empty or null when id strategy is 'PRIMARY_KEY' for vertex label 'software'
at shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:544)
at org.apache.hugegraph.util.E.checkState(E.java:64)
at org.apache.hugegraph.loader.builder.VertexBuilder.checkIdField(VertexBuilder.java:98)
at org.apache.hugegraph.loader.builder.VertexBuilder.<init>(VertexBuilder.java:46)
at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.initPartition(HugeGraphSparkLoader.java:201)
at org.apache.hugegraph.loader.spark.HugeGraphSparkLoader.lambda$null$18e75a97$1(HugeGraphSparkLoader.java:155)
at org.apache.spark.sql.Dataset.$anonfun$foreachPartition$2(Dataset.scala:2923)
at org.apache.spark.sql.Dataset.$anonfun$foreachPartition$2$adapted(Dataset.scala:2923)
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:1020)
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:1020)
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2254)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1491)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
23/08/03 23:36:07 ERROR Executor: Exception in task 0.0 in stage 3.0 (TID 3)
java.lang.IllegalStateException: The id field must be empty or null when id strategy is 'PRIMARY_KEY' for vertex label 'person'
	at shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:544)
	at org.apache.hugegraph.util.E.checkState(E.java:64)
	at org.apache.hugegraph.loader.builder.VertexBuilder.checkIdField(VertexBuilder.java:98)
	at org.apache.hugegraph.loader.builder.VertexBuilder.<init>(VertexBuilder.java:46)
	... (remaining frames identical to the 'software' stack trace above)
Bug Type (问题类型)
exception / error (异常报错)
The current Spark loader example does not run: both vertex labels fail with the IllegalStateException shown above.
Before submit
Environment (环境信息)
Expected & Actual behavior (期望与实际表现)
Vertex/Edge example (问题点 / 边数据举例)
No response
Schema [VertexLabel, EdgeLabel, IndexLabel] (元数据结构)
from this file: https://github.com/apache/incubator-hugegraph-toolchain/blob/master/hugegraph-loader/assembly/static/example/spark/schema.groovy
Executed via the HugeGraph client.
InputSource from this file: https://github.com/apache/incubator-hugegraph-toolchain/blob/master/hugegraph-loader/assembly/static/example/spark/struct.json
The backendStoreInfo section was removed so that the Docker RocksDB backend is used.
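From the stack trace, VertexBuilder.checkIdField rejects any mapping that sets an explicit id field for a vertex label whose schema uses the PRIMARY_KEY id strategy. A minimal sketch of what a compatible vertex mapping in struct.json could look like, assuming the bundled example's structure (the file path, header, and field names here are illustrative, not taken from the actual file):

```json
{
  "vertices": [
    {
      "label": "person",
      "input": {
        "type": "file",
        "path": "example/spark/vertex_person.csv",
        "format": "CSV",
        "header": ["name", "age", "city"]
      },
      "field_mapping": { "name": "name", "age": "age", "city": "city" }
    }
  ]
}
```

Note there is no `"id"` key in the mapping: for PRIMARY_KEY labels the id is derived from the primary-key properties, and an explicit `"id"` entry is what triggers the checkIdField failure above.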