Error when creating a DataFrame in Spark using Scala

Problem description

The log is below; please help me resolve this issue.

Exception in thread "main" java.lang.NoSuchMethodError: 'scala.collection.GenTraversable scala.collection.mutable.Buffer$.empty()'

at org.apache.spark.sql.SparkSessionExtensions.<init>(SparkSessionExtensions.scala:100)
at org.apache.spark.sql.SparkSession$Builder.<init>(SparkSession.scala:741)
at org.apache.spark.sql.SparkSession$.builder(SparkSession.scala:928)
at Dataframes.DataframeBasics$.delayedEndpoint$Dataframes$DataframeBasics$1(DataframeBasics.scala:13)
at Dataframes.DataframeBasics$delayedInit$body.apply(DataframeBasics.scala:5)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1(App.scala:73)
at scala.App.$anonfun$main$1$adapted(App.scala:73)
at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:553)
at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:551)
at scala.collection.AbstractIterable.foreach(Iterable.scala:921)
at scala.App.main(App.scala:73)
at scala.App.main$(App.scala:71)
at Dataframes.DataframeBasics$.main(DataframeBasics.scala:5)
at Dataframes.DataframeBasics.main(DataframeBasics.scala)
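
For reference, the failing call is SparkSession.builder on line 13 of DataframeBasics.scala. The original source is not shown in the question; the following is only a hypothetical reconstruction of that file to make the failure point concrete:

package Dataframes

import org.apache.spark.sql.SparkSession

// Hypothetical reconstruction -- only the object name, the App trait, and the
// SparkSession.builder call are implied by the stack trace above.
object DataframeBasics extends App {

  // The NoSuchMethodError is thrown here: building the session initializes
  // SparkSessionExtensions, which calls into an incompatible Scala library.
  val spark = SparkSession.builder()
    .appName("DataframeBasics")
    .master("local[*]")
    .getOrCreate()

  // Never reached when the builder fails.
  val df = spark.range(5).toDF("id")
  df.show()
}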
1 Answer

Use the same Scala version that your Spark jars were compiled with.

For example, if you are using Scala 2.11.0, you should use spark-core_2.11 2.4.2, and so on.
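
As an illustrative sketch (the version numbers are examples, not a prescription), an sbt build that keeps the Scala version and the Spark artifact suffix in sync looks like this:

// build.sbt -- illustrative versions only; the key point is that scalaVersion
// and the _2.11 suffix of the Spark artifacts must agree.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix automatically (here spark-core_2.11),
  // so the Scala version and the Spark jars cannot drift apart.
  "org.apache.spark" %% "spark-core" % "2.4.2",
  "org.apache.spark" %% "spark-sql"  % "2.4.2"
)

Using %% rather than hard-coding spark-core_2.11 avoids exactly the kind of mismatch that produces the NoSuchMethodError above.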

Also check the version compatibility of your Spark, Hadoop, and AWS jars; this can be a bit tricky. A combination that works together:

  • Spark 2.4.x
  • Hadoop 2.6.5
  • AWS Java SDK (any version)
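
If you are not sure which versions actually end up on the classpath, a quick sanity check (a sketch, not part of the original answer) is to print the Scala and Spark versions at runtime and compare them with the artifact suffixes of your jars:

// Prints the runtime Scala version and the Spark version on the classpath.
// The Scala binary version (2.11 / 2.12 / 2.13) must match the suffix of the
// Spark artifacts, e.g. spark-core_2.11.
object VersionCheck extends App {
  println("Scala version: " + scala.util.Properties.versionNumberString)
  println("Spark version: " + org.apache.spark.SPARK_VERSION)
}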