Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext

Question (votes: 2, answers: 1)

Hi everyone. It looks like the class StreamingContext cannot be found when running the code below.

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.{SparkConf, SparkContext}
object Exemple {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("Exemple")
    val sc = new SparkContext(conf)
    val ssc = new StreamingContext(sc, Seconds(2)) // this line throws the error

  }
}

Here is the error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext
    at Exemple$.main(Exemple.scala:16)
    at Exemple.main(Exemple.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.StreamingContext
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 2 more

Process finished with exit code 1

I am using the following build.sbt file:

name := "exemple"

version := "1.0.0"

scalaVersion := "2.11.11"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"

I run the Exemple class with the IntelliJ Run button and get the error; in the sbt shell it works fine. Looking at my module's dependencies, I can find the Spark dependencies. The code compiles in IntelliJ, and I can see the Spark dependencies under External Libraries in the left-hand Project panel. Any idea what is going on? It does not look like it should be complicated.
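(As a side note, one quick way to confirm that StreamingContext is missing from the runtime classpath, rather than failing to compile, is a small reflective check like the sketch below. The ClasspathCheck object is only an illustrative name, not part of the original project.)

import scala.util.{Failure, Success, Try}

object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    // Try to load the class reflectively: if spark-streaming is absent from the
    // runtime classpath this fails with ClassNotFoundException, matching the
    // "Caused by" line in the stack trace above.
    Try(Class.forName("org.apache.spark.streaming.StreamingContext")) match {
      case Success(_) => println("StreamingContext is on the runtime classpath")
      case Failure(e) => println(s"StreamingContext could not be loaded: $e")
    }
  }
}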


scala apache-spark intellij-idea sbt spark-streaming
1 Answer
5 votes

Please remove the "provided" scope from the spark-streaming dependency:

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" 

If you still have dependency problems after this change, exclude the duplicate jars:

 "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0" excludeAll(
      ExclusionRule(organization = "org.spark-project.spark", name = "unused"),
      ExclusionRule(organization = "org.apache.spark", name = "spark-streaming"),
      ExclusionRule(organization = "org.apache.hadoop")
    ),
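
Putting both suggestions together, the dependency section of build.sbt would look roughly like the sketch below (same library versions as in the question; treat it as a starting point rather than the definitive configuration):

name := "exemple"

version := "1.0.0"

scalaVersion := "2.11.11"

libraryDependencies ++= Seq(
  // spark-sql as in the original build.sbt
  "org.apache.spark" %% "spark-sql" % "2.2.0",
  // spark-streaming without the "provided" scope, so it is on the run classpath
  "org.apache.spark" %% "spark-streaming" % "2.2.0",
  // Kafka connector with the duplicate/unused jars excluded, as suggested above
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0" excludeAll(
    ExclusionRule(organization = "org.spark-project.spark", name = "unused"),
    ExclusionRule(organization = "org.apache.spark", name = "spark-streaming"),
    ExclusionRule(organization = "org.apache.hadoop")
  )
)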

Hope this helps.

Thanks, Ravi
