Problem description:
We are trying to start Apache Zeppelin 0.11.1 on the server described below, following the installation instructions on the official Apache Zeppelin website.
Zeppelin starts and notebooks can be opened, but executing a paragraph produces the error message attached below.
We first tried to set everything up with Docker, and then also tested the binary release.
Both variants fail with the same error.
System details:
Ubuntu 20.04.6 LTS (GNU/Linux 5.15.0-1063-aws x86_64)
Java: 1.8.0_412 (OpenJDK Runtime Environment, build 1.8.0_412-8u412-ga-1~20.04.1-b08)
Scala: 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_412)
Spark: 3.5.1 (spark-3.5.1-bin-hadoop3)
Zeppelin: 0.11.1, from https://zeppelin.apache.org/docs/latest/quickstart/install.html
| Repository | Tag | Image ID | Created | Size |
|---|---|---|---|---|
| apache/zeppelin | 0.11.1 | f6b9613cfb44 | 2 months ago | 8.44GB |
docker run -p 8080:8080 --rm \
  -v /projects/zeppelin/lib:/opt/zeppelin/.m2/repository \
  -v /projects/zeppelin/logs:/logs \
  -v /projects/zeppelin/notebook:/notebook \
  -v /projects/zeppelin/db:/db \
  -v /projects/zeppelin/lib/spark-current/spark-3.5.1-bin-hadoop3:/opt/spark \
  -e ZEPPELIN_LOG_DIR='/logs' \
  -e ZEPPELIN_NOTEBOOK_DIR='/notebook' \
  -e ZEPPELIN_JAVA_OPTS="-Dspark.executor.memory=16g -Dspark.cores.max=8 -Dspark.io.compression.codec=snappy" \
  -e ZEPPELIN_INTP_MEM="-Xmx16g" \
  -e SPARK_HOME=/opt/spark \
  --name zeppelin apache/zeppelin
Stack trace:
org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:861)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:769)
at org.apache.zeppelin.scheduler.Job.run(Job.java:186)
at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:135)
at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:140)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
... 8 more
Caused by: scala.reflect.internal.FatalError: Error accessing /projects/zeppelin-v2/zeppelin-0.11.1-bin-all/interpreter/spark/._spark-interpreter-0.11.1.jar
at scala.tools.nsc.classpath.AggregateClassPath.$anonfun$list$3(AggregateClassPath.scala:113)
at scala.collection.Iterator.foreach(Iterator.scala:943)
at scala.collection.Iterator.foreach$(Iterator.scala:943)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
at scala.collection.IterableLike.foreach(IterableLike.scala:74)
at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
at scala.tools.nsc.classpath.AggregateClassPath.list(AggregateClassPath.scala:101)
at scala.tools.nsc.util.ClassPath.list(ClassPath.scala:36)
at scala.tools.nsc.util.ClassPath.list$(ClassPath.scala:36)
at scala.tools.nsc.classpath.AggregateClassPath.list(AggregateClassPath.scala:30)
at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader.doComplete(SymbolLoaders.scala:298)
at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:250)
at scala.reflect.internal.Symbols$Symbol.completeInfo(Symbols.scala:1542)
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1514)
at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:258)
at scala.tools.nsc.Global.rootMirror$lzycompute(Global.scala:74)
at scala.tools.nsc.Global.rootMirror(Global.scala:72)
at scala.tools.nsc.Global.rootMirror(Global.scala:44)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:301)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:301)
at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1511)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1213)
at scala.tools.nsc.interpreter.IMain._initialize(IMain.scala:124)
at scala.tools.nsc.interpreter.IMain.initializeSynchronous(IMain.scala:146)
at org.apache.zeppelin.spark.SparkScala212Interpreter.createSparkILoop(SparkScala212Interpreter.scala:195)
at org.apache.zeppelin.spark.AbstractSparkScalaInterpreter.open(AbstractSparkScalaInterpreter.java:116)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:124)
... 9 more
Caused by: java.io.IOException: Error accessing /projects/zeppelin-v2/zeppelin-0.11.1-bin-all/interpreter/spark/._spark-interpreter-0.11.1.jar
at scala.reflect.io.FileZipArchive.scala$reflect$io$FileZipArchive$$openZipFile(ZipArchive.scala:190)
at scala.reflect.io.FileZipArchive.root$lzycompute(ZipArchive.scala:238)
at scala.reflect.io.FileZipArchive.root(ZipArchive.scala:235)
at scala.reflect.io.FileZipArchive.allDirs$lzycompute(ZipArchive.scala:272)
at scala.reflect.io.FileZipArchive.allDirs(ZipArchive.scala:272)
at scala.tools.nsc.classpath.ZipArchiveFileLookup.findDirEntry(ZipArchiveFileLookup.scala:76)
at scala.tools.nsc.classpath.ZipArchiveFileLookup.list(ZipArchiveFileLookup.scala:63)
at scala.tools.nsc.classpath.ZipArchiveFileLookup.list$(ZipArchiveFileLookup.scala:62)
at scala.tools.nsc.classpath.ZipAndJarClassPathFactory$ZipArchiveClassPath.list(ZipAndJarFileLookupFactory.scala:58)
at scala.tools.nsc.classpath.AggregateClassPath.$anonfun$list$3(AggregateClassPath.scala:105)
... 36 more
Caused by: java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:231)
at java.util.zip.ZipFile.<init>(ZipFile.java:157)
at java.util.zip.ZipFile.<init>(ZipFile.java:171)
at scala.reflect.io.FileZipArchive.scala$reflect$io$FileZipArchive$$openZipFile(ZipArchive.scala:187)
... 45 more
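One observation, for whatever it is worth: the failing path in the trace is `._spark-interpreter-0.11.1.jar`, not the real interpreter jar. Files named `._*` are AppleDouble metadata files that macOS leaves behind when copying onto non-HFS filesystems; they are not valid zip archives, which would explain the `ZipException` when the Scala compiler scans the classpath. A minimal sketch to locate (and, after checking the list, remove) such files — the directory variable is an assumption, to be pointed at the interpreter path from the stack trace:

```shell
# AppleDouble "._*" files are macOS metadata copies, not valid jars; the
# Scala classpath scanner fails with ZipException when it opens one as a zip.
# Set this to the interpreter directory from the stack trace, e.g.
# /projects/zeppelin-v2/zeppelin-0.11.1-bin-all/interpreter
ZEPPELIN_INTERP_DIR="${ZEPPELIN_INTERP_DIR:-.}"

# List the offending files first; delete only once the list looks right.
find "$ZEPPELIN_INTERP_DIR" -type f -name '._*'
# find "$ZEPPELIN_INTERP_DIR" -type f -name '._*' -delete
```

The same `find` should be run against both the binary install directory and any host directories bind-mounted into the Docker container, since the mounts would carry the stray files into the image as well.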