Cloud Dataproc cannot access a Cloud Storage bucket

Problem description

I have a Cloud Dataproc Spark job that also uses the Cloud Storage API on the driver side (to pick specific files from a folder for consumption).

Here are the Maven dependencies:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>2.4.4</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>com.google.cloud</groupId>
        <artifactId>google-cloud-storage</artifactId>
        <version>1.101.0</version>
    </dependency>
</dependencies>

Here is the simplest version of the code that fails:

import com.google.cloud.storage._

object Test {
  def main(args: Array[String]): Unit = {
    // Build a Storage client using Application Default Credentials
    val storage = StorageOptions.getDefaultInstance().getService()
    // Fails here with NoSuchMethodError (see the stack trace below)
    storage.list("intent_raw")
  }
}

Here is the stack trace:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
    at com.google.api.gax.retrying.BasicRetryingFuture.<init>(BasicRetryingFuture.java:84)
    at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:88)
    at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:74)
    at com.google.cloud.RetryHelper.run(RetryHelper.java:75)
    at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)
    at com.google.cloud.storage.StorageImpl.listBlobs(StorageImpl.java:372)
    at com.google.cloud.storage.StorageImpl.list(StorageImpl.java:328)
--> at ai.mandal.cloud.dataproc.Test$.main(Test.scala:14)
    at ai.mandal.cloud.dataproc.Test.main(Test.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

My question is: what generally causes this, and since I am running it via the Dataproc service (which has access to the bucket), do I need to configure separate credentials for this?

google-cloud-storage google-cloud-dataproc
1 Answer

The solution was to add

spark.executor.userClassPathFirst = true
spark.driver.userClassPathFirst = true

to the job properties.
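A minimal sketch of how these properties might be passed when submitting the job with the gcloud CLI; the cluster name, region, and jar path below are hypothetical placeholders (only the class name comes from the stack trace above):

    # Hypothetical cluster, region, and jar path; adjust to your environment
    gcloud dataproc jobs submit spark \
        --cluster=my-cluster \
        --region=us-central1 \
        --class=ai.mandal.cloud.dataproc.Test \
        --jars=gs://my-bucket/test-assembly.jar \
        --properties='spark.driver.userClassPathFirst=true,spark.executor.userClassPathFirst=true'

With these properties set, Spark resolves classes from the job's jars before the cluster's classpath, so the newer Guava bundled with google-cloud-storage wins over the older copy shipped with the Dataproc image.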

The problem is caused by a conflict between the Guava version required by google-cloud-storage and the Guava version found in the host environment.

Google recommends shading the conflicting Guava in your dependencies. I tried that as well, but it did not work in this case (a sketch of that shading setup follows).
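For reference, "shading" here typically means relocating Guava's packages with the maven-shade-plugin so that calls from the application jar cannot be resolved against the cluster's older Guava. This is a minimal sketch; the relocated prefix repackaged.com.google.common is an arbitrary choice, not from the original post:

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.1</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <relocations>
                                <!-- Rewrite Guava references in the shaded jar to a
                                     private package so the cluster's copy is ignored -->
                                <relocation>
                                    <pattern>com.google.common</pattern>
                                    <shadedPattern>repackaged.com.google.common</shadedPattern>
                                </relocation>
                            </relocations>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

Relocation rewrites the bytecode references in the shaded jar, so google-cloud-storage calls the relocated Guava rather than whatever version Dataproc puts on the classpath; per the answer above, though, this did not resolve the conflict in this particular case.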
