Symbolic reference class is not accessible: class sun.util.calendar.ZoneInfo, from interface org.apache.spark.sql.catalyst.util.SparkDateTimeUtils


I am trying to write a Spark (v4.0.0-preview1) DataFrame to a database table (SQL Server) using the JDBC driver.
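For reference, the failing write is essentially of the following shape (a minimal sketch rather than the original code; the connection URL, table name, credentials, and input path are placeholders):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class JdbcWriteRepro {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("jdbc-write-repro")
                .getOrCreate();

        // Any DataFrame with a DATE column exercises the Date conversion
        // path (DateTimeUtils.toJavaDate) that appears in the stack trace.
        Dataset<Row> df = spark.read().parquet("/path/to/input"); // placeholder input

        df.write()
                .format("jdbc")
                .option("url", "jdbc:sqlserver://host:1433;databaseName=mydb") // placeholder
                .option("dbtable", "dbo.target_table")                         // placeholder
                .option("user", "username")                                    // placeholder
                .option("password", "password")                                // placeholder
                .mode("append")
                .save();

        spark.stop();
    }
}

The save() call fails with the following error: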

java.lang.IllegalAccessException: symbolic reference class is not accessible: class sun.util.calendar.ZoneInfo, from interface org.apache.spark.sql.catalyst.util.SparkDateTimeUtils (unnamed module @7bbc8656)
    at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:955) ~[?:?]
    at java.base/java.lang.invoke.MethodHandles$Lookup.checkSymbolicClass(MethodHandles.java:3686) ~[?:?]
    at java.base/java.lang.invoke.MethodHandles$Lookup.resolveOrFail(MethodHandles.java:3646) ~[?:?]
    at java.base/java.lang.invoke.MethodHandles$Lookup.findVirtual(MethodHandles.java:2680) ~[?:?]
    at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle(SparkDateTimeUtils.scala:206) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle$(SparkDateTimeUtils.scala:201) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.util.DateTimeUtils$.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle$lzycompute(DateTimeUtils.scala:41) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.util.DateTimeUtils$.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle(DateTimeUtils.scala:41) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.toJavaDate(SparkDateTimeUtils.scala:228) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.toJavaDate$(SparkDateTimeUtils.scala:223) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.util.DateTimeUtils$.toJavaDate(DateTimeUtils.scala:41) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.util.DateTimeUtils.toJavaDate(DateTimeUtils.scala) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection.createExternalRow_0_2$(Unknown Source) ~[?:?]
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection.apply(Unknown Source) ~[?:?]
    at scala.collection.Iterator$$anon$9.next(Iterator.scala:584) ~[scala-library-2.13.14.jar:?]
    at scala.collection.Iterator$$anon$9.next(Iterator.scala:584) ~[scala-library-2.13.14.jar:?]
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:806) ~[spark-sql_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$saveTable$1(JdbcUtils.scala:978) ~[spark-sql_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$saveTable$1$adapted(JdbcUtils.scala:977) ~[spark-sql_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:1042) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:1042) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2501) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:171) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.scheduler.Task.run(Task.scala:146) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$5(Executor.scala:640) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:99) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:643) [spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.base/java.lang.Thread.run(Thread.java:840) [?:?]

Can anyone suggest a solution to the above problem?

java apache-spark
1 Answer

Follow the instructions here: https://stackoverflow.com/a/78300174/1028537

In short, to run on JDK 17 you need to pass the JVM options that ship with Spark itself (they are not well documented): JavaModuleOptions

You can also find them in the build file: pom.xml test args

Spark Connect ships with them as well: spark-connect-scala-client:

-XX:+IgnoreUnrecognizedVMOptions \
  --add-opens=java.base/java.lang=ALL-UNNAMED \
  --add-opens=java.base/java.lang.invoke=ALL-UNNAMED \
  --add-opens=java.base/java.lang.reflect=ALL-UNNAMED \
  --add-opens=java.base/java.io=ALL-UNNAMED \
  --add-opens=java.base/java.net=ALL-UNNAMED \
  --add-opens=java.base/java.nio=ALL-UNNAMED \
  --add-opens=java.base/java.util=ALL-UNNAMED \
  --add-opens=java.base/java.util.concurrent=ALL-UNNAMED \
  --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED \
  --add-opens=java.base/jdk.internal.ref=ALL-UNNAMED \
  --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
  --add-opens=java.base/sun.nio.cs=ALL-UNNAMED \
  --add-opens=java.base/sun.security.action=ALL-UNNAMED \
  --add-opens=java.base/sun.util.calendar=ALL-UNNAMED \
  --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED \
  -Djdk.reflect.useDirectMethodHandle=false \
  -Dio.netty.tryReflectionSetAccessible=true
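If you launch through spark-submit, one way to apply these options (a sketch; the application class and jar are placeholders, and the variable below abbreviates the full --add-opens list above) is to pass them to both the driver and the executors:

# Sketch: wire the module options above into spark-submit.
JAVA17_OPTS="--add-opens=java.base/java.lang=ALL-UNNAMED \
  --add-opens=java.base/sun.util.calendar=ALL-UNNAMED"   # ...plus the remaining flags above

spark-submit \
  --class com.example.MyApp \
  --driver-java-options "$JAVA17_OPTS" \
  --conf "spark.executor.extraJavaOptions=$JAVA17_OPTS" \
  my-app.jar

The driver options must be in place before its JVM starts, which is why they go on the launch command (or in spark-defaults.conf) rather than in SparkSession code.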