After adding the Spark content root path for Python, I get this error

Question · 0 votes · 1 answer
C:\Users\nande\PycharmProjects\Big_Data_Project_USA_Pharma\venv\Scripts\python.exe C:\Users\nande\PycharmProjects\Big_Data_Project_USA_Pharma\driver.py 
Traceback (most recent call last):
  File "C:\Users\nande\PycharmProjects\Big_Data_Project_USA_Pharma\driver.py", line 2, in <module>
    from create_spark import get_spark_object
  File "C:\Users\nande\PycharmProjects\Big_Data_Project_USA_Pharma\create_spark.py", line 1, in <module>
    from pyspark.sql import SparkSession
  File "C:\spark\python\pyspark\__init__.py", line 51, in <module>
    from pyspark.context import SparkContext
  File "C:\spark\python\pyspark\context.py", line 31, in <module>
    from pyspark import accumulators
  File "C:\spark\python\pyspark\accumulators.py", line 97, in <module>
    from pyspark.serializers import read_int, PickleSerializer
  File "C:\spark\python\pyspark\serializers.py", line 71, in <module>
    from pyspark import cloudpickle
  File "C:\spark\python\pyspark\cloudpickle.py", line 145, in <module>
    _cell_set_template_code = _make_cell_set_template_code()
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\spark\python\pyspark\cloudpickle.py", line 126, in _make_cell_set_template_code
    return types.CodeType(
           ^^^^^^^^^^^^^^^
TypeError: code expected at least 16 arguments, got 15

Process finished with exit code 1

Java version: 1.8.0_251; Python version: 3.11.5; PySpark in PyCharm: 3.4.1

While setting up the project, I added C:/spark/python as a content root. After adding this content root, the error above started appearing. How can this be resolved?
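One way to diagnose this kind of problem is to check which copy of a package the interpreter would actually import, since a content root like C:/spark/python can shadow the pip-installed pyspark in the venv. This is a general diagnostic sketch, not code from the original project; `json` is used only as a stand-in module that is always available:

```python
import importlib.util


def module_origin(name: str):
    """Return the file a module would be loaded from, or None if not found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None


# In the failing project, module_origin("pyspark") would reveal whether the
# import resolves to C:\spark\python (the Spark-bundled copy) or to the
# pip-installed package inside the venv's site-packages.
print(module_origin("json"))  # stdlib stand-in; always resolvable
```

If the printed path points at the Spark installation directory rather than the venv, the bundled Python sources are taking precedence over the installed package.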

python apache-spark pyspark pycharm
1 Answer

0 votes

The problem was resolved after downgrading to Spark 3.3.1.
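For context on why the mismatch surfaces this way: the traceback shows the Spark-bundled cloudpickle.py constructing `types.CodeType` with an argument list from an older CPython, while newer interpreters expect additional positional arguments, hence the `TypeError` about argument counts. An illustrative guard (not part of Spark or the original project) that fails fast with a clearer message would look like this:

```python
import sys


def require_python_at_most(major: int, minor: int) -> None:
    """Raise a readable error when the interpreter is too new.

    Illustrative only: older Spark builds bundle a cloudpickle that calls
    types.CodeType with a pre-3.8 argument list, so importing them under a
    newer Python fails with the opaque TypeError seen in the traceback.
    """
    if sys.version_info[:2] > (major, minor):
        raise RuntimeError(
            f"Python {sys.version_info.major}.{sys.version_info.minor} is "
            f"newer than {major}.{minor}; this Spark build's bundled "
            "cloudpickle cannot run on it. Use an older Python or a newer Spark."
        )
```

The same reasoning explains both fixes that work in practice: downgrading Spark so its Python sources match the interpreter, or removing the C:/spark/python content root so the venv's pip-installed pyspark is used instead.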
