PyCharm overwrites PYTHONPATH when a Docker container is used as the interpreter

Question

I have a Docker image containing various bits and pieces, including Spark. This is my Dockerfile:

FROM docker-dev.artifactory.company.com/centos:7.3.1611

# set proxy
ENV http_proxy http://proxyaddr.co.uk:8080
ENV HTTPS_PROXY http://proxyaddr.co.uk:8080
ENV https_proxy http://proxyaddr.co.uk:8080

RUN yum install -y epel-release
RUN yum install -y gcc
RUN yum install -y krb5-devel
RUN yum install -y python-devel
RUN yum install -y krb5-workstation
RUN yum install -y python-setuptools
RUN yum install -y python-pip
RUN yum install -y xmlstarlet
RUN yum install -y wget java-1.8.0-openjdk
RUN pip install kerberos
RUN pip install numpy
RUN pip install pandas
RUN pip install coverage
RUN pip install tensorflow
RUN wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.0-bin-hadoop2.6.tgz
RUN tar -xvzf spark-1.6.0-bin-hadoop2.6.tgz -C /opt
RUN ln -s spark-1.6.0-bin-hadoop2.6 /opt/spark


ENV VERSION_NUMBER $(cat VERSION)
ENV JAVA_HOME /etc/alternatives/jre/
ENV SPARK_HOME /opt/spark
ENV PYTHONPATH $SPARK_HOME/python/:$PYTHONPATH
ENV PYTHONPATH $SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH

I can build and run that Docker image, connect to it, and successfully import the pyspark library:

$ docker run -d -it sse_spark_build:1.0
09e8aac622d7500e147a6e6db69f806fe093b0399b98605c5da2ff5e0feca07c
$ docker exec -it 09e8aac622d7 python
Python 2.7.5 (default, Nov  6 2016, 00:28:07)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from pyspark import SparkContext
>>> import os
>>> os.environ['PYTHONPATH']
'/opt/spark/python/lib/py4j-0.9-src.zip:/opt/spark/python/:'
>>>

Note the value of PYTHONPATH!

The problem is that if I use the same Docker image as an interpreter, the behaviour inside PyCharm is different. Here is how I set up the interpreter:

[image: python interpreter setup]

If I then run a Python console in PyCharm, the following happens:

bec0b9189066:python /opt/.pycharm_helpers/pydev/pydevconsole.py 0 0
PyDev console: starting.
import sys; print('Python %s on %s' % (sys.version, sys.platform))
sys.path.extend(['/home/cengadmin/git/dhgitlab/sse/engine/fs/programs/pyspark', '/home/cengadmin/git/dhgitlab/sse/engine/fs/programs/pyspark'])
Python 2.7.5 (default, Nov  6 2016, 00:28:07) 
[GCC 4.8.5 20150623 (Red Hat 4.8.5-11)] on linux2
import os
os.environ['PYTHONPATH']
'/opt/.pycharm_helpers/pydev'

As you can see, PyCharm has changed PYTHONPATH, which means I can no longer use the pyspark library I want:

from pyspark import SparkContext
Traceback (most recent call last):
  File "<input>", line 1, in <module>
ImportError: No module named pyspark

OK, I can change the path from within the console to make it work:

import sys
sys.path.append('/opt/spark/python/')
sys.path.append('/opt/spark/python/lib/py4j-0.9-src.zip')

But having to do this every time I open a console is tedious. I can't believe there is no way to tell PyCharm to append to PYTHONPATH rather than overwrite it, but if there is I can't find it. Can anyone offer any advice? How can I use a Docker image as a remote interpreter in PyCharm and keep the value of PYTHONPATH?

python docker pycharm
1 Answer

You can set this up in Preferences, as shown in the image below: [image: Setting the environment setup]

You can either set environment variables there or update the "Starting script" section. Whichever suits you better, both will do the job; a sketch of the starting-script approach follows.
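For example, a minimal "Starting script" sketch, assuming the Spark layout from the question's Dockerfile (adjust the paths if your image differs):

# Hypothetical "Starting script" for Settings > Build, Execution, Deployment >
# Console > Python Console; it runs each time a new console is opened.
# The Spark paths below are taken from the question's Dockerfile.
import os
import sys

spark_home = os.environ.get('SPARK_HOME', '/opt/spark')
sys.path.append(os.path.join(spark_home, 'python'))
sys.path.append(os.path.join(spark_home, 'python', 'lib', 'py4j-0.9-src.zip'))

Alternatively, the same two paths can be supplied as a PYTHONPATH environment variable in the interpreter configuration, e.g. /opt/spark/python:/opt/spark/python/lib/py4j-0.9-src.zip.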

If you need further help, read the following article: https://www.jetbrains.com/help/pycharm/python-console.html
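With either approach in place, a quick check in a freshly opened console should show the Spark entries and allow the import from the question (a sketch, assuming the paths above):

# Quick verification in a newly opened PyCharm Python Console.
import sys
print([p for p in sys.path if 'spark' in p])   # expect the two Spark entries
from pyspark import SparkContext               # should now import cleanly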
