Docker Hadoop: syntax error: bad substitution when starting datanodes / namenodes

Question description — votes: -1, answers: 1

I am building a Docker image for Hadoop, and my Dockerfile follows the steps below (a sketch of the assembled Dockerfile is given after the list):

  1. Take alpine:3.8 as the base image: FROM alpine:3.8
  2. Set all of the Hadoop ENV variables.
  3. wget https://archive.apache.org/dist/hadoop/core/hadoop-3.1.2/hadoop-3.1.2.tar.gz
  4. Set HADOOP_HOME=/usr/local/hadoop
  5. RUN chmod +x "${HADOOP_HOME}"/sbin/start-dfs.sh
  6. CMD bash "${HADOOP_HOME}"/sbin/start-dfs.sh
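
A minimal sketch of how these six steps could be assembled into one Dockerfile — reconstructed from the list above as an assumption, not the asker's exact file; the extraction commands and the extra apk packages are illustrative:

    # Sketch reconstructed from the steps above (details are assumptions)
    FROM alpine:3.8

    # bash is not in the alpine base image, but the CMD below invokes it;
    # a JDK and SSH setup would also be needed to actually run HDFS (omitted)
    RUN apk add --no-cache bash wget ca-certificates

    # Steps 2 and 4: Hadoop environment variables (only those mentioned above)
    ENV HADOOP_HOME=/usr/local/hadoop
    ENV PATH="${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:${PATH}"

    # Step 3: download and unpack Hadoop 3.1.2 into HADOOP_HOME
    RUN wget https://archive.apache.org/dist/hadoop/core/hadoop-3.1.2/hadoop-3.1.2.tar.gz \
     && tar -xzf hadoop-3.1.2.tar.gz -C /usr/local \
     && mv /usr/local/hadoop-3.1.2 "${HADOOP_HOME}" \
     && rm hadoop-3.1.2.tar.gz

    # Steps 5 and 6: make the start script executable and launch HDFS at start-up
    RUN chmod +x "${HADOOP_HOME}"/sbin/start-dfs.sh
    CMD bash "${HADOOP_HOME}"/sbin/start-dfs.sh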

    $ docker build -t hadoop-local .    (hadoop-local is the image name)

    Successfully built 2e9bd7068a41
    Successfully tagged hadoop-local:latest
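
The run command itself is not quoted in the question; a plain invocation along these lines is assumed (flags are illustrative):

    $ docker run -it hadoop-local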

But when I run it, the errors below are thrown:

WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
Starting namenodes on [hadoop-master]
/usr/local/hadoop/bin/hdfs: /usr/local/hadoop/bin/../libexec/hadoop-config.sh: line 46: syntax error: bad substitution
Starting datanodes
/usr/local/hadoop/bin/hdfs: /usr/local/hadoop/bin/../libexec/hadoop-config.sh: line 46: syntax error: bad substitution
Starting secondary namenodes [ca220cce64f9]
/usr/local/hadoop/bin/hdfs: /usr/local/hadoop/bin/../libexec/hadoop-config.sh: line 46: syntax error: bad substitution
WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.

The configuration follows the documentation at https://www.linode.com/docs/databases/hadoop/how-to-install-and-set-up-hadoop-cluster/.

docker hadoop bigdata devops
1 Answer

0 votes

My finding is that the problem depends on the Hadoop version used:

  - 3.1.+ : the "syntax error: bad substitution" at line 46 of the same script (the issue shown above)

  - 2.6.0 : a segmentation fault when the daemon script launches, e.g.

    localhost: /hadoop-2.7.4/sbin/hadoop-daemon.sh: line 131:   227 Segmentation fault      nohup nice -n $HADOOP_NICENESS $hdfsScript --config $HADOOP_CONF_DIR $command "$@" > "$log" 2>&1 < /dev/null

    (https://github.com/docker/for-mac/issues/2492)

I could not find the root cause, but version 2.6.4 worked for me.
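
For context on the error text itself (offered as an illustration, not as a verified root cause for the version behaviour above): "syntax error: bad substitution" is what BusyBox ash reports when it hits a parameter expansion it does not support; alpine uses ash as /bin/sh and does not ship bash by default, while Hadoop 3.x shell scripts are written for bash. A minimal demonstration, using ${var,,} purely as an example of a bash-only expansion — whether line 46 of hadoop-config.sh contains this exact construct is not checked here:

    # BusyBox ash (alpine's /bin/sh) rejects the bash-only expansion ${var,,}
    # and reports "syntax error: bad substitution":
    $ docker run --rm alpine:3.8 sh -c 'x=ABC; echo "${x,,}"'

    # The same expansion under a real bash prints "abc":
    $ docker run --rm ubuntu:18.04 bash -c 'x=ABC; echo "${x,,}"'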

System configuration:

Windows 10 64-bit, Docker Desktop
