
Can't echo an env var in a docker container

Stack Overflow user
Asked on 2019-10-10 07:56:58
Answers: 1 · Views: 85 · Followers: 0 · Votes: 0

I built a docker image that contains many environment variables, including one named SPARK_HOME. Here is the line in the Dockerfile that declares it:

ENV SPARK_HOME="/opt/spark"

When I issue docker run, I can see that the env var exists, but any reference to it returns nothing, as a simple echo shows:

$ docker run --rm myimage /bin/bash -c "env | grep SPARK_HOME ; echo SPARK_HOME=$SPARK_HOME"
SPARK_HOME=/opt/spark
SPARK_HOME=
$

Am I missing something obvious? Why can't I reference the value of an env var that clearly exists?

Edit 1: As requested in the comments, the Dockerfile contents are included below the break.

Edit 2: I discovered that if I run the container interactively, I can reference the var:

$ docker run --rm -it myimage /bin/bash 
root@419dd5f13a6f:/tmp# echo $SPARK_HOME
/opt/spark

Dockerfile:
FROM our.internal.artifact.store/python:3.7-stretch

WORKDIR /tmp

ENV SPARK_VERSION=2.2.1
ENV HADOOP_VERSION=2.8.4

ARG ARTIFACTORY_USER
ARG ARTIFACTORY_ENCRYPTED_PASSWORD
ARG ARTIFACTORY_PATH=our.internal.artifact.store/artifactory/generic-dev/ceng/external-dependencies
ARG SPARK_BINARY_PATH=https://${ARTIFACTORY_PATH}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz
ARG HADOOP_BINARY_PATH=https://${ARTIFACTORY_PATH}/hadoop-${HADOOP_VERSION}.tar.gz


ADD files/apt-transport-https_1.4.8_amd64.deb /tmp

RUN echo "deb https://username:password@our.internal.artifact.store/artifactory/debian-main-remote stretch main" >/etc/apt/sources.list.d/main.list &&\
    echo "deb https://username:password@our.internal.artifact.store/artifactory/maria-db-debian stretch main" >>/etc/apt/sources.list.d/main.list &&\
    echo 'Acquire::CompressionTypes::Order:: "gz";' > /etc/apt/apt.conf.d/02update &&\
    echo 'Acquire::http::Timeout "10";' > /etc/apt/apt.conf.d/99timeout &&\
    echo 'Acquire::ftp::Timeout "10";' >> /etc/apt/apt.conf.d/99timeout &&\
    dpkg -i /tmp/apt-transport-https_1.4.8_amd64.deb &&\
    apt-get install --allow-unauthenticated -y /tmp/apt-transport-https_1.4.8_amd64.deb &&\
    apt-get update --allow-unauthenticated -y -o Dir::Etc::sourcelist="sources.list.d/main.list" -o Dir::Etc::sourceparts="-" -o APT::Get::List-Cleanup="0"


RUN apt-get update && \
    apt-get -y install default-jdk

# Detect JAVA_HOME and export in bashrc.
# This will result in something like this being added to /etc/bash.bashrc
#   export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
RUN echo export JAVA_HOME="$(readlink -f /usr/bin/java | sed "s:/jre/bin/java::")" >> /etc/bash.bashrc

# Configure Spark-${SPARK_VERSION}
RUN curl --fail -u "${ARTIFACTORY_USER}:${ARTIFACTORY_ENCRYPTED_PASSWORD}" -X GET "${SPARK_BINARY_PATH}" -o /opt/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz \
    && cd /opt \
    && tar -xvzf /opt/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz \
    && rm spark-${SPARK_VERSION}-bin-hadoop2.7.tgz \
    && ln -s spark-${SPARK_VERSION}-bin-hadoop2.7 spark \
    && sed -i '/log4j.rootCategory=INFO, console/c\log4j.rootCategory=CRITICAL, console' /opt/spark/conf/log4j.properties.template \
    && mv /opt/spark/conf/log4j.properties.template /opt/spark/conf/log4j.properties \
    && mkdir /opt/spark-optional-jars/ \
    && mv /opt/spark/conf/spark-defaults.conf.template /opt/spark/conf/spark-defaults.conf \
    && printf "spark.driver.extraClassPath /opt/spark-optional-jars/*\nspark.executor.extraClassPath /opt/spark-optional-jars/*\n">>/opt/spark/conf/spark-defaults.conf \
    && printf "spark.driver.extraJavaOptions -Dderby.system.home=/tmp/derby" >> /opt/spark/conf/spark-defaults.conf

# Configure Hadoop-${HADOOP_VERSION}
RUN curl --fail -u "${ARTIFACTORY_USER}:${ARTIFACTORY_ENCRYPTED_PASSWORD}" -X GET "${HADOOP_BINARY_PATH}" -o /opt/hadoop-${HADOOP_VERSION}.tar.gz \
    && tar -xvzf /opt/hadoop-${HADOOP_VERSION}.tar.gz \
    && rm /opt/hadoop-${HADOOP_VERSION}.tar.gz \
    && ln -s hadoop-${HADOOP_VERSION} hadoop

# Set Environment Variables.
ENV SPARK_HOME="/opt/spark" \
    HADOOP_HOME="/opt/hadoop" \
    PYSPARK_SUBMIT_ARGS="--master=local[*] pyspark-shell --executor-memory 1g --driver-memory 1g --conf spark.ui.enabled=false spark.executor.extrajavaoptions=-Xmx=1024m" \
    PYTHONPATH="/opt/spark/python:/opt/spark/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH" \
    PATH="$PATH:/opt/spark/bin:/opt/hadoop/bin" \
    PYSPARK_DRIVER_PYTHON="/usr/local/bin/python" \
    PYSPARK_PYTHON="/usr/local/bin/python"

# Upgrade pip and setuptools
RUN pip install --index-url https://username:password@our.internal.artifact.store/artifactory/api/pypi/pypi-virtual-all/simple --upgrade pip setuptools

# Install core python packages
RUN pip install --index-url https://username:password@our.internal.artifact.store/artifactory/api/pypi/pypi-virtual-all/simple pipenv

ADD Pipfile /tmp

ADD pysparkdf_helloworld.py /tmp
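
One way to confirm that the variable really is baked into the image metadata, independent of any shell quoting, is to inspect the image config. A quick check against the myimage tag used above should print the baked-in value:

$ docker inspect --format '{{range .Config.Env}}{{println .}}{{end}}' myimage | grep SPARK_HOME
SPARK_HOME=/opt/spark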

1 Answer

Stack Overflow user

Accepted answer

Posted on 2019-10-10 08:43:26

OK, contrary to my comment, this isn't strange at all.

The problem is that, because the command string is double-quoted, your local shell interpolates $SPARK_HOME before anything is sent to the container. The variable is empty on your host, so you are effectively calling echo SPARK_HOME=.

To fix it, just escape the env var in the command: $SPARK_HOME -> \$SPARK_HOME, so that the expansion happens inside the container (both variants are sketched below).
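
A minimal sketch of the two usual forms, reusing the myimage tag from the question; either way the expansion is deferred to the shell inside the container:

# Escape the dollar sign so the local shell passes it through literally
$ docker run --rm myimage /bin/bash -c "env | grep SPARK_HOME ; echo SPARK_HOME=\$SPARK_HOME"

# Or single-quote the command string; single quotes suppress local expansion entirely
$ docker run --rm myimage /bin/bash -c 'env | grep SPARK_HOME ; echo SPARK_HOME=$SPARK_HOME'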

Demo:

$ export SPARK_HOME=foo
$ docker run ... /bin/bash -c "env | grep SPARK_HOME ; echo SPARK_HOME=$SPARK_HOME"                                                            
> SPARK_HOME=/opt/spark
> SPARK_HOME=foo
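
With the backslash in place, the local export no longer leaks through. A shell-free cross-check is to run printenv directly, which avoids host-side quoting altogether and should print the container's value:

$ docker run --rm myimage printenv SPARK_HOME
/opt/spark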
Votes: 1
Original page content provided by Stack Overflow; translation supported by Tencent Cloud Xiaowei.
Original link: https://stackoverflow.com/questions/58318041