bash - spark-submit: command not found

Disclaimer: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you use or share it, you must follow the same CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/45745850/


spark-submit: command not found

Tags: bash, macos, shell, environment-variables, fish

Asked by Fisher Coder

A very simple question:

I am trying to use a bash script to submit Spark jobs, but somehow it keeps complaining that it cannot find the spark-submit command. Yet when I copy the command out and run it directly in my terminal, it runs fine.

My shell is the fish shell; here is what I have in my fish config, ~/.config/fish/config.fish:

alias spark-submit='/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit'

Here's my bash script:

#!/usr/bin/env bash


SUBMIT_COMMAND="HADOOP_USER_NAME=hdfs spark-submit \
      --master $MASTER \
      --deploy-mode client \
      --driver-memory $DRIVER_MEMORY \
      --executor-memory $EXECUTOR_MEMORY \
      --num-executors $NUM_EXECUTORS \
      --executor-cores $EXECUTOR_CORES \
      --conf spark.shuffle.compress=true \
      --conf spark.network.timeout=2000s \
      $DEBUG_PARAM \
      --class com.fisher.coder.OfflineIndexer \
      --verbose \
      $JAR_PATH \
      --local $LOCAL \
      $SOLR_HOME \
      --solrconfig 'resource:solrhome/' \
      $ZK_QUORUM_PARAM \
      --source $SOURCE \
      --limit $LIMIT \
      --sample $SAMPLE \
      --dest $DEST \
      --copysolrconfig \
      --shards $SHARDS \
      $S3_ZK_ZNODE_PARENT \
      $S3_HBASE_ROOTDIR \
      "

eval "$SUBMIT_COMMAND"

What I've tried: I can run this command perfectly fine in my Mac OS X fish shell when I copy it out literally and run it directly. However, what I want to achieve is to be able to run ./submit.sh -local, which executes the script above.

Any clues please?

Answered by Kurtis Rader

You seem to be confused about what a fish alias is. When you run this:

alias spark-submit='/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit'

You are actually doing this:

function spark-submit
   /Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit $argv
end
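
A quick way to confirm this is fish's own type builtin, which reports how a name resolves:

type spark-submit
# in an interactive fish session this should report spark-submit as a function
# (created by the alias above), not an external command found on $PATH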

That is, you are defining a fish function. Your bash script has no knowledge of that function. You need to either put that path in your $PATH variable or put a similar alias command in your bash script.

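A minimal sketch of the $PATH fix inside the bash script, assuming the Spark location from the question (the /Users/MY_NAME/... path is taken from the alias above; adjust it to your install):

#!/usr/bin/env bash

# Make the Spark binaries visible to this non-interactive bash script;
# the fish alias in ~/.config/fish/config.fish is never loaded here.
export PATH="/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin:$PATH"

spark-submit --version   # now resolves via PATH

# Alternatively, skip PATH and call the binary through an explicit variable:
SPARK_SUBMIT="/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit"
"$SPARK_SUBMIT" --version

One caveat with the alias route: bash only expands aliases in non-interactive scripts when shopt -s expand_aliases is set, so exporting PATH (or using the full path) is usually the simpler fix.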