scala - "./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project
Original URL: http://stackoverflow.com/questions/21245669/
Warning: these are provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): Stack Overflow
"./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project
Asked by deepblue
I'm having trouble installing Apache Spark on Ubuntu 13.04. I'm using spark-0.8.1-incubating, and both ./sbt/sbt update and ./sbt/sbt compile work fine. However, when I do a ./sbt/sbt assembly, I get the following error:
[info] Set current project to default-289e76 (in build file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
I googled for stuff related to this but couldn't find anything useful. Any guidance would be much appreciated.
Answered by Josh Rosen
The "Set current project to default-289e76" message suggests that sbt was called from outside of the Spark sources directory:
$ /tmp ./spark-0.8.1-incubating/sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Set current project to default-d0f036 (in build file:/private/tmp/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
Running ./sbt/sbt assembly works fine from the spark-0.8.1-incubating directory (note the log output showing that the current project was set correctly):
$ spark-0.8.1-incubating sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project/project
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project
[info] Set current project to root (in build file:/private/tmp/spark-0.8.1-incubating/)
...
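Before invoking sbt, it can help to confirm that the shell's working directory really is the Spark source root. A minimal sketch of that check (directory names follow the logs above; substitute wherever you unpacked the release):

$ pwd                   # should print the Spark source root, e.g. .../spark-0.8.1-incubating
$ ls project sbt        # the build definition and the bundled sbt launcher should both be here
$ ./sbt/sbt assembly    # sbt now loads Spark's build, which defines the assembly task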
Answered by swartzrock
You typed "abt" twice, but shouldn't that be "sbt"? Apache Spark has its own copy of sbt, so make sure you're running Spark's version to pick up the "assembly" plugin among other customizations.
To run the Spark installation of sbt, go to the Spark directory and run ./sbt/sbt assembly.
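Putting the two answers together, a minimal sketch of the fix (the path below is taken from the question's log output; substitute your own unpack location):

$ cd /node-insights/server/lib/spark-0.8.1-incubating    # change into the Spark source root
$ ./sbt/sbt assembly                                     # run Spark's bundled sbt so its assembly plugin is picked up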

