
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): StackOverflow, original URL: http://stackoverflow.com/questions/31594937/


Error: Invalid or corrupt jarfile sbt/sbt-launch-0.13.5.jar

Tags: scala, apache-spark

Asked by user2330778

I have been trying to install Spark using the tutorial, and every time I run the command sbt/sbt assembly, I get the error "Error: Invalid or corrupt jarfile sbt/sbt-launch-0.13.5.jar".


I have tried everything: separately adding the sbt file to the sbt folder inside the Spark folder, installing sbt on its own, and checking the download and reinstalling it again, but in vain. Any advice on what I am doing wrong? Thanks.

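For what it's worth, one quick way to check the download is to test whether the jar is a real archive or a saved HTML error page. This is a sketch: `is_valid_jar` is a hypothetical helper, not part of sbt, and it simply checks the zip magic bytes.

```shell
# A jar is a zip archive, so a valid one starts with the magic bytes "PK".
# An HTML error page saved under the jar's name will fail this check.
is_valid_jar() {
  [ "$(head -c 2 "$1" 2>/dev/null)" = "PK" ]
}

if is_valid_jar sbt/sbt-launch-0.13.5.jar; then
  echo "jar looks like a real archive"
else
  echo "jar is corrupt or an HTML error page - re-download it"
fi
```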

Answered by Frozenfire

OK, after playing around for a while I finally got it, and hopefully this will work for you as well. That tutorial builds Spark from source, but prebuilt binaries are also provided. As a note, I'm using Spark 1.2.0 (1.4.1 wouldn't work for me).


This is on Ubuntu 15.04, but it should work the same on 14.04.


1) Remove the following lines from your .bashrc:


export SCALA_HOME=/usr/local/src/scala/scala-2.10.4
export PATH=$SCALA_HOME/bin:$PATH

2) Remove and reinstall scala


sudo rm -rf /usr/local/src/scala
# The following line is only needed if you installed scala another way, if so remove the #
# sudo apt-get remove scala-library scala
wget http://www.scala-lang.org/files/archive/scala-2.11.7.deb
sudo dpkg -i scala-2.11.7.deb
sudo apt-get update
sudo apt-get install scala

3) Download the prebuilt Spark and extract it:


wget http://d3kbcqa49mib13.cloudfront.net/spark-1.2.0-bin-hadoop2.4.tgz
tar -xzvf spark-1.2.0-bin-hadoop2.4.tgz 

4) Run spark-shell


cd spark-1.2.0-bin-hadoop2.4/
./bin/spark-shell

Sources (basically what I've read from; this solution came from trial and error):


https://chongyaorobin.wordpress.com/2015/07/01/step-by-step-of-installing-apache-spark-on-apache-hadoop/
https://gist.github.com/visenger/5496675


Answered by Jyoti Gupta

If you have downloaded the Spark package from http://d3kbcqa49mib13.cloudfront.net/spark-1.1.0.tgz, then cross-check the file "sbt/sbt-launch-0.13.5.jar". If it contains only a small amount (5-6 lines) of HTML content, then you need to download the jar file manually; the HTML file just indicates that the required jar file was not found. You may use the following steps for CentOS:


  1. Download the jar manually (note: `wget` takes the output path via `-O`, not as a second argument):
    wget -O sbt/sbt-launch-0.13.5.jar http://dl.bintray.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.1/sbt-launch.jar
  2. Prevent automatic downloading of the jar file:
    sed -i '47,68s/^/#/' sbt/sbt-launch-lib.bash
  3. Install Spark again:
    sbt/sbt assembly
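As an aside, the `sed -i '47,68s/^/#/'` invocation in step 2 comments out lines 47-68 of `sbt-launch-lib.bash` (the auto-download logic; the exact line range may differ between Spark versions). The range-commenting idiom itself can be seen on a scratch file:

```shell
# Comment out only lines 2-3 of a four-line file, mirroring the
# '47,68s/^/#/' range substitution used above.
printf 'one\ntwo\nthree\nfour\n' > /tmp/sed-demo.txt
sed -i '2,3s/^/#/' /tmp/sed-demo.txt
cat /tmp/sed-demo.txt
# -> one, #two, #three, four
```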

It worked for me without altering the Scala installation. Hope it helps.


Answered by Christos Kozanitis

The sbt script does not download sbt-launch-0.13.5.jar properly, because there must be something wrong with the URLs it is using. As a result, the file that it downloads contains just an HTML header (with either a 400 or a 302 code). Until a better solution becomes available, as a workaround I would download sbt-launch-0.13.5.jar manually beforehand.


Answered by prabeesh

In the SPARK_HOME/sbt/sbt-launch-lib.bash script, replace lines 53 to 57 with the following:

SPARK_HOME/sbt/sbt-launch-lib.bash脚本中,将第 53 行到第 57 行替换为以下内容

if hash curl 2>/dev/null; then
  (curl --fail --location --silent ${URL1} > ${JAR_DL} ||\
   (rm -f "${JAR_DL}" && curl --fail --location --silent ${URL2} > ${JAR_DL})) && \
   mv "${JAR_DL}" "${JAR}"
elif hash wget 2>/dev/null; then
  (wget --quiet ${URL1} -O ${JAR_DL} ||\
   (rm -f "${JAR_DL}" && wget --quiet ${URL2} -O ${JAR_DL})) &&\
   mv "${JAR_DL}" "${JAR}"
else
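The replacement block tries the primary URL and falls back to the secondary one, preferring curl over wget. A minimal standalone sketch of that fallback idiom (the function names here are illustrative, not part of sbt-launch-lib.bash):

```shell
# Stand-in fetcher; in sbt-launch-lib.bash this role is played by
# "curl --fail" or "wget --quiet", whichever is installed.
fetch() { curl --fail --location --silent "$1"; }

# Try the primary URL; on failure, remove the partial file and try the
# secondary URL; only a completed download is moved into place.
download_with_fallback() {
  local primary="$1" secondary="$2" tmp="$3" dest="$4"
  ( fetch "$primary" > "$tmp" ||
    ( rm -f "$tmp" && fetch "$secondary" > "$tmp" ) ) &&
  mv "$tmp" "$dest"
}
```

Moving the temp file into place only after a successful fetch is what prevents the "HTML saved under the jar's name" failure mode described in the other answers.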

Then run the sbt assembly command again:


sbt/sbt assembly

The simplest method is to install sbt manually, as follows.


Download the sbt deb file:


wget http://dl.bintray.com/sbt/debian/sbt-0.13.5.deb

Then run


sudo dpkg -i sbt-0.13.5.deb
sudo apt-get update
sudo apt-get install sbt

Then build using sbt assembly instead of sbt/sbt assembly from the Spark home folder.


Answered by Pierre Cordier

@Frozenfire, I'm not sure if it's possible, but the Spark documentation Overview says:


For the Scala API, Spark 1.4.1 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).


And I wonder if it would be the reason why you have this problem:


I'm using Spark 1.2.0 just as a note (1.4.1 wouldn't work for me)


Because you run:


sudo dpkg -i scala-2.11.7.deb

which downloads and installs scala-2.11.7.


I don't know but this might be a clue !


PS1: this is more of a comment on Frozenfire's answer, but I can't comment because of a lack of reputation, and I wanted to share this.


PS2: Building for Scala 2.11
