UNRESOLVED DEPENDENCIES error while trying to create jar
Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverflow.
Original question: http://stackoverflow.com/questions/25285855/
Asked by Y.Prithvi
I'm trying to build a Scala jar file to run it in Spark.
I'm following this tutorial.
When trying to build the jar file using sbt as described there, I'm facing the following error:
[info] Resolving org.apache.spark#spark-core_2.10.4;1.0.2 ...
[warn] module not found: org.apache.spark#spark-core_2.10.4;1.0.2
[warn] ==== local: tried
[warn] /home/hduser/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.0.2/ivys/ivy.xml
[warn] ==== Akka Repository: tried
[warn] http://repo.akka.io/releases/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-d57abf/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[error] Total time: 2 s, completed 13 Aug, 2014 5:24:24 PM
What's the issue and how do I solve it?
The dependency issue has been resolved. Thank you, om-nom-nom. But a new error arose:
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: FAILED DOWNLOADS ::
[warn] :: ^ see resolution messages for details ^ ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[warn] :: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[warn] :: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[warn] :: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-c011e4/*:update: sbt.ResolveException: download failed: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[error] Total time: 855 s, completed 14 Aug, 2014 12:28:33 PM
Answered by om-nom-nom
You have your dependency defined as
"org.apache.spark" %% "spark-core" % "1.0.2"
That %% instructs sbt to append the current Scala version to the artifact name. Apparently, Spark was built for the whole Scala 2.10 family, without specific jars for 2.10.1, 2.10.2, ...
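Concretely, a sketch of what each form resolves to (the expansions in the comments assume scalaVersion := "2.10.4" and an sbt that appends the full Scala version, as in the log above):

// %% appends the Scala version: org.apache.spark#spark-core_2.10.4;1.0.2 (does not exist)
"org.apache.spark" %% "spark-core" % "1.0.2"
// plain % keeps the name exactly as written: org.apache.spark#spark-core_2.10;1.0.2 (published)
"org.apache.spark" % "spark-core_2.10" % "1.0.2"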
So all you have to do is redefine it as:
"org.apache.spark" % "spark-core_2.10" % "1.0.2"
Answered by Danylo Zherebetskyy
I had the same issue. It looks like some version/compilation combinations have bugs.
For me, the following build.sbt worked fine:
name := "My Project"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.2"
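As a usage note: with this build file, running the command below from the project root builds the jar under target/scala-2.11/ (the standard sbt output layout for a 2.11.x scalaVersion).

sbt package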
Hope it helps
Answered by nat
spark-core_2.10.4;1.0.2 means that it is built on top of Scala version 2.10.4, so you have to specify scalaVersion := "2.10.4" in your build file. Please check your .sbt file and change it accordingly.
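In other words, the Scala version in the build file has to match what the artifact was published for. A minimal sketch, assuming a reasonably recent sbt (0.13 or later), where %% appends only the binary version:

scalaVersion := "2.10.4"
// with a recent sbt, %% appends the binary version, so this resolves
// to spark-core_2.10 rather than the non-existent spark-core_2.10.4
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.2"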
Answered by yang jun
libraryDependencies ++= Seq(
  // explicitly pin the Jetty orbit artifacts that failed to download above
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016",
  "org.eclipse.jetty.orbit" % "javax.transaction" % "1.1.1.v201105210645",
  "org.eclipse.jetty.orbit" % "javax.mail.glassfish" % "1.4.1.v201005082020"
)
Answered by imonaboat
How can you change the current dependencies? I mean, when you run sbt package for a build file like:
name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
sbt will start resolving and downloading all kinds of dependencies. But if you see it failing on a dependency that is no longer in the Maven repo, what do you do? Where can you change the dependencies it tries?
@OP: The problem is that your sbt is outdated. If you installed it using apt, you can use apt to remove it as well. In any case, download the latest .tgz (not the .deb), unpack it, and add the sbt/bin/ folder to your PATH in .bashrc. I noticed that older sbt releases (the .deb and apt-get versions) work with older Scala versions; you either need to manually add or change the dependencies that the older sbt is trying to find, or simply switch to the latest sbt.
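As for where the dependencies sbt tries can be changed: resolution is driven entirely by the build definition, so extra repositories and pinned versions both go in build.sbt. A sketch (the repository URL is the Akka one from the resolution log above; dependencyOverrides is a standard sbt setting for forcing a transitive dependency's version):

// add a repository for artifacts that are not in Maven Central
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
// force the version of a transitive dependency that fails to resolve
dependencyOverrides += "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016"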

