
Note: this content is taken from a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must attribute it to the original authors (not this site). Original question: http://stackoverflow.com/questions/34047332/


Unresolved dependency issue when compiling spark project with sbt

scala, apache-spark, sbt

Asked by Pop

I am trying to compile, with sbt 0.13.8, a very simple Spark project whose only function is the following:

Test.scala

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

The build.sbt file in the project root is as follows:

name := "Test"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2"

resolvers ++= Seq(
  "Apache Repository" at "https://repository.apache.org/content/repositories/releases/",
  "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/",
  Resolver.sonatypeRepo("public")
)

The error returned by sbt compile is:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: oro#oro;2.0.8: configuration not found in oro#oro;2.0.8: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.11;1.5.2 compile
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn] 
[warn]  Note: Unresolved dependencies path:
[warn]      oro:oro:2.0.8
[warn]        +- org.apache.spark:spark-core_2.11:1.5.2 (/home/osboxes/Documents/bookings/Test/build.sbt#L7-8)
[warn]        +- default:test_2.11:1.0
sbt.ResolveException: unresolved dependency: oro#oro;2.0.8: configuration not found in oro#oro;2.0.8: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.11;1.5.2 compile
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:291)
    at sbt.IvyActions$$anonfun$updateEither.apply(IvyActions.scala:188)
    at sbt.IvyActions$$anonfun$updateEither.apply(IvyActions.scala:165)
    at sbt.IvySbt$Module$$anonfun$withModule.apply(Ivy.scala:155)
    at sbt.IvySbt$Module$$anonfun$withModule.apply(Ivy.scala:155)
    at sbt.IvySbt$$anonfun$withIvy.apply(Ivy.scala:132)
    at sbt.IvySbt.sbt$IvySbt$$action(Ivy.scala:57)
    at sbt.IvySbt$$anon.call(Ivy.scala:65)
    at xsbt.boot.Locks$GlobalLock.withChannel(Locks.scala:93)
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries(Locks.scala:78)
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock.apply(Locks.scala:97)
    at xsbt.boot.Using$.withResource(Using.scala:10)
    at xsbt.boot.Using$.apply(Using.scala:9)
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
    at xsbt.boot.Locks$.apply0(Locks.scala:31)
    at xsbt.boot.Locks$.apply(Locks.scala:28)
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:65)
    at sbt.IvySbt.withIvy(Ivy.scala:127)
    at sbt.IvySbt.withIvy(Ivy.scala:124)
    at sbt.IvySbt$Module.withModule(Ivy.scala:155)
    at sbt.IvyActions$.updateEither(IvyActions.scala:165)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work.apply(Defaults.scala:1369)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work.apply(Defaults.scala:1365)
    at sbt.Classpaths$$anonfun$doWork$$anonfun.apply(Defaults.scala:1399)
    at sbt.Classpaths$$anonfun$doWork$$anonfun.apply(Defaults.scala:1397)
    at sbt.Tracked$$anonfun$lastOutput.apply(Tracked.scala:37)
    at sbt.Classpaths$$anonfun$doWork.apply(Defaults.scala:1402)
    at sbt.Classpaths$$anonfun$doWork.apply(Defaults.scala:1396)
    at sbt.Tracked$$anonfun$inputChanged.apply(Tracked.scala:60)
    at sbt.Classpaths$.cachedUpdate(Defaults.scala:1419)
    at sbt.Classpaths$$anonfun$updateTask.apply(Defaults.scala:1348)
    at sbt.Classpaths$$anonfun$updateTask.apply(Defaults.scala:1310)
    at scala.Function1$$anonfun$compose.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$$anonfun$apply.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$$anonfun$apply.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$$anonfun.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[error] (*:update) sbt.ResolveException: unresolved dependency: oro#oro;2.0.8: configuration not found in oro#oro;2.0.8: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.11;1.5.2 compile

How could I solve this dependency issue?

EDIT

I have followed @mark91's advice:

  • change the Scala version to 2.10.5
  • change the Spark dependency to libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0-cdh5.3.2" % "provided".

However, I still get an unresolved dependency on org.scala-lang#scala-library;2.10.4:

 [error] sbt.ResolveException: unresolved dependency: org.scala-lang#scala-library;2.10.4: configuration not found in org.scala-lang#scala-library;2.10.4: 'master(compile)'. Missing configuration: 'compile'. It was required from org.apache.spark#spark-core_2.10;1.2.0-cdh5.3.2 compile

Do you have any idea why I get this problem?

Answer by mgaido

Your problem is that Spark is built with Scala 2.10, so you should use version 2.10 of Scala instead of 2.11.

Example:

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.2.0-cdh5.3.2" % "provided"


resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

Answer by nDakota

I had a similar dependency issue and solved it by reloading plugins and updating dependencies. I think your dependency issue is due to the Ivy cache. Normally, if no dependency management configuration has changed since the last successful resolution and the retrieved files are still present, sbt does not ask Ivy to perform resolution again.

Try running:

sbt reload plugins
sbt update
sbt reload

If that doesn't work, follow the instructions at http://www.scala-sbt.org/0.13/docs/Dependency-Management-Flow.html

Answer by Aman Adhikari

Same reason as @mgaido: using a different version of Scala causes this problem. But sometimes, even after we later change to the correct version of Scala, the cache might still hold the older artifacts. In that case we can delete the ~/.ivy2 folder.
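
As a sketch of that cleanup, assuming a Unix-like shell and the default Ivy cache location:

# drop all cached Ivy metadata and jars; ~/.ivy2/local (locally published artifacts) is left intact
rm -rf ~/.ivy2/cache

# re-resolve everything from the configured resolvers
sbt update

If you would rather not wipe the whole cache, removing just the directory of the offending organisation (here presumably ~/.ivy2/cache/oro) before rerunning sbt update is often enough.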