scala - idea sbt java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license, cite the original URL, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/37849408/
idea sbt java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
Asked by Banehallow
I'm a Spark beginner. I set up my environment with "Linux + IDEA + sbt", and when I try the Spark quick start I get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
at test$.main(test.scala:11)
at test.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
The versions installed on my machine:
sbt = 0.13.11
jdk = 1.8
scala = 2.10
idea = 2016
My directory structure:
test/
  idea/
  out/
  project/
    build.properties
    plugins.sbt
  src/
    main/
      java/
      resources/
      scala/
        scala-2.10/
          test.scala
  target/
  assembly.sbt
  build.sbt
In build.properties:
sbt.version = 0.13.8
In plugins.sbt:
logLevel := Level.Warn
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
In build.sbt:
import sbt._
import Keys._
import sbtassembly.Plugin._
import AssemblyKeys._
name := "test"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.1" % "provided"
In assembly.sbt:
import AssemblyKeys._ // put this at the top of the file
assemblySettings
In test.scala:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
object test {
  def main(args: Array[String]) {
    val logFile = "/opt/spark-1.6.1-bin-hadoop2.6/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Test Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
How can I solve this problem?
Answered by Sergey
Dependencies with "provided" scope are only available during compilation and testing, and are not available at runtime or for packaging. So, instead of making an object test with a main, you should make it an actual test suite placed in src/test/scala. (If you're not familiar with unit testing in Scala, I'd suggest using ScalaTest, for example: first add a dependency on it in your build.sbt, libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % Test, and then follow the ScalaTest quick start tutorial to implement a simple spec.)
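For illustration, here is a minimal sketch of what such a test could look like, placed under src/test/scala (assuming ScalaTest 2.2.4 as in the snippet above and a local Spark master; the class name and the assertion are made up for this example):

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.FunSuite

// Hypothetical spec that runs the same word-counting logic inside a test.
class ReadmeCountSpec extends FunSuite {

  test("README.md contains lines with 'a' and 'b'") {
    val conf = new SparkConf()
      .setAppName("Test Application")
      .setMaster("local[2]") // run Spark inside the test JVM
    val sc = new SparkContext(conf)
    try {
      val logData = sc.textFile("/opt/spark-1.6.1-bin-hadoop2.6/README.md", 2).cache()
      val numAs = logData.filter(line => line.contains("a")).count()
      val numBs = logData.filter(line => line.contains("b")).count()
      assert(numAs > 0 && numBs > 0)
    } finally {
      sc.stop() // release the SparkContext even if the assertion fails
    }
  }
}

Because sbt keeps "provided" dependencies on the test classpath, sbt test can run this without touching the spark-core dependency.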
Another option, which is quite hacky, in my opinion (but does the trick nonetheless), involves removing the provided scope from your spark-core dependency in some configurations and is described in the accepted answer to this question.
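One common form of that trick (a sketch only, not necessarily the exact wording of the answer linked there) is to re-wire sbt's run task so that it uses the full compile classpath, which includes "provided" dependencies. Added to build.sbt, assuming sbt 0.13.x syntax:

// Keep spark-core "provided" for packaging, but put it back on the
// classpath that `sbt run` uses (sketch, sbt 0.13.x syntax assumed).
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated

This leaves the packaged jar unchanged while letting sbt run resolve org.apache.spark.SparkConf.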
Answered by Jared
In IntelliJ version 2018.1 there is a checkbox in the run configuration called "Include dependencies with 'Provided' scope". Checking this option solved it for me.
Answered by user3485352
I had the same issue this morning with the same error. I removed "provided" and ran sbt clean, reload, compile, package, run. I also tested it with spark-submit from the command line. That said, I think keeping "provided" is preferable: the jar carries less extra overhead, since Spark itself is not bundled.
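For completeness, a sketch of the spark-submit invocation mentioned above, with "provided" kept in build.sbt (the jar path is an assumption based on the plain sbt package output for Scala 2.10 and the question's project name; adjust it if you use sbt-assembly):

/opt/spark-1.6.1-bin-hadoop2.6/bin/spark-submit \
  --class test \
  --master local[2] \
  target/scala-2.10/test_2.10-1.0.jar

Here spark-submit supplies the Spark classes itself at runtime, which is exactly the situation the "provided" scope is meant for.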

