Scala sbt error when running Spark hello world code?

Note: this page is a translation of a popular StackOverflow Q&A, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/43873639/

sbt got an error when running Spark hello world code?

scala, apache-spark, sbt

Asked by ca9163d9

I got the following error when running a spark hello world program.

[info] Updating {file:/C:/Users/user1/IdeaProjects/sqlServer/}sqlserver...
[info] Resolving org.apache.spark#spark-core_2.12;2.1.1 ...
[warn]  module not found: org.apache.spark#spark-core_2.12;2.1.1
[warn] ==== local: tried
[warn]   C:\Users\user1\.ivy2\local\org.apache.spark\spark-core_2.12.1.1\ivys\ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/2.1.1/spark-core_2.12-2.1.1.pom
[warn] ==== local-preloaded-ivy: tried
[warn]   C:\Users\user1\.sbt\preloaded\org.apache.spark\spark-core_2.12.1.1\ivys\ivy.xml
[warn] ==== local-preloaded: tried
[warn]   file:/C:/Users/user1/.sbt/preloaded/org/apache/spark/spark-core_2.12/2.1.1/spark-core_2.12-2.1.1.pom
[info] Resolving jline#jline;2.14.3 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.12;2.1.1: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]          org.apache.spark:spark-core_2.12:2.1.1 (C:\Users\user1\IdeaProjects\sqlServer\build.sbt#L7-8)
[warn]            +- mpa:mpa_2.12:1.0
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.1.1: not found
[error] Total time: 1 s, completed May 9, 2017 11:05:44 AM

Here is the build.sbt,

name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"

My Spark welcome message:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

Update:

I changed the build.sbt to

name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core_2.11" % "2.1.0"

But I still got

[info] Updating {file:/C:/Users/user1/IdeaProjects/sqlServer/}sqlserver...
[info] Resolving org.apache.spark#spark-core_2.11_2.11;2.1.0 ...
[warn]  module not found: org.apache.spark#spark-core_2.11_2.11;2.1.0
[warn] ==== local: tried
[warn]   C:\Users\user1\.ivy2\local\org.apache.spark\spark-core_2.11_2.11.1.0\ivys\ivy.xml
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11_2.11/2.1.0/spark-core_2.11_2.11-2.1.0.pom
[warn] ==== local-preloaded-ivy: tried
[warn]   C:\Users\user1\.sbt\preloaded\org.apache.spark\spark-core_2.11_2.11.1.0\ivys\ivy.xml
[warn] ==== local-preloaded: tried
[warn]   file:/C:/Users/user1/.sbt/preloaded/org/apache/spark/spark-core_2.11_2.11/2.1.0/spark-core_2.11_2.11-2.1.0.pom
[info] Resolving jline#jline;2.12.1 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.11_2.11;2.1.0: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]          org.apache.spark:spark-core_2.11_2.11:2.1.0 (C:\Users\user1\IdeaProjects\sqlServer\build.sbt#L7-8)
[warn]            +- mpa:mpa_2.11:1.0
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11_2.11;2.1.0: not found
[error] Total time: 1 s, completed May 9, 2017 1:01:01 PM

Answered by ktheitroadalo

You have an error in your build.sbt file; you must change %% to %:

name := "Mpa"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" % "spark-core" % "2.1.1"

%% asks sbt to append the current Scala binary version to the artifact name.

You can use spark-core_2.11 with % to get the issue solved.

// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
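
To make the difference concrete, here is a minimal sketch (assuming scalaVersion := "2.11.8"): with %% sbt appends the Scala binary version to the artifact name for you, while with % you spell the full artifact name out yourself, so the two dependency lines below resolve to the same spark-core_2.11 artifact.

scalaVersion := "2.11.8"

// Equivalent ways to pull in spark-core built for Scala 2.11:
libraryDependencies += "org.apache.spark" %% "spark-core"      % "2.1.1"  // sbt appends _2.11
libraryDependencies += "org.apache.spark" %  "spark-core_2.11" % "2.1.1"  // suffix written by hand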

Hope this helps!

Answered by u6939919

I got the same error.

build.sbt

name := "Simple Project"  
version := "1.0"  
scalaVersion := "2.12.3"  
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"  

Just change scalaVersion to 2.11.8 or lower, and it works.
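
For example, the same build.sbt with the Scala version lowered might look like this (a sketch; Spark 2.2.0 artifacts are not published for Scala 2.12):

name := "Simple Project"
version := "1.0"
scalaVersion := "2.11.8"   // was 2.12.3; Spark 2.2.0 is not published for Scala 2.12
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"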

Answered by Leon

I got the same error and resolved it with the steps below. Basically, the filename did not match the sbt configuration.
- Check the filename of the spark-core jar in $SPARK_HOME/jars (it is spark-core_2.11-2.1.1.jar).
- Install Scala 2.11.11.
- Edit build.sbt to scalaVersion := "2.11.11", as in the sketch below.
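
Putting those steps together, a build.sbt matching the installed jars might look roughly like this (a sketch, assuming the jar really is spark-core_2.11-2.1.1.jar as above):

name := "Mpa"
version := "1.0"
scalaVersion := "2.11.11"   // matches the _2.11 suffix of the installed Spark jars
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"   // matches spark-core_2.11-2.1.1.jar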

Answered by Murali Bala

This worked for me. Sample build.sbt

name := "testproj"

version := "0.1"

scalaVersion := "2.11.9"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

Answered by vijayraj34

SparkSession is available in the spark-sql library. You have to add the spark-sql dependency to the build.

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"
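
As a quick illustration, here is a minimal sketch (the object name is made up) of creating a SparkSession once spark-sql is on the classpath:

import org.apache.spark.sql.SparkSession

object SqlHello {  // hypothetical object name
  def main(args: Array[String]): Unit = {
    // SparkSession lives in spark-sql; this will not compile with spark-core alone.
    val spark = SparkSession.builder()
      .appName("SqlHello")
      .master("local[*]")
      .getOrCreate()

    println(spark.range(10).count())   // tiny sanity check: prints 10

    spark.stop()
  }
}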

Answered by the775

A version pair that works with Scala 2.11.12:

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.2",
  "org.apache.spark" %% "spark-sql" % "2.3.2"
)
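
With that pair in place, a minimal hello-world touching both libraries might look like this (a sketch; the object name and the numbers are only illustrative):

import org.apache.spark.sql.SparkSession

object HelloSpark {  // hypothetical object name
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HelloSpark")
      .master("local[*]")
      .getOrCreate()

    // spark-core: RDD API via the underlying SparkContext
    val sum = spark.sparkContext.parallelize(1 to 100).sum()
    println(s"sum = $sum")   // prints sum = 5050.0

    spark.stop()
  }
}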