scala - How to run external jar functions in spark-shell

Disclaimer: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must keep the same license and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/40254319/


How to run external jar functions in spark-shell

scala, apache-spark

Asked by reza

I created a jar package from a project with this file tree:

build.sbt
src/main
src/main/scala
src/main/scala/Tester.scala
src/main/scala/main.scala

where Tester is a class with one function (named print()) and main holds an object that runs it and prints "Hi!" (taken from the Spark documentation). I built the jar file with sbt successfully and it worked well with spark-submit.
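For reference, here is a minimal sketch of what the two source files might have looked like; the exact bodies are my assumption, since the question only names the class and its function:

// src/main/scala/Tester.scala (hypothetical reconstruction)
class Tester {
  def print(): Unit = println("Hi!")
}

// src/main/scala/main.scala (hypothetical reconstruction)
object main {
  def main(args: Array[String]): Unit = new Tester().print()
}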

Now I want to add it to spark-shell and use the Tester class to create objects, etc. I added the jar file to spark-defaults.conf, but:

scala> val t = new Tester();
<console>:23: error: not found: type Tester
       val t = new Tester();

Answered by Sandeep Purohit

You can try providing the jars with an argument, as below:

./spark-shell --jars pathOfJarsWithCommaSeparated
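For example, with illustrative paths (multiple jars are separated by commas):

./spark-shell --jars /path/to/tester.jar
./spark-shell --jars /path/to/tester.jar,/path/to/helper.jar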

Or you can add the following configuration to your spark-defaults.conf, but remember to remove the .template suffix from the end of the spark-defaults file name. Note that, unlike --jars, extraClassPath takes entries separated by the platform's classpath separator (: on Linux, ; on Windows), not commas:

spark.driver.extraClassPath  /path/to/jar1.jar:/path/to/jar2.jar
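If executors also need the class (for example, when it is used inside transformations), a matching executor-side setting exists; this line is an illustrative addition, not part of the original answer:

spark.executor.extraClassPath  /path/to/jar1.jar:/path/to/jar2.jar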

Answered by Sam Malayek

If you want to add a .jar to the classpath after you've entered spark-shell, use :require, like so:

scala> :require /path/to/file.jar
Added '/path/to/file.jar' to classpath.
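After that the class should resolve; a sketch of the expected session, assuming Tester and its print() method behave as described in the question:

scala> val t = new Tester()
scala> t.print()
Hi!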

Answered by Priyanshu Singh

I tried two options and both worked for me.


  1. spark-shell --jars <path of jar>
  2. Open spark-shell and type :help to see all the available commands, then use the following to add a jar (a combined end-to-end sketch follows this list):

    :require /full_path_of_jar
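Putting it together, a typical end-to-end sequence might look like this; the jar path produced by sbt package is illustrative and depends on your project name and Scala version:

$ sbt package
$ spark-shell --jars target/scala-2.11/tester_2.11-1.0.jar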

