Original question: http://stackoverflow.com/questions/40863603/
Warning: these are provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): StackOverFlow
Importing my jar into the Spark shell
Asked by Todor Markov
I have a simple Scala Maven module which is part of a larger project (I created it as described here: https://www.jetbrains.com/help/idea/2016.2/creating-a-maven-module.html):
package com.myorg.simplr
import [...]
@SerialVersionUID(100L)
case class Simplr() {
//class code
}
I am trying to use this class in the Spark shell, so I built a jar file "simplr-1.0.jar" and launched the Spark shell with --jars simplr-1.0.jar.
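For reference, the full build-and-launch sequence would look something like this (a sketch, assuming the jar is produced in the module's target/ directory and spark-shell is run from the Spark installation's bin directory):

# package the module; Maven writes the jar to target/ by default
mvn clean package

# launch the Spark shell with the jar added to the classpath
./spark-shell --jars target/simplr-1.0.jar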
Then, when I try to import it, I get the following:
scala> import com.myorg.simplr.Simplr
<console>:25: error: object myorg is not a member of package com
import com.myorg.simplr.Simplr
^
How can I make the import work?
I used Maven to build, and here's my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>my-parent-project</artifactId>
        <groupId>com.myorg</groupId>
        <version>1.0</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>simplr</artifactId>
    <version>1.0</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.6.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.6.0</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>
</project>
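Note that this pom.xml has no build section, so unless the parent POM configures Scala compilation, mvn package compiles only Java sources and produces a jar with no Simplr.class in it, which would cause exactly the import error above. A minimal sketch of such a configuration, using the scala-maven-plugin (an assumption on my part; the parent POM is not shown in the question):

<build>
    <plugins>
        <!-- compile src/main/scala; without a Scala plugin, Maven packages
             a jar that contains no compiled Scala classes -->
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>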
Answered by Sandeep Purohit
Please make sure of the points below and it will work:
1. Start the Spark shell with ./spark-shell --jars jar_path.
2. Check that the class file is in the jar under the same package you are importing; open the jar and verify it.
3. After Spark starts, go to http://localhost:4040/environment/ and check whether your jar appears in the classpath entries.
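To make point 2 concrete, the jar's contents can be listed from the command line (a sketch, assuming the jar is in the current directory; the jar tool ships with the JDK):

# list the jar's entries and look for the compiled class
jar tf simplr-1.0.jar | grep Simplr
# expected output if packaging worked:
# com/myorg/simplr/Simplr.class

If no such entry appears, the class was never compiled into the jar, which matches the "object myorg is not a member of package com" error.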

