
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must likewise follow the CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/15211074/

How to add Spark to Maven project in Eclipse?

Tags: eclipse, maven, spark-java

Asked by beam022

I would like to start a Spark project in Eclipse using Maven. I've installed m2eclipse and I have a working HelloWorld Java application in my Maven project.


I would like to use the Spark framework and I'm following the directions from the official site. I've added the Spark repository to my pom.xml:


<repository>
      <id>Spark repository</id>
      <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
</repository>

And then the dependency:


<dependency>
      <groupId>spark</groupId>
      <artifactId>spark</artifactId>
      <version>0.9.9.4-SNAPSHOT</version>
</dependency>

But I'm getting an error in Eclipse:


Missing artifact spark:spark:jar:0.9.9.4-SNAPSHOT

How can I resolve this issue? I don't want to download Spark's jar file and place it in the local repository manually.


This is my pom.xml file:


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.myproject</groupId>
  <artifactId>Spark1</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>Spark1</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <repository>
      <id>Spark repository</id>
      <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
  </repository>

  <dependencies>
<!--     (...) -->

    <dependency>
      <groupId>spark</groupId>
      <artifactId>spark</artifactId>
      <version>0.9.9.4-SNAPSHOT</version>
    </dependency>

  </dependencies>

</project>

Accepted answer by André Stannek

The repository block needs to be wrapped in a repositories block; repositories is a top-level element of the pom, alongside dependencies:


<repositories>
    <repository>
        <id>Spark repository</id>
        <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
    </repository>
</repositories>

Answered by Sumit Ramteke

Currently, no repository needs to be added to load the Spark library.


You just need to add


<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.6.0</version>
</dependency>

And that's it.

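As a quick smoke test, a minimal application might look like this (a sketch assuming spark-core 2.6.0 as above; the class name is my own):

import static spark.Spark.get;

public class HelloWorld {
    public static void main(String[] args) {
        // Starts the embedded Jetty server on the default port 4567
        // and maps GET / to a handler returning plain text.
        get("/", (request, response) -> "Hello World");
    }
}

Run the main method and open http://localhost:4567/ in a browser; 4567 is the framework's default port.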

Useful tutorials to play with are here.


Answered by Sankara

The reason for the failure is that 0.9.9.4-SNAPSHOT is not available. Below is the list of snapshots that are available; use one of them based on your requirement.


0.9.8-SNAPSHOT/ Sat May 21 21:54:23 UTC 2011
0.9.9-SNAPSHOT/ Mon May 23 10:57:38 UTC 2011
0.9.9.1-SNAPSHOT/ Thu May 26 09:47:03 UTC 2011
0.9.9.3-SNAPSHOT/ Thu Sep 01 07:53:59 UTC 2011


Thanks, Sankara Reddy


Answered by Jorgesys

The latest versions (2.1 and later) of Spark only need the dependency defined inside the pom.xml file:


<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.1</version>
</dependency>

The repository definition is no longer required.

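Note that the 2.x line of the Spark web framework requires Java 8, since routes are written as lambdas.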

Answered by spartan

Use this latest version from the central repository: http://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10/1.6.0


<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
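
One caution: org.apache.spark:spark-core is Apache Spark, the distributed data-processing engine, not the Spark web framework (com.sparkjava) that the question's spark-java tag refers to. If Apache Spark is what you actually want, a minimal local smoke test might look like this (a sketch assuming the 1.6.0 dependency above; the class name is my own):

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkSmokeTest {
    public static void main(String[] args) {
        // local[*] runs Spark inside this JVM on all available cores,
        // so no cluster is needed to verify the dependency resolves.
        SparkConf conf = new SparkConf().setAppName("SparkSmokeTest").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        long count = sc.parallelize(Arrays.asList(1, 2, 3, 4)).count();
        System.out.println("count = " + count);
        sc.stop();
    }
}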

Answered by Apurva Singh

Use this, and also make sure you change the Spark library to version 2.11.x in the Eclipse project build path.


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.spark-scala</groupId>
    <artifactId>spark-scala</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>${project.artifactId}</name>
    <description>Spark in Scala</description>
    <inceptionYear>2010</inceptionYear>

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.tools.version>2.10</scala.tools.version>
        <!-- Put the Scala version of the cluster -->
        <scala.version>2.10.4</scala.version>
    </properties>

    <!-- repository to add org.apache.spark -->
    <repositories>
        <repository>
            <id>cloudera-repo-releases</id>
            <url>https://repository.cloudera.com/artifactory/repo/</url>
        </repository>
    </repositories>

    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.1</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.13</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>

            <!-- "package" command plugin -->
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.4.1</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>1.2.1</version>
        </dependency>
    </dependencies>
</project>
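
One caveat: the scala.version property in this pom still reads 2.10.4 while the dependency is spark-core_2.11; to match the 2.11.x advice above, that property would presumably need to be bumped to a 2.11.x release as well.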

Answered by ptsering

I ran into the same issue because I initially started with a different repository URL for Spark, and then changed the repository URL to use an earlier version. Somehow the change didn't seem to take effect until I also changed the repository ID, so try changing the repository ID.
This could be a bug in Maven, because running Maven from the console also couldn't resolve the dependency without updating the ID.

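A guess as to why the ID change mattered: Maven caches failed snapshot lookups per repository ID under ~/.m2/repository, so a renamed ID forces a fresh lookup. If that is the cause, forcing Maven to re-check remote repositories should have the same effect:

mvn -U clean install

The -U (--update-snapshots) flag makes Maven re-check remote repositories for updated snapshots instead of trusting the cached result.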

Answered by Bravo

Please add the repository tag inside the repositories tag, like below:


<repositories>
        <repository>
            <id>Spark repository</id>
            <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
    </repositories>