Disclaimer: this page is an English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original source: http://stackoverflow.com/questions/31853723/

Jars for Hadoop MapReduce

Tags: java, hadoop, jar, mapreduce

Asked by Kaushik Lele

I am following this Hadoop MapReduce tutorial given by Apache. The Java code given there uses these Apache Hadoop classes:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
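
These classes fit together as in the tutorial's WordCount example. For reference, here is a condensed sketch of that program (close to the tutorial's listing, but retyped, so treat the details as illustrative rather than the tutorial's exact code):

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Mapper: emits (word, 1) for every token in an input line
        public static class TokenizerMapper
                extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable one = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, one);
                }
            }
        }

        // Reducer: sums the per-word counts emitted by the mappers
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }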

But I could not understand where to download these jars from. On searching the internet for these classes, I could see they are available here.

But what is the formal/authentic Apache repository for these jars?

If the jars are shipped along with Hadoop, please let me know the path.

EDIT: The other question does not give clear instructions. I found the answer as follows.

This tutorial mentions:

Download Hadoop-core-1.2.1.jar, which is used to compile and execute the MapReduce program. Visit the following link http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core/1.2.1 to download the jar.

So this looks like an authentic repository.

Answered by Kaushik Lele

This tutorial mentions:

Download Hadoop-core-1.2.1.jar, which is used to compile and execute the MapReduce program. Visit the following link http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core/1.2.1 to download the jar.

So here you can find all the jars for different versions.

Answered by Madhusoodan P

The best way is to download Hadoop (3.x.y) and include the jars below from hadoop-3.x.y/share/hadoop/mapreduce:

1. hadoop-common-3.x.y.jar
2. hadoop-mapreduce-client-core-3.x.y.jar

That worked for me!

Answered by subtleseeker

Try compiling using:
javac -cp $(hadoop classpath) MapRTest.java
In most cases, the files are already present with the downloaded Hadoop. For more info, look into this.

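If you only need to confirm that $(hadoop classpath) resolves the required jars, an empty stub is enough to smoke-test compilation. The file below is a hypothetical stand-in for the MapRTest.java mentioned above, not the answerer's actual code:

    // Compiles only if the Hadoop jars are on the classpath; no logic is
    // needed for a pure compile-time check.
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class MapRTest extends Mapper<Object, Text, Text, IntWritable> {
    }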

Answered by San4musa

javac -cp /usr/hdp/2.6.2.0-205/hadoop-mapreduce/*:/usr/hdp/2.6.2.0-205/hadoop/*:. MyTest.java

Worked for me in CloudxLab.

Answered by Reda

The tutorial you are following uses Hadoop 1.0, which means the jars that you have and the ones the tutorial is using are different. If you are using Hadoop 2.X, follow a tutorial that makes use of exactly that version. You don't need to download jars from a third party; you just need to know the proper use of the API of that specific Hadoop version.

Answered by Dan Ciborowski - MSFT

Using NetBeans I create a new Maven project.

Then under project files, I open the pom.xml.

I add the following inside of the <dependencies> section:

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>0.20.2</version>
    </dependency> 

After building with dependencies I am now ready to code.
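
As a quick sanity check that Maven actually resolved the Hadoop dependency, a minimal main can print the version of the jars on the classpath. This is my own sketch, not part of the original answer; VersionInfo is a long-standing Hadoop utility class:

    import org.apache.hadoop.util.VersionInfo;

    public class HadoopVersionCheck {
        public static void main(String[] args) {
            // Prints the version of the Hadoop jars Maven pulled in
            System.out.println("Hadoop version: " + VersionInfo.getVersion());
        }
    }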

Answered by Lost Carrier

With the current version 2.7.1, I was stumbling over "Missing artifact org.apache.hadoop:hadoop-mapreduce:jar:2.7.1", but found out that this jar appears to be split up into various smaller ones.

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.1</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.1</version>
</dependency>

...worked for me (...no clue what this is meant for: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce/2.7.1/)
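
For orientation (my own summary, worth verifying against your Hadoop version): the classes imported in the question live in hadoop-common (Configuration, Path, IntWritable, Text) and hadoop-mapreduce-client-core (Job, Mapper, Reducer, and the lib.input/lib.output formats), which is why these two artifacts together replace the old monolithic jar.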

Answered by Priyanka Yemul

If you get this type of error, just type the command in the terminal:

export HADOOP_HOME=$(hadoop classpath)

Note: you have to check the Hadoop name configured in your own ~/.bashrc file. At the time of Hadoop installation we set the Hadoop and Java paths in the .bashrc file, so check there; you can see them next to the export statements.
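
If you want to confirm from Java that the variables exported in .bashrc are visible to the JVM, a tiny hypothetical helper like this works (the class and output names are mine, not from the original answer):

    public class EnvCheck {
        public static void main(String[] args) {
            // Prints the environment variables set up during Hadoop installation
            System.out.println("HADOOP_HOME = " + System.getenv("HADOOP_HOME"));
            System.out.println("JAVA_HOME   = " + System.getenv("JAVA_HOME"));
        }
    }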
