Eclipse plugin for Hadoop 2.6.0
Notice: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/28494727/
Eclipse plugin for Hadoop 2.6.0
Asked by user3202144
I want to write MapReduce jobs in Java. For that I have installed Hadoop 2.6.0 on an Ubuntu 14.04 LTS VM. The installation directory is /usr/local/hadoop.
Now, according to many tutorials, you are supposed to find an Eclipse plugin in the /contrib directory and paste it into the /Eclipse/plugins folder.
The problem is that there is no src/contrib folder, nor any Eclipse plugin bundled with Hadoop 2.6.0. So how do I configure Eclipse Europa to run Hadoop MapReduce jobs? If that's not possible, what are the alternatives for writing MapReduce jobs?
Answered by sabari
Integrating Hadoop 2.6.0 with Eclipse
- The user "hdfs" is the user under which all Hadoop processes run.
- Hadoop is installed in the directory /opt/hadoop.
- Eclipse is installed in the directory /opt/eclipse.
Step 1: Download the hadoop-eclipse-plugin-2.6.0 jar.
Step 2: Copy the MapReduce plugin into the plugins directory of your Eclipse installation:

sudo cp /home/hdfs/Downloads/hadoop-eclipse-plugin-2.6.0.jar /opt/eclipse/plugins/

Restart Eclipse using the command:

/opt/eclipse/eclipse -vm /usr/local/jdk1.8.0_05/bin/java -vmargs -Xmx1024m

If Eclipse does not come up because of an X11 forwarding issue, try using "sux" instead of "su" when switching to the "hdfs" user.

Step 3: Start Eclipse:

$ECLIPSE_HOME/eclipse
Step 4: In the Eclipse menu, click Window --> Open Perspective --> Other --> Map/Reduce.
Step 5: In the Map/Reduce Locations view at the bottom, click the icon to add a new Hadoop location.
Step 6: Enter the ports on which MapReduce and HDFS are running. As a reminder, the MapReduce port (9001) is specified in $HADOOP_HOME/conf/mapred-site.xml, and the HDFS port (9000) is specified in $HADOOP_HOME/conf/core-site.xml. Then enter the Hadoop user name.
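As a sketch, the two entries those ports come from would look like the following. The property names shown are the older Hadoop 1.x style that the plugin's location dialog mirrors (Hadoop 2.x itself uses fs.defaultFS in core-site.xml); localhost is an assumption for a single-node setup.

```xml
<!-- $HADOOP_HOME/conf/core-site.xml : HDFS port (9000) -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- $HADOOP_HOME/conf/mapred-site.xml : MapReduce port (9001) -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```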
Step 7: Once the Hadoop location has been added, DFS Locations will be displayed in the Eclipse Project Explorer window (Window --> Show View --> Project Explorer).
Step 8: Once Hadoop is added, DFS Locations will be displayed in the Project Explorer window.
Step 9: Right-click the DFS location and click Connect.
Step 10: Once connected successfully, it will display all the DFS folders.
Step 11: You can create directories, upload files to an HDFS location, or download files to your local machine by right-clicking any of the listed directories.
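On the question's last point, alternatives for writing MapReduce jobs: one option that avoids the Eclipse plugin entirely is Hadoop Streaming, which runs a mapper and a reducer as ordinary scripts over stdin/stdout. Below is an illustrative word-count sketch only; the helper names map_line and reduce_pairs are hypothetical, and in a real Streaming job the two phases would live in separate mapper/reducer scripts.

```python
def map_line(line):
    """Map phase: emit a (word, 1) pair for every word on one input line."""
    return [(word, 1) for word in line.split()]

def reduce_pairs(pairs):
    """Reduce phase: sum the counts for each word."""
    counts = {}
    for word, n in pairs:
        counts[word] = counts.get(word, 0) + n
    return counts

# Under "hadoop jar .../hadoop-streaming*.jar -mapper ... -reducer ...",
# the mapper would read sys.stdin and print "word\t1" lines; here the two
# phases are driven directly on a small in-memory sample for illustration.
pairs = []
for line in ["eclipse plugin for hadoop", "hadoop mapreduce"]:
    pairs.extend(map_line(line))
print(reduce_pairs(pairs))
```

Because Streaming only cares about text on stdin/stdout, the same approach works for any language installed on the cluster nodes.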
Answered by mosa
Download and build this project: https://github.com/winghc/hadoop2x-eclipse-plugin
After downloading, follow these steps:

$ cd src/contrib/eclipse-plugin
Assume the Hadoop installation directory is /usr/share/hadoop:

$ ant jar -Dversion=2.6.0 -Dhadoop.version=2.6.0 -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop
The final jar will be generated at:

${hadoop2x-eclipse-plugin}/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.6.0.jar