java hadoop classpath

Notice: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute the original authors (not the translator). Original question: http://stackoverflow.com/questions/14492243/


hadoop class path

Tags: java, hadoop

Asked by user1982993

I am trying to run some unit tests for a mapper and reducer using JUnit and Mockito.
Do I have to specify the location of the hadoop-core and commons JAR files on the classpath every time I run a test?
I thought running the "hadoop" command was supposed to include all the required libraries automatically at runtime. Is there any way to avoid typing the Hadoop dependencies every time?


hadoop -cp /home/xxx/Downloads/mockito-all-1.9.5.jar:/home/xxx/Downloads/junit-4.10.jar:/home/xxx/Downloads/hadoop-1.1.1/hadoop-core-1.1.1.jar:./classes:.:/home/xxx/Downloads/hadoop-1.1.1/lib/commons-logging-1.1.1.jar org.junit.runner.JUnitCore MaxTemperatureMapperTest 
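For comparison, here is a minimal sketch of the same test run that avoids listing the Hadoop jars by hand. It assumes the jar locations and ./classes directory from the question, and that your Hadoop version provides the "hadoop classpath" subcommand, which prints the classpath the hadoop script itself builds (hadoop-core plus the jars under lib/):

# Sketch only: jar paths are the ones from the question; adjust to your setup.
TEST_LIBS=/home/xxx/Downloads/mockito-all-1.9.5.jar:/home/xxx/Downloads/junit-4.10.jar
# Let "hadoop classpath" supply the Hadoop jars, then append only the test jars.
java -cp "$(hadoop classpath):$TEST_LIBS:./classes:." org.junit.runner.JUnitCore MaxTemperatureMapperTest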

Answered by user1687035

You can specify the classpath in hadoop-env.sh, e.g. "export HADOOP_CLASSPATH=*". The next time you run hadoop, those classpath entries will be added automatically.
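As a rough sketch, assuming the jar locations from the question, the relevant line in conf/hadoop-env.sh might look like this (appending to any existing value instead of overwriting it):

# conf/hadoop-env.sh -- sketch only; adjust the jar paths to your setup
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/home/xxx/Downloads/mockito-all-1.9.5.jar:/home/xxx/Downloads/junit-4.10.jar:./classes"

After that, "hadoop org.junit.runner.JUnitCore MaxTemperatureMapperTest" should find both the Hadoop jars (which the hadoop script adds on its own) and the test jars, without them being typed on the command line.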
