
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/34727008/

Date: 2020-11-02 23:14:30 · Source: igfitidea

How to increase Java heap size in Hadoop

Tags: java, hive, yarn, hadoop2, bigdata

Asked by shaik mahammed

I am using Hadoop version 2.6.0 and trying to run a Hive insert into a table, where I got a Java heap space error.

Is there any way I can increase the heap size in Hadoop throughout the cluster?

Thanks in advance


Answered by m.aibin

For that, you can execute the following before running the hadoop command:

export HADOOP_HEAPSIZE=4096
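As a sketch of why this works (simplified from the Hadoop 2.x `bin/hadoop` launcher script; treat the details as an approximation, not the exact script):

```shell
# Simplified sketch of how the bin/hadoop launcher turns HADOOP_HEAPSIZE
# into a JVM flag for the processes it starts.
export HADOOP_HEAPSIZE=4096

JAVA_HEAP_MAX=-Xmx1000m                      # script default when the variable is unset
if [ "$HADOOP_HEAPSIZE" != "" ]; then
  JAVA_HEAP_MAX="-Xmx${HADOOP_HEAPSIZE}m"    # the value is interpreted as MB
fi
echo "$JAVA_HEAP_MAX"                        # prints -Xmx4096m
```

Note that the variable is a plain number of megabytes, not a `-Xmx` string.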

Alternatively, you can make the setting permanent by adding the following to your mapred-site.xml file, which lies in HADOOP_HOME/conf/:

<property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx4096m</value>
</property>
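Since the question targets Hadoop 2.6.0, it may be worth noting that `mapred.child.java.opts` is the legacy MRv1 name; in MRv2 the per-task options are split into separate map and reduce properties, which take precedence when set. A sketch of the equivalent newer configuration (the 4096 MB value mirrors the snippet above and is illustrative, not from the original answer):

```xml
<!-- mapred-site.xml: per-task JVM heap, MRv2 (Hadoop 2.x) property names -->
<property>
    <name>mapreduce.map.java.opts</name>
    <value>-Xmx4096m</value>
</property>
<property>
    <name>mapreduce.reduce.java.opts</name>
    <value>-Xmx4096m</value>
</property>
```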

See also: https://www.mapr.com/blog/how-to-avoid-java-heap-space-errors-understanding-and-managing-task-attempt-memory

Answered by Yusuf Hassan

There will be instances where any such export statements get overwritten.

Whatever value I assigned to such a variable from the command line, Hadoop kept picking up the old property defined in the environment file.
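The override behavior described above can be reproduced with plain shell (the file path and values below are made up for illustration; the temporary file stands in for hadoop-env.sh):

```shell
# Why a command-line export can be ineffective: Hadoop's launcher scripts
# source hadoop-env.sh, and an unconditional assignment there clobbers the
# earlier export. The file below simulates such a hadoop-env.sh.
export HADOOP_HEAPSIZE=4096
cat > /tmp/fake-hadoop-env.sh <<'EOF'
HADOOP_HEAPSIZE=1024
EOF
. /tmp/fake-hadoop-env.sh
echo "effective: ${HADOOP_HEAPSIZE}"      # prints "effective: 1024", not 4096

# Editing the file itself (as described below) is what actually sticks:
sed -i 's/^HADOOP_HEAPSIZE=.*/HADOOP_HEAPSIZE=4096/' /tmp/fake-hadoop-env.sh
. /tmp/fake-hadoop-env.sh
echo "after edit: ${HADOOP_HEAPSIZE}"     # prints "after edit: 4096"
```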

To make it work, I had to edit the statement export HADOOP_HEAPSIZE="[size in MB]" in the file hadoop-env.sh.

However, remember that this is akin to hardcoding, and its effect applies globally unless overridden.