spark application java.lang.OutOfMemoryError: Direct buffer memory

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must likewise follow the CC BY-SA license and attribute it to the original authors (not me): StackOverflow. Original source: http://stackoverflow.com/questions/34922931/



Tags: java, apache-spark, out-of-memory

Asked by ssyue

I'm using the following runtime Spark configuration values:

spark-submit --executor-memory 8G --conf spark.yarn.executor.memoryOverhead=2048

but it still raises the following out-of-memory error:

I have a pairRDD with 8,362,269,460 lines split across 128 partitions. It raises this error on pairRDD.groupByKey.saveAsTextFile. Any clue?
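For context, here is a minimal Scala sketch of the failing pattern described above; the input path, the tab-separated key/value format, and the element types are illustrative assumptions, not details from the original post:

import org.apache.spark.rdd.RDD

// Intended for the Spark shell, where `sc` (the SparkContext) is predefined.
val pairRDD: RDD[(String, String)] = sc
  .textFile("hdfs:///data/input")           // assumed input location
  .map { line =>
    val Array(k, v) = line.split("\t", 2)   // assumed record format
    (k, v)
  }

// With ~8.4 billion records in only 128 partitions, groupByKey must hold
// every value of a key in memory at once, which stresses the direct-buffer
// shuffle fetch path that fails in the stack trace below.
pairRDD
  .groupByKey()
  .saveAsTextFile("hdfs:///data/output")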

Update: I added a filter, and the data is now 2,300,000,000 lines. Running in the Spark shell, there is no error. My cluster has 19 datanodes and 1 namenode:

             Min Resources: <memory:150000, vCores:150>
             Max Resources: <memory:300000, vCores:300>

Thanks for your help.


org.apache.spark.shuffle.FetchFailedException: java.lang.OutOfMemoryError: Direct buffer memory
  at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:321)
  at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:306)
  at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:51)
  at scala.collection.Iterator$$anon.next(Iterator.scala:328)
  at scala.collection.Iterator$$anon.hasNext(Iterator.scala:371)
  at scala.collection.Iterator$$anon.hasNext(Iterator.scala:327)
  at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
  at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
  at org.apache.spark.util.collection.ExternalAppendOnlyMap.insertAll(ExternalAppendOnlyMap.scala:132)
  at org.apache.spark.Aggregator.combineValuesByKey(Aggregator.scala:60)
  at org.apache.spark.shuffle.hash.HashShuffleReader.read(HashShuffleReader.scala:89)
  at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:90)
  at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
  at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
  at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
  at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
  at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:297)
  at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
  at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
  at org.apache.spark.scheduler.Task.run(Task.scala:88)
  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:745)
Caused by: io.netty.handler.codec.DecoderException:  Direct buffer memory
  at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:234)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
  at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
  at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
  at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
  at io.netty.util.concurrent.SingleThreadEventExecutor.run(SingleThreadEventExecutor.java:111)
  ... 1 more
Caused by: java.lang.OutOfMemoryError: Direct buffer memory
  at java.nio.Bits.reserveMemory(Bits.java:658)
  at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123)
  at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:306)
  at io.netty.buffer.PoolArena$DirectArena.newUnpooledChunk(PoolArena.java:651)
  at io.netty.buffer.PoolArena.allocateHuge(PoolArena.java:237)
  at io.netty.buffer.PoolArena.allocate(PoolArena.java:215)
  at io.netty.buffer.PoolArena.reallocate(PoolArena.java:358)
  at io.netty.buffer.PooledByteBuf.capacity(PooledByteBuf.java:121)
  at io.netty.buffer.AbstractByteBuf.ensureWritable(AbstractByteBuf.java:251)
  at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:849)
  at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:841)
  at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:831)
  at io.netty.handler.codec.ByteToMessageDecoder.cumulate(ByteToMessageDecoder.java:92)
  at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:228)
  ... 10 more

I'd like to know how to correctly configure the direct memory size. Best regards.

Answered by Marek-A-

I do not know any details about the Spark app, but I found the memory configuration here: you need to set -XX:MaxDirectMemorySize like any other JVM memory setting (passed via -XX:). Try using spark.executor.extraJavaOptions.

If you are using spark-submit, you can use:

./bin/spark-submit --name "My app" ...
  --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:MaxDirectMemorySize=512m" myApp.jar