OutOfMemoryError at org.apache.spark.memory.MemoryConsumer.allocatePage

Published 2024-04-20 06:44:44


I have a Python Spark job. It runs on a very small dataset (less than 8 KB), yet it fails with the following error:

2017-02-10 10:06:58,402 ERROR [stdout writer for python] util.Utils (Logging.scala:logError(95)) - Uncaught exception in thread stdout writer for python
java.lang.OutOfMemoryError: Unable to acquire 172 bytes of memory, got 0
    at org.apache.spark.memory.MemoryConsumer.allocatePage(MemoryConsumer.java:120)
    at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.acquireNewPageIfNecessary(UnsafeExternalSorter.java:321)
    at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.insertRecord(UnsafeExternalSorter.java:336)
    at org.apache.spark.sql.execution.UnsafeExternalRowSorter.insertRow(UnsafeExternalRowSorter.java:91)
    at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:168)
    at org.apache.spark.sql.execution.Sort$$anonfun$1.apply(Sort.scala:90)
    at org.apache.spark.sql.execution.Sort$$anonfun$1.apply(Sort.scala:64)

Any idea what is going on? I searched on Google but found nothing that helped. I did see a number of posts linking this error to a Spark memory leak, but those issues appear to have been fixed in Spark already (I am on Spark 1.6).
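For reference, the suggestions most commonly offered for this UnsafeExternalSorter allocation failure amount to giving each sorting task more execution memory: fewer shuffle partitions contending for the same executor, a larger executor heap, or tuning the unified memory fraction that Spark 1.6 introduced. A minimal sketch of such a launch, assuming a spark-submit deployment; the script name `my_job.py` and all values are illustrative, not taken from the original post:

```shell
# Hypothetical launch command; tune values to your cluster.
spark-submit \
  --conf spark.sql.shuffle.partitions=10 \
  --conf spark.executor.memory=2g \
  --conf spark.memory.fraction=0.75 \
  my_job.py
```

`spark.sql.shuffle.partitions` defaults to 200, which for a sub-8 KB dataset means many near-empty sort tasks splitting the execution-memory pool; lowering it is usually the first thing to try.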

Thanks for any help or suggestions.


Tags: org, sql, apache, util, stdout, java, sort, at