Spark in Practice, Problem (1): is running beyond physical memory limits. Current usage: xx GB of xx GB physical memory


I. Background

A Spark job failed because one of its YARN containers exceeded its physical memory limit and was killed (OOM).

II. Problem

    

Application application_xxx_xxxx failed 2 times due to AM Container for appattempt_xxxx_xxxx_xxxx exited with exitCode: -104
Failing this attempt.Diagnostics: Container [pid=78835,containerID=container_e14_1611819047508_2623322_02_000003] is running beyond physical memory limits. Current usage: 6.6 GB of 6.6 GB physical memory used; 11.9 GB of 32.3 TB virtual memory used. Killing container.
Dump of the process-tree for container_e14_1611819047508_2623322_02_000003 :
|- PID PPID PGRPID SESSID CMD_NAME USER_MODE_TIME(MILLIS) SYSTEM_TIME(MILLIS) VMEM_USAGE(BYTES) RSSMEM_USAGE(PAGES) FULL_CMD_LINE
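
Exit code -104 is YARN's KILLED_EXCEEDED_PMEM status: the NodeManager's container monitor found the container's resident (physical) memory above its allocation and killed it. The process-tree dump that follows lists the virtual and physical memory used by every process running inside the container.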

III. Analysis

1. Where Spark OOMs occur: the Driver and the Executors (a short sketch of both patterns follows).

        Driver: mostly actions that pull results back to the driver, such as collect() and show().

        Executor: mostly caching RDDs and other memory-heavy work on the executors.
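
A minimal Scala sketch of the two patterns (the input path, output path, and sizes are hypothetical; this only illustrates where the memory pressure lands, not the original job):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("oom-demo").getOrCreate()
    val df = spark.read.parquet("/data/large_table")    // hypothetical input

    // Driver-side pressure: collect() materializes every row inside the driver JVM,
    // so a large result can blow past driver-memory.
    // val allRows = df.collect()
    val preview = df.limit(100).collect()               // bounded result: safe to bring back

    // Executor-side pressure: caching a large dataset pins blocks in executor storage
    // memory; if it does not fit, executors spill to disk or run out of memory.
    df.cache()
    println(df.count())                                 // action that populates the cache
    df.unpersist()                                      // release storage memory when done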

2. Check the Spark Web UI: the Executors and Storage tabs show per-executor memory use and how much data is cached, which tells you whether the pressure is on the driver or on the executors.

   

3. Memory overview

     See "Spark (一):Executor内存" (CSDN blog, 在前进的路上): Spark is an in-memory distributed compute engine, so a detailed understanding of Executor memory management makes it easier to resolve OOMs and to tune jobs. A Spark job starts two kinds of processes, the Driver and the Executors. The Driver process (1 GB of memory by default) may start locally or on a worker node in the cluster, depending on the deploy mode (client or cluster). Once started, the Driver requests resources and launches the configured number of Executors, each with its own allotment of memory and CPU cores. https://blog.csdn.net/congcong68/article/details/122274441
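
The 6.6 GB limit in the log is consistent with the usual YARN container sizing: container limit = requested heap (driver-memory or executor-memory) + memoryOverhead, where the overhead defaults to max(384 MB, 0.10 × heap). Assuming the failed AM/driver container was requested with 6 GB of heap (a plausible value given the numbers shown), that works out to 6 GB + 0.6 GB = 6.6 GB, matching the limit that was exceeded.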

4. Adjust the relevant parameters (an example submission follows this list):

       executor-memory
       num-executors
       driver-memory
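
An illustrative way to pass these to spark-submit (the values, main class, and jar name are placeholders, not recommendations; spark.driver.memoryOverhead and spark.executor.memoryOverhead set the off-heap share that is also counted against the container, known as spark.yarn.executor.memoryOverhead in older Spark releases):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --driver-memory 8g \
      --executor-memory 8g \
      --num-executors 20 \
      --conf spark.driver.memoryOverhead=1024 \
      --conf spark.executor.memoryOverhead=1024 \
      --class com.example.MyJob \
      my-job.jar

Raising the heap alone only moves the same container limit higher; when the OOM comes from off-heap or native memory, it is usually the overhead setting that needs to grow along with (or instead of) the heap.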

 

    
