Fixing the Spark error "System memory 259522560 must be at least 471859200"


Running Spark locally fails with the following error:

22/04/26 20:11:42 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
	at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:221)

The cause is that the heap the JVM was started with is too small, so the SparkContext cannot initialize. One workaround is to set `spark.testing.memory` to a larger value (in bytes), e.g. `.set("spark.testing.memory", "2147480000")`:

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("test")
  .set("spark.testing.memory", "2147480000")
val sc = new SparkContext(conf)
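As an aside, the threshold 471859200 in the error message is not arbitrary: Spark's `UnifiedMemoryManager` reserves 300 MB of system memory and requires the available memory to be at least 1.5 times that reserved amount. A minimal sketch of the arithmetic (plain Scala, no Spark dependency):

```scala
// Spark reserves 300 MB (RESERVED_SYSTEM_MEMORY_BYTES in UnifiedMemoryManager)
val reservedMemory: Long = 300L * 1024 * 1024          // 314572800 bytes

// The minimum system memory is 1.5x the reserved amount
val minSystemMemory: Long = (reservedMemory * 1.5).toLong

println(minSystemMemory)                                // 471859200
```

So any driver heap that leaves Spark with less than ~450 MB triggers this error; the error message itself also suggests the supported fix of raising `--driver-memory` (or `spark.driver.memory`) instead of using the testing property.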

Feel free to share; when reposting, please credit the source: outofmemory.cn

Original article: http://outofmemory.cn/langs/790373.html
