Problem when running Spark locally:
22/04/26 20:11:42 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:221)
The cause: the JVM heap available to the driver (259522560 bytes, about 247 MB here) is below the minimum Spark requires, so the SparkContext cannot be initialized. The minimum of 471859200 bytes (450 MB) is 1.5 times the 300 MB that Spark reserves for internal use.
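The check that throws this error lives in UnifiedMemoryManager.getMaxMemory, which also explains why the fix below works: spark.testing.memory is an internal property meant for Spark's own tests, and when set it replaces the detected JVM heap size. A simplified paraphrase of the check (not the exact Spark source):

import org.apache.spark.SparkConf

// Simplified paraphrase of the check in UnifiedMemoryManager.getMaxMemory
def checkDriverMemory(conf: SparkConf): Unit = {
  // spark.testing.memory, if set, overrides the detected JVM max heap
  val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
  val reservedMemory = 300L * 1024 * 1024                  // 300 MB reserved for Spark internals
  val minSystemMemory = (reservedMemory * 1.5).ceil.toLong // 471859200 bytes = 450 MB
  if (systemMemory < minSystemMemory) {
    throw new IllegalArgumentException(
      s"System memory $systemMemory must be at least $minSystemMemory. " +
        "Please increase heap size using the --driver-memory option or spark.driver.memory.")
  }
}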
Solution: add .set("spark.testing.memory", "2147480000") (roughly 2 GB, specified in bytes) to the SparkConf:
import org.apache.spark.{SparkConf, SparkContext}
// 2147480000 bytes is just under 2 GiB, well above the 450 MB minimum
val conf = new SparkConf().setMaster("local[*]").setAppName("test").set("spark.testing.memory", "2147480000")
val sc = new SparkContext(conf)
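Alternatively, follow the error message's own suggestion and raise the driver heap itself instead of overriding the check. In local mode the driver JVM is already running by the time SparkConf is read, so spark.driver.memory must be supplied before launch: through spark-submit, spark-defaults.conf, or an IDE VM option such as -Xmx1g. A sketch assuming launch through spark-submit (the class and jar names are hypothetical):

spark-submit --master local[*] --driver-memory 1g --class Test target/test-app.jar

With the heap raised this way, the spark.testing.memory override can be dropped from the code.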