org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:376)
    at com.ck.data.batch.customer.CustomerDateCountJob$.main(CustomerDateCountJob.scala:26)
    at com.ck.data.batch.customer.CustomerDateCountJob.main(CustomerDateCountJob.scala)
2021-12-30 16:09:01,873 ERROR Utils - Uncaught exception in thread main
java.lang.NullPointerException
    at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$postApplicationEnd(SparkContext.scala:2591)
    at org.apache.spark.SparkContext$$anonfun$stop$1.apply$mcV$sp(SparkContext.scala:2099)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1531)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:2098)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:612)
    at com.ck.data.batch.customer.CustomerDateCountJob$.main(CustomerDateCountJob.scala:26)
    at com.ck.data.batch.customer.CustomerDateCountJob.main(CustomerDateCountJob.scala)
2021-12-30 16:09:01,881 INFO SparkContext - Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:376)
    at com.ck.data.batch.customer.CustomerDateCountJob$.main(CustomerDateCountJob.scala:26)
    at com.ck.data.batch.customer.CustomerDateCountJob.main(CustomerDateCountJob.scala)
The root cause is simply that the master URL was never set. Adding the following call fixes it:
.setMaster("local")
In context, the entry point becomes:

import org.apache.spark.{SparkConf, SparkContext}

object CustomerDateCountJob {
  def main(args: Array[String]): Unit = {
    // "local" runs Spark in-process, which is what we want while debugging in the IDE
    val sparkConf = new SparkConf().setAppName("name").setMaster("local")
    val sc = new SparkContext(sparkConf)
    process(sc)
    sc.stop()
  }
}
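One caveat worth noting (this sketch is my addition, not part of the original post): a master set in code via setMaster takes precedence over one passed on the command line, so a hard-coded "local" would also win over spark-submit --master on the cluster. An alternative is to fall back to local mode only when no master was supplied at all:

import org.apache.spark.SparkConf

val sparkConf = new SparkConf().setAppName("name")
// SparkConf.contains checks whether the key was already set, whether by code,
// by spark-submit, or by a -Dspark.master system property; only default to
// local mode otherwise. (Assumption: local[*], using all cores, is fine for debugging.)
if (!sparkConf.contains("spark.master")) {
  sparkConf.setMaster("local[*]")
}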
Since the job also has to be packaged and submitted to the cluster, the setMaster("local") call had to be deleted before every release and added back for the next round of local debugging. To avoid this churn, I decided to configure a global VM option in IDEA instead, so nothing needs to be added or removed each time.
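This works because a SparkConf created with its default constructor loads every JVM system property whose name starts with spark., so a single VM option replaces the setMaster call entirely (here local matches the code above; local[*] or any other master works just as well):

-Dspark.master=local

In current IDEA versions this is typically set under Run → Edit Configurations... → Edit configuration templates... → Application → VM options, so every new run configuration inherits it; the exact menu path varies by IDEA version.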
Once that is configured, delete the existing run configuration (the CustomerDateCountJob entry just above marker 2 in the screenshot; the name will differ for everyone) so that IDEA recreates it from the updated template.