A Spark job uses input parameters to configure its run date flexibly, but a freshly built copy of the code fails with the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
    at scopt.OptionParser.<init>(options.scala:175)
    at com.common.RichOptionParser.<init>(RichOptionParser.scala:6)
    at com.task.utils.Parser$$anon$1.<init>(Parser.scala:16)
    at com.task.utils.Parser$.getOpt(Parser.scala:16)
    at com.task.test.AppInviteMergeSpark$.main(AppInviteMergeSpark.scala:95)
    at com.task.test.AppInviteMergeSpark.main(AppInviteMergeSpark.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 18 more

2. Root Cause
The job uses the scopt library to parse its input parameters, but the scopt dependency that was pulled in is the one built for Scala 2.11, while the code itself compiles against Scala 2.12. This binary-version mismatch produces the odd-looking error above: Scala 2.12 compiles trait methods to JVM default methods and no longer emits the `$class` trait-implementation classes (such as `scala.Product$class`) that a 2.11-built scopt jar references. Switching the dependency to the 2.12 artifact fixes it.
<dependency>
    <groupId>com.github.scopt</groupId>
    <artifactId>scopt_2.11</artifactId>
    <version>3.4.0</version>
</dependency>

Change it to:

<dependency>
    <groupId>com.github.scopt</groupId>
    <artifactId>scopt_2.12</artifactId>
    <version>3.5.0</version>
</dependency>
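As a side note for sbt users (the post itself shows a Maven POM, so this build.sbt fragment and the exact `scalaVersion` value are assumptions): sbt's `%%` operator appends the project's Scala binary version to the artifact name automatically, which prevents exactly this kind of suffix drift.

```scala
// build.sbt (sketch): with scalaVersion set to a 2.12.x release, the %%
// operator resolves "com.github.scopt" %% "scopt" to the scopt_2.12 artifact,
// so the dependency's Scala binary version can never diverge from the project's.
scalaVersion := "2.12.8"
libraryDependencies += "com.github.scopt" %% "scopt" % "3.5.0"
```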
Mysterious errors like this are, more often than not, caused by mismatched version numbers; checking the `_2.11`/`_2.12` suffix on every Scala dependency is a good first step.
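For context, the stack trace fails while instantiating `scopt.OptionParser`, i.e. the moment the parser class is loaded. A minimal sketch of the kind of scopt 3.x parser involved might look like the following (the option names and `Config` fields are illustrative assumptions; the project's real `Parser.scala` is not shown in the post):

```scala
import scopt.OptionParser

// Hypothetical config class: the real project's fields are not shown in
// the post, so runDate/dryRun are illustrative only.
case class Config(runDate: String = "", dryRun: Boolean = false)

object Parser {
  // Constructing OptionParser is exactly the call that fails at class-load
  // time when the scopt jar was built for a different Scala binary version.
  val parser = new OptionParser[Config]("my-spark-job") {
    opt[String]("runDate")
      .action((value, config) => config.copy(runDate = value))
      .text("date the job should process, e.g. 2020-01-01")
    opt[Unit]("dryRun")
      .action((_, config) => config.copy(dryRun = true))
      .text("parse arguments without running the job")
  }

  // Mirrors the getOpt call seen in the stack trace: returns None (after
  // printing usage) when the arguments do not parse.
  def getOpt(args: Array[String]): Option[Config] =
    parser.parse(args, Config())
}
```

With matching Scala binary versions, `Parser.getOpt(args)` simply returns `Some(config)` or `None`; with mismatched versions it never gets that far, because loading the parser class already throws `NoClassDefFoundError`.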