```
Error: scalac: error while loading package, class file 'C:\Program Files (x86)\scala\lib\scala-library.jar(scala/collection/immutable/package.class)' is broken
(class java.lang.RuntimeException/error reading Scala signature of package.class: Scala signature package has wrong version
expected: 5.0
found: 5.2 in package.class)
```
Program code:

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

object ActionDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ActionDemo").setMaster("spark://192.168.137.15:7077")
    val context = new SparkContext(conf)
    val rdd1: RDD[Int] = context.parallelize(List(1, 2, 3, 4, 5, 6, 4))
    println(rdd1.collect().toList)      // collect all elements to the driver as a List
    println(rdd1.count())               // number of elements
    val rdd2: Array[Int] = rdd1.top(3)  // three largest elements
    println(rdd2.toBuffer)
    val rdd3: Array[Int] = rdd1.take(3) // first three elements
    println(rdd3.toBuffer)
    context.stop()
  }
}
```
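As an illustration of what these actions return, the same semantics can be checked locally with plain Scala collections, no cluster required. This is a sketch, not Spark code: it only mirrors the behavior of `collect`/`count`/`top`/`take` on the same data (`top` returns the largest elements in descending order, while `take` returns the first elements in order):

```scala
// Plain-collection sketch of the RDD actions above (assumption: no Spark on the
// classpath; this only mirrors the semantics, it does not use RDDs).
object ActionSemantics {
  def main(args: Array[String]): Unit = {
    val data = List(1, 2, 3, 4, 5, 6, 4)
    println(data)                                        // like collect().toList
    println(data.size)                                   // like count(): 7
    println(data.sorted(Ordering[Int].reverse).take(3))  // like top(3): List(6, 5, 4)
    println(data.take(3))                                // like take(3): List(1, 2, 3)
  }
}
```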
Cause: the Spark artifacts must be built for the same Scala version the project uses. Since the project's Scala version is 2.12.x (2.12.15 in the POM below), the spark-core dependency must be changed to the `_2.12` artifact, and the other Spark dependencies must be switched to their `_2.12` variants as well:
```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.2.1</version>
</dependency>
```
Full Maven configuration (`pom.xml` excerpt):
```xml
<properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <scala.version>2.12.15</scala.version>
    <spark.version>3.2.1</spark.version>
    <hadoop.version>2.7.1</hadoop.version>
    <mysql.version>8.0.13</mysql.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>${mysql.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive-thriftserver_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```
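After updating the POM, one way to confirm that every resolved Spark artifact actually carries the `_2.12` suffix is to inspect Maven's dependency tree. This is a sketch of the check, run from the project root:

```shell
# Show resolved org.apache.spark artifacts; each artifactId should end in _2.12
mvn dependency:tree -Dincludes=org.apache.spark
```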