Package Conflict Notes from Upgrading to Spark 3.2


1. java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.JsonMappingException.<init>(Ljava/io/Closeable;Ljava/lang/String;)V

11:13:16.370 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - User class threw exception: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.JsonMappingException.<init>(Ljava/io/Closeable;Ljava/lang/String;)V
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.JsonMappingException.<init>(Ljava/io/Closeable;Ljava/lang/String;)V
	at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
	at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:46)
	at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:17)
	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:718)
	at org.apache.spark.util.JsonProtocol$.<init>(JsonProtocol.scala:62)
	at org.apache.spark.util.JsonProtocol$.<clinit>(JsonProtocol.scala)
	at org.apache.spark.scheduler.EventLoggingListener.initEventLog(EventLoggingListener.scala:89)
	at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:84)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:610)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
	at com.hiido.server.service.impl.SparkSqlJob.executing(SparkSqlJob.java:56)
	at com.hiido.server.service.impl.SparkSqlJob.main(SparkSqlJob.java:47)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:737)

Root cause: hive-exec-2.3.9.jar has com.fasterxml.jackson.databind compiled into it, built against Jackson 2.6.5, which does not have this constructor; JsonMappingException(Closeable, String) only appeared in Jackson 2.7:

    public JsonMappingException(Closeable processor, String msg) {
        super(msg);
        _processor = processor;
        if (processor instanceof JsonParser) {
            // 17-Aug-2015, tatu: Use of token location makes some sense from databinding,
            //   since actual parsing (current) location is typically only needed for low-level
            //   parsing exceptions.
            _location = ((JsonParser) processor).getTokenLocation();
        }
    }
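
Not part of the original fix, but a quick way to confirm which jar is actually supplying Jackson at runtime (a minimal sketch; JacksonProbe is just an illustrative name):

import com.fasterxml.jackson.databind.JsonMappingException;

import java.io.Closeable;

public class JacksonProbe {
  public static void main(String[] args) throws Exception {
    // Which jar did JsonMappingException come from? When the shaded classes in
    // hive-exec-2.3.9.jar win on the classpath, this prints the hive-exec jar
    // rather than a jackson-databind jar.
    System.out.println(JsonMappingException.class
        .getProtectionDomain().getCodeSource().getLocation());
    // The (Closeable, String) constructor exists only in Jackson >= 2.7;
    // against 2.6.x this line throws NoSuchMethodException.
    System.out.println(JsonMappingException.class
        .getConstructor(Closeable.class, String.class));
  }
}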

Fix:

Edit the Hive 2.3.9 pom, changing

    <jackson.version>2.6.5</jackson.version>

to

    <jackson.version>2.10.0</jackson.version>

then rebuild the ql module of Hive 2.3.9 (hive-exec-2.3.9.jar). Done.

2. java.lang.NoSuchFieldError: JAVA_9

Root cause: hive-exec-2.3.9.jar also has org.apache.commons.lang3 compiled into it, and the old bundled version has no JAVA_9 field; JavaVersion.JAVA_9 was only added in commons-lang3 3.5.
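
The missing field is the enum constant org.apache.commons.lang3.JavaVersion.JAVA_9, which Spark 3.x checks (via commons-lang3's SystemUtils) to detect Java 9+. A minimal probe that reproduces the error in isolation (a sketch; Lang3Probe is an illustrative name):

import org.apache.commons.lang3.JavaVersion;
import org.apache.commons.lang3.SystemUtils;

public class Lang3Probe {
  public static void main(String[] args) {
    // JavaVersion.JAVA_9 first appeared in commons-lang3 3.5; when the 3.1
    // classes shaded into hive-exec win on the classpath, merely referencing
    // the constant throws java.lang.NoSuchFieldError: JAVA_9.
    System.out.println(SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9));
    // Printing the class origin shows which jar actually supplied lang3.
    System.out.println(JavaVersion.class
        .getProtectionDomain().getCodeSource().getLocation());
  }
}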

Fix:

Edit the Hive 2.3.9 pom, changing

    <commons-lang3.version>3.1</commons-lang3.version>

to

    <commons-lang3.version>3.8.1</commons-lang3.version>

then rebuild the ql module of Hive 2.3.9 (hive-exec-2.3.9.jar). Done.

Alternatively, just add the following to the hive/ql module's pom to force the versions:

  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-annotations</artifactId>
        <version>2.10.0</version>
        <scope>compile</scope>
      </dependency>
      <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>2.10.0</version>
        <scope>compile</scope>
      </dependency>
      <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.10.0</version>
        <scope>compile</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>

3. Replacing Spark's Accumulator with AccumulatorV2

org.apache.spark.Accumulator has been deprecated since Spark 2.0.0, but the spark-client module in Hive 2.3.9 still depends on Spark 2.0.0. Replace the SparkCounter code as follows:
package org.apache.hive.spark.counter;

import java.io.Serializable;

import org.apache.spark.util.LongAccumulator;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkCounter implements Serializable {

  private String name;
  private String displayName;
  private LongAccumulator accumulator;

  // Values of accumulators can only be read on the SparkContext side. This field is used when
  // creating a snapshot to be sent to the RSC client.
  private long accumValue;

  public SparkCounter() {
    // For serialization.
  }

  private SparkCounter(
      String name,
      String displayName,
      long value) {
    this.name = name;
    this.displayName = displayName;
    this.accumValue = value;
  }

  public SparkCounter(
    String name,
    String displayName,
    String groupName,
    long initValue,
    JavaSparkContext sparkContext) {

    this.name = name;
    this.displayName = displayName;
    String accumulatorName = groupName + "_" + name;
    this.accumulator = sparkContext.sc().longAccumulator(accumulatorName);
    accumulator.add(initValue);
  }

  public long getValue() {
    if (accumulator != null) {
      return accumulator.value();
    } else {
      return accumValue;
    }
  }

  public void increment(long incr) {
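    // Only meaningful on the live driver-side counter: a snapshot() copy that
    // was shipped to the client has a null accumulator and cannot be incremented.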
    accumulator.add(incr);
  }

  public String getName() {
    return name;
  }

  public String getDisplayName() {
    return displayName;
  }

  public void setDisplayName(String displayName) {
    this.displayName = displayName;
  }

  SparkCounter snapshot() {
    return new SparkCounter(name, displayName, accumulator.value());
  }

}
AccumulatorV2

Accumulator was deprecated in Spark 2.0; from 2.0 on, use AccumulatorV2 (here via its built-in LongAccumulator implementation) instead.
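
For reference, a minimal local-mode sketch of the V2 API that the rewritten SparkCounter builds on (app and accumulator names here are illustrative):

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.util.LongAccumulator;

public class AccumulatorV2Demo {
  public static void main(String[] args) {
    JavaSparkContext jsc = new JavaSparkContext(
        new SparkConf().setAppName("acc-demo").setMaster("local[*]"));
    // sc().longAccumulator(name) registers a LongAccumulator, the AccumulatorV2
    // replacement for the old org.apache.spark.Accumulator.
    LongAccumulator acc = jsc.sc().longAccumulator("records");
    jsc.parallelize(Arrays.asList(1, 2, 3, 4)).foreach(x -> acc.add(1L));
    // As before, accumulator values are only readable on the driver.
    System.out.println(acc.value()); // prints 4
    jsc.stop();
  }
}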
