Hive On Spark报错:Remote Spark Driver - HiveServer2 connection has been closed


The error log from the failed query:
Failed to monitor Job[-1] with exception 'java.lang.IllegalStateException(Connection to remote Spark driver was lost)' Last known state = SENT
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Unable to send message SyncJobRequest{job=org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobStatus$GetAppIDJob@7805478c} because the Remote Spark Driver - HiveServer2 connection has been closed.

To find the real cause of this problem, you need to check the application's detailed logs in YARN:
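One way to pull those logs is with the YARN CLI; a minimal sketch, assuming log aggregation is enabled and that the failed application's id has been looked up in the ResourceManager UI (the id below is a placeholder):

```shell
# <application_id> is a placeholder, e.g. application_XXXXXXXXXXXXX_XXXX
yarn logs -applicationId <application_id> | less
```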

It turned out the configured executor-memory was too small; it needs to be increased on the Hive configuration page.
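If the configuration UI is not at hand, the same setting can also be raised per session from Beeline or the Hive CLI; a minimal sketch with illustrative values that should be tuned to the cluster's actual capacity:

```sql
-- Illustrative values; adjust to what the cluster can actually allocate.
set spark.executor.memory=4g;
-- If the remote driver itself was killed for memory, raise this as well.
set spark.driver.memory=4g;
```

Session-level `set` is useful for verifying the fix before making it permanent in the cluster-wide configuration.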

After changing the value, save and restart the affected components; the problem is resolved.

Feel free to share; when republishing, please credit the source: 内存溢出 (outofmemory.cn)

Original article: http://outofmemory.cn/zaji/5716993.html
