The error log from the failed execution:
Failed to monitor Job[-1] with exception 'java.lang.IllegalStateException(Connection to remote Spark driver was lost)' Last known state = SENT
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Unable to send message SyncJobRequest{job=org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobStatus$GetAppIDJob@7805478c} because the Remote Spark Driver - HiveServer2 connection has been closed.
This HiveServer2-side message only says the connection to the remote Spark driver was lost; the real cause has to be found in the application's detailed logs on YARN:
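A minimal sketch of pulling those logs with the YARN CLI. The application ID here is a hypothetical placeholder; find the real one in the YARN ResourceManager web UI or via `yarn application -list`:

```shell
# List recently finished applications to find the failed Hive-on-Spark job
yarn application -list -appStates FAILED,KILLED

# Fetch the aggregated container logs for that application
# (application_1500000000000_0001 is a placeholder ID)
yarn logs -applicationId application_1500000000000_0001 | less
```

In the container logs, look for errors such as `java.lang.OutOfMemoryError` or containers being killed for exceeding memory limits.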
It turned out that executor-memory was configured too small; it needs to be increased on the Hive configuration page.
After changing the value, save the configuration and restart the affected components, and the problem is resolved.
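For reference, the equivalent settings can also be applied per session in Hive on Spark. This is a sketch only; the 4g/512m values are illustrative assumptions and should be sized to your cluster:

```
-- In a Hive session (Hive on Spark), override executor memory for subsequent queries
set spark.executor.memory=4g;
-- Off-heap overhead per executor container on YARN (older Spark property name)
set spark.yarn.executor.memoryOverhead=512;
```

Session-level `set` is useful for confirming the fix before persisting it in the cluster-wide Hive configuration and restarting.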