Hive: Error running child : java.lang.OutOfMemoryError: Java heap space when inserting data into a partitioned table


2022-01-09 09:52:46,117 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: Java heap space
at java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:57)
at java.nio.ByteBuffer.allocate(ByteBuffer.java:335)
at org.apache.hadoop.hive.ql.io.orc.OutStream.getNewInputBuffer(OutStream.java:107)
at org.apache.hadoop.hive.ql.io.orc.OutStream.spill(OutStream.java:223)
at org.apache.hadoop.hive.ql.io.orc.OutStream.flush(OutStream.java:239)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl$TreeWriter.writeStripe(WriterImpl.java:725)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl$IntegerTreeWriter.writeStripe(WriterImpl.java:937)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl$StructTreeWriter.writeStripe(WriterImpl.java:1611)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl.flushStripe(WriterImpl.java:1991)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl.close(WriterImpl.java:2283)
at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:106)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.closeWriters(FileSinkOperator.java:188)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:980)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:598)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:199)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

I tried many approaches found online and changed a number of configuration settings, none of which worked. I finally found the solution here:
https://blog.csdn.net/weixin_33724046/article/details/86130412
https://community.hortonworks.com/questions/37603/i-am-getting-outofmemory-while-inserting-the-data.html
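For context, the configuration changes usually suggested for this error are task-heap and ORC writer-buffer tuning. A rough sketch of what those attempts look like is below (property names are real Hadoop/Hive settings, but the values are illustrative, and in my case this kind of tuning did not resolve the problem):

```sql
-- Give each map task a larger container and JVM heap:
SET mapreduce.map.memory.mb=4096;
SET mapreduce.map.java.opts=-Xmx3686m;

-- Shrink the ORC writer's per-file memory footprint
-- (each open ORC writer buffers a full stripe in the heap):
SET hive.exec.orc.default.stripe.size=67108864;   -- 64 MB stripes
SET hive.exec.orc.default.buffer.size=131072;     -- 128 KB compression buffers
```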

My Hive table was stored in ORCFile format, and that format seems to hit a memory limit when writing to many partitions. Changing the table's storage format to text solved the problem.

create table log_text (
track_time string,
url string,
session_id string,
referer string,
ip string,
end_user_id string,
city_id string
)
stored as orc tblproperties ("orc.compress"="SNAPPY") ;

After the change:
create table log_text (
track_time string,
url string,
session_id string,
referer string,
ip string,
end_user_id string,
city_id string
)
row format delimited fields terminated by '\t'
stored as textfile ;
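To connect this back to the original error: the failing statement was a dynamic-partition insert. A minimal sketch of that pattern (the partitioned table `log_part` and the `dt` column are hypothetical, for illustration only) is:

```sql
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- With many distinct dt values, each map task holds one open writer
-- (and its buffers) per partition. With ORC, every writer buffers a
-- whole stripe in the heap, which is what exhausts the Java heap;
-- text writers buffer far less, which is why switching formats helped.
INSERT OVERWRITE TABLE log_part PARTITION (dt)
SELECT track_time, url, session_id, referer, ip, end_user_id, city_id,
       to_date(track_time) AS dt
FROM log_text;
```

Another workaround commonly cited for this situation is `SET hive.optimize.sort.dynamic.partition=true;`, which sorts rows by partition key so each task keeps only one writer open at a time; I have not verified it against this particular failure.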

Original article: https://outofmemory.cn/zaji/5706677.html