Hive "return code 2" error

While using Dataphin, a pipeline task executed successfully, yet the subsequent run of the CDM-layer logical table failed. Normally, once the pipeline task has succeeded, backfilling the logical table should not run into any problems.
Symptoms:
1. Data was synchronized through an integration pipeline task, and the task succeeded without reporting any error. However, querying the whole table failed with a "Bad status for request TFetchResultsReq…" error, while queries that selected only some of the columns still worked. The log is as follows:

Bad status for request TFetchResultsReq(fetchType=0, operationHandle=TOperationHandle(hasResultSet=True, modifiedRowCount=None, operationType=0, operationId=THandleIdentifier(secret='x8exd4xc5rx8dLARx91zxeexe5zRx99xf9', guid='xe0xb3xfax00xa2OI$xa4)mx81@DLxfe')), orientation=4, maxRows=100): TFetchResultsResp(status=TStatus(errorCode=0, errorMessage='java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable', sqlState=None, infoMessages=['*org.apache.hive.service.cli.HiveSQLException:java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable:14:13', 'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:463', 'org.apache.hive.service.cli.operation.OperationManager:getOperationNextRowSet:OperationManager.java:294', 'org.apache.hive.service.cli.session.HiveSessionImpl:fetchResults:HiveSessionImpl.java:769', 'org.apache.hive.service.cli.CLIService:fetchResults:CLIService.java:462', 'org.apache.hive.service.cli.thrift.ThriftCLIService:FetchResults:ThriftCLIService.java:696', 'org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1553', 'org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1538', 'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39', 'org.apache.thrift.TbaseProcessor:process:TbaseProcessor.java:39', 'org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor:process:HadoopThriftAuthBridge.java:747', 'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286', 'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1149', 'java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:624', 'java.lang.Thread:run:Thread.java:748', '*java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable:16:2', 'org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:154', 'org.apache.hadoop.hive.ql.Driver:getResults:Driver.java:2071', 'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:458', '*org.apache.hadoop.hive.ql.metadata.HiveException:java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable:25:9', 'org.apache.hadoop.hive.ql.exec.ListSinkOperator:processOp:ListSinkOperator.java:90', 'org.apache.hadoop.hive.ql.exec.Operator:forward:Operator.java:815', 'org.apache.hadoop.hive.ql.exec.LimitOperator:processOp:LimitOperator.java:51', 'org.apache.hadoop.hive.ql.exec.Operator:forward:Operator.java:815', 'org.apache.hadoop.hive.ql.exec.SelectOperator:processOp:SelectOperator.java:84', 'org.apache.hadoop.hive.ql.exec.Operator:forward:Operator.java:815', 'org.apache.hadoop.hive.ql.exec.TableScanOperator:processOp:TableScanOperator.java:98', 'org.apache.hadoop.hive.ql.exec.FetchOperator:pushRow:FetchOperator.java:425', 'org.apache.hadoop.hive.ql.exec.FetchOperator:pushRow:FetchOperator.java:417', 'org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:140', '*java.lang.ClassCastException:org.apache.hadoop.io.LongWritable cannot be cast 
to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable:30:5',  'org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector:getPrimitiveJavaObject:WritableHiveDecimalObjectInspector.java:49', 'org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector:getPrimitiveJavaObject:WritableHiveDecimalObjectInspector.java:26', 'org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils:copyToStandardObject:ObjectInspectorUtils.java:336', 'org.apache.hadoop.hive.serde2.SerDeUtils:toThriftPayload:SerDeUtils.java:167', 'org.apache.hadoop.hive.ql.exec.FetchFormatter$ThriftFormatter:convert:FetchFormatter.java:61', 'org.apache.hadoop.hive.ql.exec.ListSinkOperator:processOp:ListSinkOperator.java:87'], statusCode=3), results=None, hasMoreRows=None)
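The ClassCastException in this trace indicates that the column type declared in the Hive table schema (a decimal, read through HiveDecimalWritable) does not match what is actually stored in the data files (a long, read as LongWritable); that is also why selecting only the unaffected columns still worked. Below is a minimal sketch of one way such a mismatch can arise, using made-up table and column names that are not taken from the log above:

```sql
-- demo_mismatch and amount are hypothetical names used for illustration.
-- The data files are written while the column is BIGINT; ALTER TABLE then
-- changes only the table metadata, not the existing ORC files, so a later
-- scan of the column can hit the cast error seen in the log.
CREATE TABLE demo_mismatch (amount BIGINT) STORED AS ORC;
INSERT INTO TABLE demo_mismatch VALUES (42);

-- Metadata-only change: the existing files still hold the value as a long.
ALTER TABLE demo_mismatch CHANGE COLUMN amount amount DECIMAL(10, 2);

-- On older Hive versions this may throw:
-- "LongWritable cannot be cast to HiveDecimalWritable"
SELECT amount FROM demo_mismatch;
```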

2. The Dataphin logical-table backfill failed with "return code 2":

Completed executing command(queryId=hive_20211129142222_d3bed0ae-57da-4373-9408-1a12788ddcd9); Time taken: 62.728 seconds
Task failed : java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:284)
	at com.alibaba.dt.oneservice.sdk.task.baseJdbcOsTask.runTask(baseJdbcOsTask.java:118)
	at com.alibaba.dt.oneservice.sdk.task.baseOsTask.run(baseOsTask.java:111)
	at com.alibaba.dt.oneservice.sdk.job.OsJob.run(OsJob.java:185)
	at com.alibaba.dt.oneservice.sdk.executor.baseOsExecutor.lambda$run(baseOsExecutor.java:42)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:627)
	at java.lang.Thread.run(Thread.java:882)
2021-11-29 14:23:57.686 Job failed

"Return code 2" from MapRedTask is Hive's generic signal that the underlying MapReduce job failed, so the real cause still had to be tracked down. After some digging, the source of the problem turned out to be the physical table at the ODS layer: it contained illegal date/time values, and those caused the backfill of the downstream CDM-layer logical table to fail. Rows like these can be surfaced with a query along the lines of the sketch below.
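A minimal diagnostic sketch, assuming a placeholder ODS table ods_demo with a date/time column event_time (neither name comes from the original post). unix_timestamp(str, pattern) returns NULL whenever the value cannot be parsed with the given pattern, so the rows returned here carry the illegal values:

```sql
-- ods_demo and event_time are hypothetical names used for illustration.
SELECT CAST(event_time AS STRING) AS raw_value,
       COUNT(*) AS cnt
FROM ods_demo
WHERE event_time IS NOT NULL
  AND unix_timestamp(CAST(event_time AS STRING), 'yyyy-MM-dd HH:mm:ss') IS NULL
GROUP BY CAST(event_time AS STRING);
```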

Solution: change the date column's type from timestamp to string, after which the job ran successfully.
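The schema change itself is a one-line DDL; a hedged sketch with the same placeholder names as above:

```sql
-- Relax the column from TIMESTAMP to STRING so that reads and the backfill
-- no longer fail on values that are not valid timestamps.
ALTER TABLE ods_demo CHANGE COLUMN event_time event_time STRING;
```

On ORC or Parquet tables the existing data files may also need to be rewritten (for example with INSERT OVERWRITE from a staging copy) before old rows read back cleanly; the illegal values can then be cleaned up or re-parsed at the CDM layer.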
