【Spark】Submitting Spark jobs via Livy2

1. Submitting a built-in Spark example to Livy2
[root@ambari1 Test]# curl -X POST --data '{"file": "/home/Test/spark-examples_2.11-2.2.0.2.6.3.0-235.jar", "className": "org.apache.spark.examples.SparkPi","args":["100"]}' -H "Content-Type: application/json" 192.168.xxx.xx3:8999/batches
{"id":350,"state":"starting","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["stdout: ","nstderr: ","nYARN Diagnostics: "]}
Note:
    The "file" value is the path to the jar on HDFS;
    The Livy2 endpoint is 192.168.xxx.xx3:8999;
    The "args" value is the list of arguments passed to the main class.
2. Submitting with resource parameters
curl -H "Content-Type: application/json" 192.168.xxx.xx3:8999/batches \
 -X POST --data '{
  "name":"Livy2Test",
  "className":"org.apache.spark.examples.SparkPi",
  "file":"/home/Test/spark-examples_2.11-2.2.0.2.6.3.0-235.jar",
  "driverMemory" : "5G",
  "executorMemory" : "10G",
  "executorCores" : 5,
  "numExecutors" : 10,
  "queue" : "default"
  }'
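Spark properties that have no dedicated field in the batch request can be passed through the "conf" map, which Livy forwards to Spark as configuration. A hedged sketch; the job name and the two properties shown here are only illustrative examples, not part of the original post:

curl -H "Content-Type: application/json" 192.168.xxx.xx3:8999/batches \
 -X POST --data '{
  "name":"Livy2ConfTest",
  "className":"org.apache.spark.examples.SparkPi",
  "file":"/home/Test/spark-examples_2.11-2.2.0.2.6.3.0-235.jar",
  "conf" : {
    "spark.serializer" : "org.apache.spark.serializer.KryoSerializer",
    "spark.yarn.maxAppAttempts" : "1"
  }
  }'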
3. Submitting scopt command-line arguments via Livy2

The example below shows how a job whose main class parses scopt-style command-line arguments is submitted through Livy2.

3.1 The original spark-submit command
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-cores 2 \
  --driver-memory 8g \
  --executor-cores 4 \
  --num-executors 10 \
  --executor-memory 8g \
  --name pronmae \
  --class com.gl.main.TestMain /home/Test/proAnalyse-1.0.0.jar \
  --day 20210705 \
  --hour 15 \
  --rhivetable table01 \
  --whivetable table02 \
  --csvPath csvpath \
  --preNHour 3 \
  --useDatabases default \
  --onlyDay 0
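For reference, the spark-submit flags above map onto Livy batch fields as follows (field names as defined by Livy's batch API):

# spark-submit flag           Livy batch JSON field
# --class                     "className"
# <application jar>           "file"
# --name                      "name"
# --driver-memory             "driverMemory"
# --driver-cores              "driverCores"
# --executor-memory           "executorMemory"
# --executor-cores            "executorCores"
# --num-executors             "numExecutors"
# --queue                     "queue"
# trailing program arguments  "args" (one array element per token)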
3.2 The equivalent Livy2 submission
curl -H "Content-Type: application/json" 192.168.xxx.xx3:8999/batches \
 -X POST --data '{
  "name":"LivyREST",
  "className":"com.gl.main.TestMain",
  "file":"/home/Test/proAnalyse-1.0.0.jar",
  "driverMemory" : "5G",
  "executorMemory" : "10G",
  "executorCores" : 5,
  "numExecutors" : 10,
  "queue" : "default",
  "args" : ["--day:20210705","--hour:15","--rhivetable:table01","--whivetable:table02","--csvPath:csvpath","--preNHour:3","--useDatabases:default","--onlyDay:0"]
  }'
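Each element of "args" reaches the main class as a single token, which is why key and value are joined with a colon here; if your argument parser expects the usual --key value form, split them into separate elements instead, e.g. "--day","20210705".

After submission, the batch can be polled until it reaches a terminal state, and killed if necessary. A minimal bash sketch against the same endpoints; the batch id 351 is hypothetical:

# Poll the batch state every 10 seconds until it finishes
while true; do
  state=$(curl -s 192.168.xxx.xx3:8999/batches/351/state)
  echo "$state"
  case "$state" in
    *success*|*dead*|*killed*) break ;;
  esac
  sleep 10
done

# Kill a batch that is still running
curl -X DELETE 192.168.xxx.xx3:8999/batches/351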
