[root@ambari1 Test]# curl -X POST --data '{"file": "/home/Test/spark-examples_2.11-2.2.0.2.6.3.0-235.jar", "className": "org.apache.spark.examples.SparkPi", "args": ["100"]}' -H "Content-Type: application/json" 192.168.xxx.xx3:8999/batches
{"id":350,"state":"starting","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":["stdout: ","\nstderr: ","\nYARN Diagnostics: "]}
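The response above returns the batch id (350 here) with an initial state of "starting". A minimal sketch of following the batch after submission, assuming the livy2 address from this article; the GET endpoints shown are part of the standard Livy REST API:

```shell
LIVY=192.168.xxx.xx3:8999   # livy2 server used in this article (placeholder IP)
BATCH_ID=350                # id returned by the POST above
# Livy exposes GET endpoints to follow a batch after submission:
#   curl http://$LIVY/batches/$BATCH_ID         # full batch session info
#   curl http://$LIVY/batches/$BATCH_ID/state   # just the id and state
#   curl http://$LIVY/batches/$BATCH_ID/log     # driver log lines
echo "http://$LIVY/batches/$BATCH_ID/state"
```

Polling the `/state` endpoint until it reports `success` or `dead` is the usual way to script around an asynchronous batch submission.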
Note: the value of "file" is a file under the HDFS directory; the livy2 access address is 192.168.xxx.xx3:8999; the value of "args" is the list of arguments to pass in.

2. Submission by setting environment parameters
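The resource fields in this request line up with familiar spark-submit flags. A hedged sketch that builds the same payload (values taken from the article) and sanity-checks the JSON before POSTing it:

```shell
# Rough correspondence between spark-submit flags and Livy JSON fields
# (values below are the ones used in this article):
#   --driver-memory 5g    -> "driverMemory": "5G"
#   --executor-memory 10g -> "executorMemory": "10G"
#   --executor-cores 5    -> "executorCores": 5
#   --num-executors 10    -> "numExecutors": 10
#   --queue default       -> "queue": "default"
PAYLOAD='{"name":"Livy2Test","className":"org.apache.spark.examples.SparkPi","file":"/home/Test/spark-examples_2.11-2.2.0.2.6.3.0-235.jar","driverMemory":"5G","executorMemory":"10G","executorCores":5,"numExecutors":10,"queue":"default"}'
# Sanity-check the JSON locally before POSTing it (python3 assumed available):
printf '%s' "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload OK"
# curl -H "Content-Type: application/json" -X POST --data "$PAYLOAD" 192.168.xxx.xx3:8999/batches
```

Validating the payload first avoids the opaque 400 responses Livy returns when the JSON is malformed.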
curl -H "Content-Type: application/json" 192.168.xxx.xx3:8999/batches -X POST --data '{ "name":"Livy2Test", "className":"org.apache.spark.examples.SparkPi", "file":"/home/Test/spark-examples_2.11-2.2.0.2.6.3.0-235.jar", "driverMemory" : "5G", "executorMemory" : "10G", "executorCores" : 5, "numExecutors" : 10, "queue" : "default" }'

3. Submission with scopt command-line parameters passed to livy2
Reference for submitting scopt command-line parameters:
1. Original spark-submit command

spark-submit --master yarn --deploy-mode cluster --driver-cores 2 --driver-memory 8g --executor-cores 4 --num-executors 10 --executor-memory 8g --name pronmae --class com.gl.main.TestMain /home/Test/proAnalyse-1.0.0.jar --day 20210705 --hour 15 --rhivetable table01 --whivetable table02 --csvPath csvpath --preNHour 3 --useDatabases default --onlyDay 0

2. Command submitted to livy2
curl -H "Content-Type: application/json" 192.168.xxx.xx3:8999/batches -X POST --data '{ "name":"LivyREST", "className":"com.gl.main.TestMain", "file":"/home/Test/proAnalyse-1.0.0.jar", "driverMemory" : "5G", "executorMemory" : "10G", "executorCores" : 5, "numExecutors" : 10, "queue" : "default", "args" : ["--day:20210705","--hour:15","--rhivetable:table01","--whivetable:table02","--csvPath:csvpath","--preNHour:3","--useDatabases:default","--onlyDay:0"] }'
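The "args" array above joins each scopt flag to its value with a colon instead of the space used on the spark-submit command line. A small helper sketch of that transformation (the function name `to_livy_args` is hypothetical, not part of Livy):

```shell
# Convert spark-submit style "--flag value" pairs into the colon-joined
# form this article passes in Livy's "args" array.
to_livy_args() {
  printf '%s\n' "$1" | sed -E 's/(--[A-Za-z]+) /\1:/g'
}

to_livy_args "--day 20210705 --hour 15 --preNHour 3 --onlyDay 0"
```

Each converted token then becomes one element of the JSON "args" array, and the application's scopt parser must be written to split on the colon accordingly.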
Feel free to share; when reposting, please credit the source: 内存溢出.