86. Submitting Jobs to a CDH Cluster via Livy's RESTful API

86.1 Demo Environment
  • Livy version: 0.4
  • CM and CDH version: 5.13.1
  • Kerberos is not enabled on the cluster
86.2 Walkthrough
  • Upload the jar for the job to an HDFS directory
  • Create a Livy sample project with Maven
  • Add the following dependency to the pom file

<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.4</version>
</dependency>

1. Write the sample code

  • HTTP request utility class (HttpUtils.java)
package com.cloudera.utils;
import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.client.methods.HttpDelete;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;
import java.io.IOException;
import java.util.Map;

public class HttpUtils {
    
    // Send an HTTP GET request and return the response body
    public static String getAccess(String url, Map<String, String> headers) {
        String result = null;
        CloseableHttpClient httpClient = HttpClients.createDefault();
        HttpGet httpGet = new HttpGet(url);
        if(headers != null && headers.size() > 0){
            headers.forEach((K,V)->httpGet.addHeader(K,V));
        }
        try {
            HttpResponse response = httpClient.execute(httpGet);
            HttpEntity entity = response.getEntity();
            result = EntityUtils.toString(entity);
            System.out.println(result);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return result;
    }
    
    // Send an HTTP DELETE request and return the response body
    public static String deleteAccess(String url, Map<String, String> headers) {
        String result = null;
        CloseableHttpClient httpClient = HttpClients.createDefault();
        HttpDelete httpDelete = new HttpDelete(url);
        if(headers != null && headers.size() > 0){
            headers.forEach((K,V)->httpDelete.addHeader(K,V));
        }
        try {
            HttpResponse response = httpClient.execute(httpDelete);
            HttpEntity entity = response.getEntity();
            result = EntityUtils.toString(entity);
            System.out.println(result);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return result;
    }
    
    // Send an HTTP POST request with a JSON body and return the response body
    public static String postAccess(String url, Map<String, String> headers, String data) {
        String result = null;
        CloseableHttpClient httpClient = HttpClients.createDefault();
        HttpPost post = new HttpPost(url);
        if(headers != null && headers.size() > 0){
            headers.forEach((K,V)->post.addHeader(K,V));
        }
        try {
            StringEntity entity = new StringEntity(data);
            entity.setContentEncoding("UTF-8");
            entity.setContentType("application/json");
            post.setEntity(entity);
            HttpResponse response = httpClient.execute(post);
            HttpEntity resultEntity = response.getEntity();
            result = EntityUtils.toString(resultEntity);
            System.out.println(result);
            return result;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return result;
    }
}
  • Sample code calling the Livy RESTful API
package com.cloudera.nokerberos;
import com.cloudera.utils.HttpUtils;
import java.util.HashMap;

public class AppLivy {
    private static String LIVY_HOST = "http://ip-186-31-7-186.fayson.com:8998";
    public static void main(String[] args) {
        HashMap<String, String> headers = new HashMap<>();
        headers.put("Content-Type", "application/json");
        headers.put("Accept", "application/json");
        headers.put("X-Requested-By", "fayson");
        // Create an interactive session
//        String kindJson = "{\"kind\": \"spark\", \"proxyUser\": \"fayson\"}";
//        HttpUtils.postAccess(LIVY_HOST + "/sessions", headers, kindJson);
        // Run a code snippet in the session
//        String code = "{\"code\": \"sc.parallelize(1 to 2).count()\"}";
//        HttpUtils.postAccess(LIVY_HOST + "/sessions/1/statements", headers, code);
        // Delete a session
//        HttpUtils.deleteAccess(LIVY_HOST + "/sessions/2", headers);
        // Build the JSON payload describing the Spark job to submit
        String submitJob = "{\"className\": \"org.apache.spark.examples.SparkPi\",\"executorMemory\": \"1g\",\"args\": [200],\"file\": \"/fayson-yarn/jars/spark-examples-1.6.0-cdh5.13.1-hadoop2.6.0-cdh5.13.1.jar\", \"proxyUser\": \"fayson\"}";
        // Submit the Spark job to the cluster
        HttpUtils.postAccess(LIVY_HOST + "/batches", headers, submitJob);
        // Use the id returned by the submission to fetch the job's state and appId
        HttpUtils.getAccess(LIVY_HOST + "/batches/3", headers);
    }
}
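Hand-escaping JSON inside Java string literals, as in `submitJob` above, is easy to get wrong. As a minimal sketch (the `LivyBatchPayload` class and its method names are illustrative, not part of the original code), the same batch payload could be assembled from plain values:

```java
import java.util.List;
import java.util.stream.Collectors;

public class LivyBatchPayload {

    // Quote and escape a single JSON string value.
    static String q(String s) {
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    // Build the JSON body for POST /batches from plain Java values.
    public static String build(String className, String file,
                               String executorMemory, String proxyUser,
                               List<Integer> args) {
        String argList = args.stream().map(String::valueOf)
                             .collect(Collectors.joining(", ", "[", "]"));
        return "{"
                + q("className") + ": " + q(className) + ","
                + q("executorMemory") + ": " + q(executorMemory) + ","
                + q("args") + ": " + argList + ","
                + q("file") + ": " + q(file) + ","
                + q("proxyUser") + ": " + q(proxyUser)
                + "}";
    }
}
```

In a real project, a JSON library such as Jackson or Gson would be a more robust choice than string concatenation.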

2. Run the code

  • Run AppLivy to submit the Spark job to the cluster; the response is:
{
  "id": 4,
  "state": "starting",
  "appId": null,
  "appInfo": {
    "driverLogUrl": null,
    "sparkUiUrl": null
  },
  "log": ["stdout: ", "\nstderr: ", "\nYARN Diagnostics: "]
}
  • To get the job's run state, pass the id returned in the previous step into a GET request against /batches/{id}
  • Response:
    • The returned result shows the job's appId
{
  "id": 4,
  "state": "success",
  "appId": "application_1518401384543_0008",
  "appInfo": {
    "driverLogUrl": null,
    "sparkUiUrl": "http://ip-186-31-6-148.fayson.com:8088/proxy/application_1518401384543_0008/"
  },
  "log": ["stdout: ", "WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/CDH-5.13.1-1.cdh5.13.1.p0.2/lib/spark) overrides detected (/opt/cloudera/parcels/CDH/lib/spark).", "WARNING: Running spark-class from user-defined location.", "\nstderr: ", "\nYARN Diagnostics: "]
}
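Since the first response comes back in the "starting" state, the status request has to be repeated until Livy reports a terminal state. A minimal sketch of extracting and checking the state field (the `LivyBatchState` helper and its regex approach are illustrative; a real JSON parser such as Jackson would be more robust):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LivyBatchState {

    private static final Pattern STATE =
            Pattern.compile("\"state\"\\s*:\\s*\"([^\"]+)\"");

    // Pull the "state" value out of a Livy batch-status JSON response.
    public static String extractState(String json) {
        Matcher m = STATE.matcher(json);
        return m.find() ? m.group(1) : null;
    }

    // A batch is finished once Livy reports a terminal state.
    public static boolean isTerminal(String state) {
        return "success".equals(state) || "dead".equals(state)
                || "killed".equals(state) || "error".equals(state);
    }
}
```

A polling loop would then call `HttpUtils.getAccess(LIVY_HOST + "/batches/" + id, headers)` every few seconds until `isTerminal(extractState(result))` returns true.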
  • Check the submitted job's status on the Livy web UI
    • Verify the job's result through CM and the YARN 8088 web UI

Feel free to share; when reposting, please credit the source: 内存溢出

Original address: http://outofmemory.cn/zaji/5687943.html
