Hadoop Environment Setup (8): Installing Hive, Part 1


I. Hive Setup: Method One

This method requires a MySQL database installed on Windows (the host machine).

1. Installing Hive

1) Place the apache-hive-3.1.2-bin.tar.gz archive under /opt/software.

2) Change into that directory:

[root@hadoop100 ~]# cd /opt/software

3) Extract apache-hive-3.1.2-bin.tar.gz into /opt/module/:

[root@hadoop100 software]# tar -zxvf apache-hive-3.1.2-bin.tar.gz -C /opt/module/

4) Enter the conf/ directory of the extracted apache-hive-3.1.2-bin:

[root@hadoop100 module]# cd /opt/module/apache-hive-3.1.2-bin/conf/

5) Create and edit the hive-site.xml file with the following content.

Note: replace the JDBC connection IP (your VMnet8 address) and the database username/password with your own values.

[root@hadoop100 conf]# vi hive-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://192.168.1.5:3306/hive?useSSL=false&amp;characterEncoding=utf8&amp;serverTimezone=UTC</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.cj.jdbc.Driver</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>jyl010212..</value>
    </property>
    <property>
        <name>hive.metastore.schema.verification</name>
        <value>false</value>
    </property>
    <property>
        <name>hive.metastore.event.db.notification.api.auth</name>
        <value>false</value>
    </property>
    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://localhost:9083</value>
    </property>
    <property>
        <name>hive.server2.thrift.bind.host</name>
        <value>localhost</value>
    </property>
    <property>
        <name>hive.server2.thrift.port</name>
        <value>10000</value>
    </property>
    <property>
        <name>hive.cli.print.header</name>
        <value>true</value>
    </property>
    <property>
        <name>hive.cli.print.current.db</name>
        <value>true</value>
    </property>
</configuration>
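The JDBC URL in hive-site.xml points at a database named hive on the MySQL host. schematool (step 9 below) creates the metastore tables, but the database itself must already exist; a minimal sketch, assuming the host, user, and charset shown in the configuration (adjust to your own setup):

```shell
# Run against the MySQL instance at 192.168.1.5 (the Windows host in this guide).
# Creates the empty metastore database that the JDBC URL refers to.
mysql -h 192.168.1.5 -u root -p \
      -e "CREATE DATABASE IF NOT EXISTS hive DEFAULT CHARACTER SET utf8;"
```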

6) Place the MySQL JDBC driver jar (mysql-connector-java) under /opt/module/apache-hive-3.1.2-bin/lib.

 

7) Check that the MySQL jar is present in the lib directory:

[root@hadoop100 conf]# ls /opt/module/apache-hive-3.1.2-bin/lib/ | grep mysql

 

8) Move /opt/module/hadoop-3.1.3/share/hadoop/common/lib/guava-27.0-jre.jar into /opt/module/apache-hive-3.1.2-bin/lib (Hive 3.1.2 ships an older Guava that conflicts with Hadoop 3.1.3's):

[root@hadoop100 apache-hive-3.1.2-bin]# cd /opt/module/apache-hive-3.1.2-bin/

[root@hadoop100 apache-hive-3.1.2-bin]# mv /opt/module/hadoop-3.1.3/share/hadoop/common/lib/guava-27.0-jre.jar ./lib/

If mv prompts for confirmation before overwriting, just press Enter.

9) Initialize the metastore schema in MySQL:

[root@hadoop100 apache-hive-3.1.2-bin]# bin/schematool -dbType mysql -initSchema -verbose

10) Start the metastore and hiveserver2 services in the background (to start them from a script instead, see Section 3):

[root@hadoop100 apache-hive-3.1.2-bin]# nohup bin/hive --service metastore &

[root@hadoop100 apache-hive-3.1.2-bin]# nohup bin/hive --service hiveserver2 &

11) Start Hive:

[root@hadoop100 apache-hive-3.1.2-bin]# hive

If you see output like the following, Hive has started normally.

(The "which: no hbase" line only means HBase is not installed; it does not affect Hive and is not an error.)

which: no hbase in (/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/module/jdk1.8.0_212/bin:/opt/module/hadoop-3.1.3/bin:/opt/module/hadoop-3.1.3/sbin:/opt/module/hive/bin:/home/soft863/.local/bin:/home/soft863/bin)

Hive Session ID = 36f90830-2d91-469d-8823-9ee62b6d0c26

Logging initialized using configuration in jar:file:/opt/module/hive/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true

Hive Session ID = 14f96e4e-7009-4926-bb62-035be9178b02

hive>

2. Configure the Hive environment variables as follows:

1) Edit the profile file:

[root@hadoop100 apache-hive-3.1.2-bin]# vi /etc/profile

Add the following line:

HIVE_HOME=/opt/module/apache-hive-3.1.2-bin

Modify the PATH and export lines as follows:

PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin

export PATH JAVA_HOME HADOOP_HOME HIVE_HOME

2) Reload /etc/profile:

[root@hadoop100 apache-hive-3.1.2-bin]# source /etc/profile
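To confirm the profile change took effect, you can check that Hive's bin directory resolves on PATH; a small sketch that mirrors the /etc/profile edit above:

```shell
# Append HIVE_HOME/bin to PATH only if it is not already present,
# mirroring the /etc/profile edit above.
HIVE_HOME=/opt/module/apache-hive-3.1.2-bin
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "hive bin already on PATH" ;;
  *) PATH=$PATH:$HIVE_HOME/bin; echo "appended hive bin to PATH" ;;
esac
```

After sourcing the real profile, which hive should resolve to $HIVE_HOME/bin/hive.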

3. Writing a script to start metastore and hiveserver2 in the background

1) For convenience, you can use a script to manage starting and stopping the two services:

[soft863@hadoop102 hive]$ vim $HIVE_HOME/bin/hiveservices.sh

The content is as follows. You do not need to master how this script works; just use it as-is.

#!/bin/bash

HIVE_LOG_DIR=$HIVE_HOME/logs

if [ ! -d $HIVE_LOG_DIR ]

then

mkdir -p $HIVE_LOG_DIR

fi

# Check whether a process is running normally; arg 1 is the process name, arg 2 is its port

function check_process()

{

    pid=$(ps -ef 2>/dev/null | grep -v grep | grep -i $1 | awk '{print $2}')

    ppid=$(netstat -nltp 2>/dev/null | grep $2 | awk '{print $7}' | cut -d '/' -f 1)

    echo $pid

    [[ "$pid" =~ "$ppid" ]] && [ "$ppid" ] && return 0 || return 1

}

function hive_start()

{

    metapid=$(check_process Hivemetastore 9083)

    cmd="nohup hive --service metastore >$HIVE_LOG_DIR/metastore.log 2>&1 &"

    cmd=$cmd" sleep 4; hdfs dfsadmin -safemode wait >/dev/null 2>&1"

    [ -z "$metapid" ] && eval $cmd || echo "metastore service already running"

    server2pid=$(check_process HiveServer2 10000)

    cmd="nohup hive --service hiveserver2 >$HIVE_LOG_DIR/hiveServer2.log 2>&1 &"

    [ -z "$server2pid" ] && eval $cmd || echo "HiveServer2 service already running"

}

function hive_stop()

{

    metapid=$(check_process Hivemetastore 9083)

    [ "$metapid" ] && kill $metapid || echo "metastore service not running"

    server2pid=$(check_process HiveServer2 10000)

    [ "$server2pid" ] && kill $server2pid || echo "HiveServer2 service not running"

}

case $1 in

"start")

    hive_start

    ;;

"stop")

    hive_stop

    ;;

"restart")

    hive_stop

    sleep 2

    hive_start

    ;;

"status")

    check_process Hivemetastore 9083 >/dev/null && echo "metastore service is running normally" || echo "metastore service is NOT running"

    check_process HiveServer2 10000 >/dev/null && echo "HiveServer2 service is running normally" || echo "HiveServer2 service is NOT running"

    ;;

*)

    echo "Invalid Args!"

    echo 'Usage: '$(basename $0)' start|stop|restart|status'

    ;;

esac
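A note on the status test inside check_process: it succeeds only when the PID owning the port (from netstat) is non-empty and also appears in the PID list grepped from ps, i.e. the process exists and is actually listening. A toy sketch of the same test with made-up PID values:

```shell
# status "ps pid list" "port owner pid" -> prints running/stopped,
# using the same [[ pid =~ ppid ]] && [ ppid ] test as check_process.
status() {
    local pid="$1" ppid="$2"
    [[ "$pid" =~ "$ppid" ]] && [ "$ppid" ] && echo running || echo stopped
}
status "1234 5678" "5678"   # port owner is in the ps list -> running
status "1234 5678" ""       # nothing listening on the port -> stopped
```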

2) Add execute permission:

[soft863@hadoop102 hive]$ chmod +x $HIVE_HOME/bin/hiveservices.sh

3) Start the Hive background services:

[soft863@hadoop102 hive]$ hiveservices.sh start
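Beyond the script's status subcommand, an end-to-end check is to connect to hiveserver2 with beeline; the host and port match the hive-site.xml settings above (the login user is illustrative):

```shell
# Connect over JDBC and run a trivial query; success proves both
# hiveserver2 (port 10000) and the metastore behind it are up.
beeline -u jdbc:hive2://localhost:10000 -n root -e "show databases;"
```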

Feel free to share; please credit the source when reposting: 内存溢出

Original article: http://outofmemory.cn/zaji/5665413.html
