Setting up Hive on 32-bit Windows


Note: if the setup fails at the very end, it means the metastore was never switched from Derby to MySQL and is still using Derby; the settings that do the switch are specifically called out in the configuration section below.

Preparation. (Hive will actually run without any changes at all: by default its metadata is stored in the embedded Derby database. Since few people are familiar with Derby, we switch the metastore to MySQL, and we also change where data and logs are kept, which is why we need to do some configuration of our own.)

Download apache-hive-2.1.1-bin.tar.gz (from the Apache archive page Index of /dist/hive).

Install MySQL.

Install Hadoop (see the companion guide, a simple and clear Hadoop setup on 32-bit Windows).
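Before going on, it is worth confirming from a cmd prompt that the prerequisites actually respond; a minimal check, assuming the Hadoop and MySQL bin directories are already on PATH:

java -version
hadoop version
mysql --version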

  1. Extract apache-hive-2.1.1-bin.tar.gz.
  2. Open the environment variable settings (This PC / Properties / Advanced system settings / Environment Variables).
  3. Add a system variable HIVE_HOME = the install directory (e.g. D:\学习视频\apache-hive-2.1.1-bin), and append D:\学习视频\apache-hive-2.1.1-bin\bin to PATH.
  4. D:\学习视频\apache-hive-2.1.1-bin\conf contains 4 template configuration files: hive-default.xml.template, hive-env.sh.template, hive-exec-log4j.properties.template and hive-log4j.properties.template. Copy-paste each one (so the originals are kept as a safety net), remove the .template suffix from the copies, and rename hive-default.xml to hive-site.xml.
  5. In the Hive install directory create a hive folder, and under it create scratch_dir, resources_dir, querylog_dir and operation_logs_dir; then create the corresponding directories on HDFS (step 6; a consolidated command sketch follows the step).

6.1 In cmd, switch to drive D: d:

6.2 cd D:\Hadoop-2.6.2\bin

6.3 Run, one at a time: hadoop fs -mkdir /user; hadoop fs -mkdir /user/hive; hadoop fs -mkdir /user/hive/warehouse; hadoop fs -mkdir /tmp; hadoop fs -mkdir /tmp/hive
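The folders from steps 5 and 6 can also be created in one short cmd session. A minimal sketch, assuming the paths used above and a running HDFS; the -p flag and the final chmod are small additions of mine (-p creates missing parents, and 733 matches the scratch-dir permission described in hive-site.xml below):

rem local working folders inside the Hive install directory (step 5)
cd /d D:\学习视频\apache-hive-2.1.1-bin
mkdir hive\scratch_dir hive\resources_dir hive\querylog_dir hive\operation_logs_dir

rem HDFS directories (step 6)
cd /d D:\Hadoop-2.6.2\bin
hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -mkdir -p /tmp/hive
rem make the HDFS scratch dir writable for Hive jobs
hadoop fs -chmod -R 733 /tmp/hive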

7. Configure D:\学习视频\apache-hive-2.1.1-bin\conf\hive-site.xml. Change the following properties (the descriptions are the ones already in the file):

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>

<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive</value>
  <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/ is created, with ${hive.scratch.dir.permission}.</description>
</property>

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>D:\学习视频\apache-hive-2.1.1-bin\hive\scratch_dir</value>
  <description>Local scratch space for Hive jobs</description>
</property>

<property>
  <name>hive.downloaded.resources.dir</name>
  <value>D:\学习视频\apache-hive-2.1.1-bin\hive\resources_dir</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>

<property>
  <name>hive.querylog.location</name>
  <value>D:\学习视频\apache-hive-2.1.1-bin\hive\querylog_dir</value>
  <description>Location of Hive run time structured log file</description>
</property>

<property>
  <name>hive.server2.logging.operation.log.location</name>
  <value>D:\学习视频\apache-hive-2.1.1-bin\hive\operation_logs_dir</value>
  <description>Top level directory where operation logs are stored if logging functionality is enabled</description>
</property>

Then append the following at the end of the file (this is the part that switches the metastore from Derby to MySQL, as noted at the top):

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?characterEncoding=UTF-8</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
</property>

<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>

<property>
  <name>datanucleus.autoCreateTables</name>
  <value>true</value>
</property>

<property>
  <name>datanucleus.autoCreateColumns</name>
  <value>true</value>
</property>

If you are not sure how to add these, the full hive-site.xml is attached for reference.

8. Edit D:\学习视频\apache-hive-2.1.1-bin\conf\hive-env.sh:

# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=D:\hadoop-2.6.2

# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=D:\学习视频\apache-hive-2.1.1-bin\conf

# Folder containing extra libraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=D:\学习视频\apache-hive-2.1.1-bin\lib

The MySQL JDBC driver jar (mysql-connector-java) is attached; drop it into Hive's lib directory so that the com.mysql.jdbc.Driver class configured above can be found on the classpath.
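A sketch of that copy step from cmd, assuming the jar was saved to the Downloads folder and is version 5.1.40 (both the location and the version are placeholders; use whatever you actually downloaded):

copy "%USERPROFILE%\Downloads\mysql-connector-java-5.1.40-bin.jar" "D:\学习视频\apache-hive-2.1.1-bin\lib"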

9. MySQL setup

(1) Create the hive database: create database hive default character set latin1;

(2) Create an account for the metastore and reload privileges: grant all on hive.* to hive@'localhost' identified by 'hive';

flush privileges;

Note that Hive connects with the account from hive-site.xml (root / 123456 in the example above), so either change javax.jdo.option.ConnectionUserName / ConnectionPassword to hive / hive, or grant the privileges on hive.* to the account you configured there.
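To double-check that the account Hive will connect with really has the rights it needs (a sketch; substitute whichever user you settled on):

show grants for 'hive'@'localhost';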

10. Start the services
(1) Start Hadoop: start-all.cmd
(2) Start the metastore service: hive --service metastore
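If the metastore refuses to start and complains that the schema has not been initialized (the datanucleus.autoCreateSchema setting above is meant to handle this, but it does not always kick in on Hive 2.x), the schema can be created explicitly with Hive's schematool. A sketch, assuming the Windows launcher in this distribution exposes the schematool service; otherwise the MySQL schema script under scripts\metastore\upgrade\mysql can be run in MySQL by hand:

rem one-time initialization of the metastore schema in MySQL
hive --service schematool -dbType mysql -initSchema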

11. Check the MySQL database
use hive;

show tables;

If show tables now lists the Hive metastore tables (DBS, TBLS, SDS, COLUMNS_V2 and so on), the metastore was initialized successfully.


(3) Start the Hive CLI: hive


If Hive starts up successfully, the local-mode installation is complete.
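A quick smoke test inside the Hive CLI to confirm the metastore and the warehouse directory are working (the table name test is just a throwaway example):

show databases;
create table test (id int);
show tables;
drop table test;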
