For the MySQL installation steps (covered in the earlier Hive setup post), see: https://blog.csdn.net/weixin_44178366/article/details/120559203?spm=1001.2014.3001.5501
After MySQL is installed, import the mock data into the newly created gmall database.
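If the gmall database does not exist yet, it can be created first. A minimal sketch (the character set is an assumption; follow the linked post for the exact settings):

mysql -uroot -p123456 -e "create database if not exists gmall default character set utf8"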
Generating data

# Edit application.properties; for the first run, set the reset flag to 1
[atguigu@hadoop102 db_log]$ pwd
/opt/module/db_log
[atguigu@hadoop102 db_log]$ ll
total 14780
-rw-r--r-- 1 atguigu atguigu     1601 Oct 19 23:08 application.properties
-rw-r--r-- 1 atguigu atguigu 15127607 Oct 19 23:08 gmall2020-mock-db-2021-01-22.jar
[atguigu@hadoop102 db_log]$
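For reference, the relevant keys in application.properties look roughly like this (the key names are taken from this version of the atguigu mock jar and should be treated as assumptions; check your own copy):

# business date for the generated data
mock.date=2020-06-14
# whether to wipe the business tables before generating (1 = reset; for later
# days, set this to 0 and advance mock.date instead)
mock.clear=1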
Write the data-generation script db.sh:
#!/bin/bash
for i in hadoop102
do
    ssh $i 'cd /opt/module/db_log; java -jar gmall2020-mock-db-2021-01-22.jar 1>/dev/null 2>&1 &'
done
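To use it (a minimal sketch, assuming db.sh sits in a directory on the PATH such as ~/bin):

[atguigu@hadoop102 ~]$ chmod +x db.sh
[atguigu@hadoop102 ~]$ db.sh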
Business data modeling

We can use EZDML, a database design tool, to help untangle the complex relationships among the business tables.
Download: http://www.ezdml.com/download_cn.html
Installing Sqoop

#1. Download and extract
1) Download: http://mirrors.hust.edu.cn/apache/sqoop/1.4.6/
2) Upload the package sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz to /opt/software on hadoop102
3) Extract the package to the target directory:
[atguigu@hadoop102 software]$ tar -zxf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C /opt/module/
4) Rename the extracted directory:
[atguigu@hadoop102 module]$ mv sqoop-1.4.6.bin__hadoop-2.0.4-alpha/ sqoop-1.4.6

#2. Edit the configuration
1) Go to /opt/module/sqoop-1.4.6/conf and rename the template:
[atguigu@hadoop102 conf]$ mv sqoop-env-template.sh sqoop-env.sh
2) Edit the file:
[atguigu@hadoop102 conf]$ vim sqoop-env.sh
Add the following:
export HADOOP_COMMON_HOME=/opt/module/hadoop-3.1.3
export HADOOP_MAPRED_HOME=/opt/module/hadoop-3.1.3
export HIVE_HOME=/opt/module/hive
export ZOOKEEPER_HOME=/opt/module/zookeeper-3.5.7
export ZOOCFGDIR=/opt/module/zookeeper-3.5.7/conf

# Add Sqoop's environment variables
#sqoop
export SQOOP_HOME=/opt/module/sqoop-1.4.6
export PATH=$PATH:$SQOOP_HOME/bin

#3. Copy the JDBC driver
1) Upload mysql-connector-java-5.1.48.jar to /opt/software
2) From /opt/software/, copy the JDBC driver into Sqoop's lib directory:
[atguigu@hadoop102 software]$ cp mysql-connector-java-5.1.48.jar /opt/module/sqoop-1.4.6/lib/

#4. Verify Sqoop
Run any command to verify that Sqoop is configured correctly:
[atguigu@hadoop102 sqoop]$ bin/sqoop help
Some warnings appear (omitted here), followed by the command list:
Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

#5. Test that Sqoop can connect to the database
[atguigu@hadoop102 sqoop]$ bin/sqoop list-databases --connect jdbc:mysql://hadoop102:3306/ --username root --password 123456
Expected output:
information_schema
metastore
mysql
oozie
performance_schema
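As an extra sanity check, sqoop eval can run a statement against the source database and print the result directly (a sketch, assuming the gmall database from above is populated):

[atguigu@hadoop102 sqoop]$ bin/sqoop eval --connect jdbc:mysql://hadoop102:3306/gmall --username root --password 123456 --query "select count(*) from user_info"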
Testing MySQL-to-HDFS import

Requirement 1: import the id, login_name, and nick_name columns of the user_info table into HDFS, restricted to rows with id in the range 100-200.
# select id, login_name, nick_name from user_info where id >= 100 and id <= 200
sqoop import \
--connect jdbc:mysql://hadoop102:3306/gmall \
--username root \
--password 123456 \
--table user_info \
--columns id,login_name,nick_name \
--where "id >= 100 and id <= 200" \
--target-dir /testsqoop \
--delete-target-dir \
--num-mappers 2 \
--fields-terminated-by "\t"
# The same import using --query. Note: $CONDITIONS is mandatory in the query; it is
# single-quoted here so the shell does not expand it (inside double quotes it would
# have to be escaped as \$CONDITIONS).
sqoop import \
--connect jdbc:mysql://hadoop102:3306/gmall \
--username root \
--password 123456 \
--query 'select id, login_name, nick_name from user_info where id >= 100 and id <= 200 and $CONDITIONS' \
--target-dir /testsqoop \
--delete-target-dir \
--num-mappers 2 \
--split-by id \
--fields-terminated-by "\t"
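A quick way to verify the result on HDFS (with --num-mappers 2 there should be two part files):

hadoop fs -ls /testsqoop
hadoop fs -cat /testsqoop/part-m-00000 | head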
Sqoop synchronization strategies

Synchronization strategies come in four types: full sync, incremental sync, new-and-changed sync, and special cases.
- Full table: stores the complete data.
- Incremental table: stores only newly added data.
- New-and-changed table: stores newly added data plus changed data.
- Special table: stored only once.
Full sync strategy: daily full load, i.e. store a complete copy of the data every day, one partition per day.
Incremental sync strategy: daily increment, i.e. store one day's worth of newly added data every day, one partition per day. Suitable for tables with large data volumes.
New-and-changed strategy: daily new and changed, i.e. store the rows whose create time or operate time falls on the current day. Suitable for tables with large data volumes that see both inserts and updates.
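In practice each strategy boils down to the WHERE clause of the import query. A minimal sketch (the tables follow the gmall schema used in the scripts below; the date is an arbitrary example):

do_date=2020-06-14
# full sync: take everything, every day
full_sql="select id, tm_name from base_trademark where 1=1"
# incremental sync: only rows whose timestamp is today
inc_sql="select id, order_id, order_status, operate_time from order_status_log where date_format(operate_time,'%Y-%m-%d')='$do_date'"
# new-and-changed sync: rows created OR modified today
new_chg_sql="select id, login_name from user_info where date_format(create_time,'%Y-%m-%d')='$do_date' or date_format(operate_time,'%Y-%m-%d')='$do_date'"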
Importing business data into HDFS

Choosing a synchronization strategy for each table
Certain special tables need not follow the strategies above. For example, unchanging real-world data (gender, regions, ethnicities, political affiliations, shoe sizes) can be stored just once.
Sync script walkthrough
In production, some small companies import every table in full to keep things simple. Medium and large companies, given their larger data volumes, strictly follow the per-table sync strategies.

First-day sync script: mysql_to_hdfs_init.sh
#! /bin/bash

APP=gmall
sqoop=/opt/module/sqoop-1.4.6/bin/sqoop

# The script is called with a table name (or "all") as $1 and a date as $2.
# For the first-day load the date is required; the daily script below defaults
# to yesterday instead.
# Note: if LZO compression is enabled, Hadoop must be configured with the LZO
# codec (see the Hadoop installation notes).
if [ -n "$2" ] ;then
    do_date=$2
else
    echo "Please pass a date argument"
    exit
fi

# Generic import function: takes a table name ($1) and a query ($2)
import_data(){
$sqoop import \
--connect jdbc:mysql://hadoop102:3306/$APP \
--username root \
--password 123456 \
--target-dir /origin_data/$APP/db/$1/$do_date \
--delete-target-dir \
--query "$2 where 1=1 and \$CONDITIONS" \
--num-mappers 1 \
--fields-terminated-by '\t' \
--compress \
--compression-codec lzop \
--null-string '\\N' \
--null-non-string '\\N'

hadoop jar /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /origin_data/$APP/db/$1/$do_date
}

import_order_info(){
  import_data order_info "select id, total_amount, order_status, user_id, payment_way, delivery_address, out_trade_no, create_time, operate_time, expire_time, tracking_no, province_id, activity_reduce_amount, coupon_reduce_amount, original_total_amount, feight_fee, feight_fee_reduce from order_info"
}

import_coupon_use(){
  import_data coupon_use "select id, coupon_id, user_id, order_id, coupon_status, get_time, using_time, used_time, expire_time from coupon_use"
}

import_order_status_log(){
  import_data order_status_log "select id, order_id, order_status, operate_time from order_status_log"
}

import_user_info(){
  import_data "user_info" "select id, login_name, nick_name, name, phone_num, email, user_level, birthday, gender, create_time, operate_time from user_info"
}

import_order_detail(){
  import_data order_detail "select id, order_id, sku_id, sku_name, order_price, sku_num, create_time, source_type, source_id, split_total_amount, split_activity_amount, split_coupon_amount from order_detail"
}

import_payment_info(){
  import_data "payment_info" "select id, out_trade_no, order_id, user_id, payment_type, trade_no, total_amount, subject, payment_status, create_time, callback_time from payment_info"
}

import_comment_info(){
  import_data comment_info "select id, user_id, sku_id, spu_id, order_id, appraise, create_time from comment_info"
}

import_order_refund_info(){
  import_data order_refund_info "select id, user_id, order_id, sku_id, refund_type, refund_num, refund_amount, refund_reason_type, refund_status, create_time from order_refund_info"
}

import_sku_info(){
  import_data sku_info "select id, spu_id, price, sku_name, sku_desc, weight, tm_id, category3_id, is_sale, create_time from sku_info"
}

import_base_category1(){
  import_data "base_category1" "select id, name from base_category1"
}

import_base_category2(){
  import_data "base_category2" "select id, name, category1_id from base_category2"
}

import_base_category3(){
  import_data "base_category3" "select id, name, category2_id from base_category3"
}

import_base_province(){
  import_data base_province "select id, name, region_id, area_code, iso_code, iso_3166_2 from base_province"
}

import_base_region(){
  import_data base_region "select id, region_name from base_region"
}

import_base_trademark(){
  import_data base_trademark "select id, tm_name from base_trademark"
}

import_spu_info(){
  import_data spu_info "select id, spu_name, category3_id, tm_id from spu_info"
}

import_favor_info(){
  import_data favor_info "select id, user_id, sku_id, spu_id, is_cancel, create_time, cancel_time from favor_info"
}

import_cart_info(){
  import_data cart_info "select id, user_id, sku_id, cart_price, sku_num, sku_name, create_time, operate_time, is_ordered, order_time, source_type, source_id from cart_info"
}

import_coupon_info(){
  import_data coupon_info "select id, coupon_name, coupon_type, condition_amount, condition_num, activity_id, benefit_amount, benefit_discount, create_time, range_type, limit_num, taken_count, start_time, end_time, operate_time, expire_time from coupon_info"
}

import_activity_info(){
  import_data activity_info "select id, activity_name, activity_type, start_time, end_time, create_time from activity_info"
}

import_activity_rule(){
  import_data activity_rule "select id, activity_id, activity_type, condition_amount, condition_num, benefit_amount, benefit_discount, benefit_level from activity_rule"
}

import_base_dic(){
  import_data base_dic "select dic_code, dic_name, parent_code, create_time, operate_time from base_dic"
}

import_order_detail_activity(){
  import_data order_detail_activity "select id, order_id, order_detail_id, activity_id, activity_rule_id, sku_id, create_time from order_detail_activity"
}

import_order_detail_coupon(){
  import_data order_detail_coupon "select id, order_id, order_detail_id, coupon_id, coupon_use_id, sku_id, create_time from order_detail_coupon"
}

import_refund_payment(){
  import_data refund_payment "select id, out_trade_no, order_id, sku_id, payment_type, trade_no, total_amount, subject, refund_status, create_time, callback_time from refund_payment"
}

import_sku_attr_value(){
  import_data sku_attr_value "select id, attr_id, value_id, sku_id, attr_name, value_name from sku_attr_value"
}

import_sku_sale_attr_value(){
  import_data sku_sale_attr_value "select id, sku_id, spu_id, sale_attr_value_id, sale_attr_id, sale_attr_name, sale_attr_value_name from sku_sale_attr_value"
}

case $1 in
"order_info") import_order_info ;;
"base_category1") import_base_category1 ;;
"base_category2") import_base_category2 ;;
"base_category3") import_base_category3 ;;
"order_detail") import_order_detail ;;
"sku_info") import_sku_info ;;
"user_info") import_user_info ;;
"payment_info") import_payment_info ;;
"base_province") import_base_province ;;
"base_region") import_base_region ;;
"base_trademark") import_base_trademark ;;
"activity_info") import_activity_info ;;
"cart_info") import_cart_info ;;
"comment_info") import_comment_info ;;
"coupon_info") import_coupon_info ;;
"coupon_use") import_coupon_use ;;
"favor_info") import_favor_info ;;
"order_refund_info") import_order_refund_info ;;
"order_status_log") import_order_status_log ;;
"spu_info") import_spu_info ;;
"activity_rule") import_activity_rule ;;
"base_dic") import_base_dic ;;
"order_detail_activity") import_order_detail_activity ;;
"order_detail_coupon") import_order_detail_coupon ;;
"refund_payment") import_refund_payment ;;
"sku_attr_value") import_sku_attr_value ;;
"sku_sale_attr_value") import_sku_sale_attr_value ;;
"all")
  import_base_category1
  import_base_category2
  import_base_category3
  import_order_info
  import_order_detail
  import_sku_info
  import_user_info
  import_payment_info
  import_base_region
  import_base_province
  import_base_trademark
  import_activity_info
  import_cart_info
  import_comment_info
  import_coupon_use
  import_coupon_info
  import_favor_info
  import_order_refund_info
  import_order_status_log
  import_spu_info
  import_activity_rule
  import_base_dic
  import_order_detail_activity
  import_order_detail_coupon
  import_refund_payment
  import_sku_attr_value
  import_sku_sale_attr_value
;;
esac
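A typical first-day invocation (a sketch, assuming the script is executable and on the PATH). The table argument is "all" and the date argument is required, otherwise the script exits:

[atguigu@hadoop102 ~]$ mysql_to_hdfs_init.sh all 2020-06-14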
Daily sync script: mysql_to_hdfs.sh

#! /bin/bash

APP=gmall
sqoop=/opt/module/sqoop-1.4.6/bin/sqoop

# $1 is a table name (or "all"); $2 is an optional date, defaulting to yesterday
if [ -n "$2" ] ;then
    do_date=$2
else
    do_date=`date -d '-1 day' +%F`
fi

# Generic import function: takes a table name ($1) and a query ($2); here each
# per-table query already carries its own where clause
import_data(){
$sqoop import \
--connect jdbc:mysql://hadoop102:3306/$APP \
--username root \
--password 123456 \
--target-dir /origin_data/$APP/db/$1/$do_date \
--delete-target-dir \
--query "$2 and \$CONDITIONS" \
--num-mappers 1 \
--fields-terminated-by '\t' \
--compress \
--compression-codec lzop \
--null-string '\\N' \
--null-non-string '\\N'

hadoop jar /opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /origin_data/$APP/db/$1/$do_date
}

import_order_info(){
  import_data order_info "select id, total_amount, order_status, user_id, payment_way, delivery_address, out_trade_no, create_time, operate_time, expire_time, tracking_no, province_id, activity_reduce_amount, coupon_reduce_amount, original_total_amount, feight_fee, feight_fee_reduce from order_info where (date_format(create_time,'%Y-%m-%d')='$do_date' or date_format(operate_time,'%Y-%m-%d')='$do_date')"
}

import_coupon_use(){
  import_data coupon_use "select id, coupon_id, user_id, order_id, coupon_status, get_time, using_time, used_time, expire_time from coupon_use where (date_format(get_time,'%Y-%m-%d')='$do_date' or date_format(using_time,'%Y-%m-%d')='$do_date' or date_format(used_time,'%Y-%m-%d')='$do_date' or date_format(expire_time,'%Y-%m-%d')='$do_date')"
}

import_order_status_log(){
  import_data order_status_log "select id, order_id, order_status, operate_time from order_status_log where date_format(operate_time,'%Y-%m-%d')='$do_date'"
}

import_user_info(){
  import_data "user_info" "select id, login_name, nick_name, name, phone_num, email, user_level, birthday, gender, create_time, operate_time from user_info where (DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date' or DATE_FORMAT(operate_time,'%Y-%m-%d')='$do_date')"
}

import_order_detail(){
  import_data order_detail "select id, order_id, sku_id, sku_name, order_price, sku_num, create_time, source_type, source_id, split_total_amount, split_activity_amount, split_coupon_amount from order_detail where DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date'"
}

import_payment_info(){
  import_data "payment_info" "select id, out_trade_no, order_id, user_id, payment_type, trade_no, total_amount, subject, payment_status, create_time, callback_time from payment_info where (DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date' or DATE_FORMAT(callback_time,'%Y-%m-%d')='$do_date')"
}

import_comment_info(){
  import_data comment_info "select id, user_id, sku_id, spu_id, order_id, appraise, create_time from comment_info where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_order_refund_info(){
  import_data order_refund_info "select id, user_id, order_id, sku_id, refund_type, refund_num, refund_amount, refund_reason_type, refund_status, create_time from order_refund_info where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_sku_info(){
  import_data sku_info "select id, spu_id, price, sku_name, sku_desc, weight, tm_id, category3_id, is_sale, create_time from sku_info where 1=1"
}

import_base_category1(){
  import_data "base_category1" "select id, name from base_category1 where 1=1"
}

import_base_category2(){
  import_data "base_category2" "select id, name, category1_id from base_category2 where 1=1"
}

import_base_category3(){
  import_data "base_category3" "select id, name, category2_id from base_category3 where 1=1"
}

import_base_province(){
  import_data base_province "select id, name, region_id, area_code, iso_code, iso_3166_2 from base_province where 1=1"
}

import_base_region(){
  import_data base_region "select id, region_name from base_region where 1=1"
}

import_base_trademark(){
  import_data base_trademark "select id, tm_name from base_trademark where 1=1"
}

import_spu_info(){
  import_data spu_info "select id, spu_name, category3_id, tm_id from spu_info where 1=1"
}

import_favor_info(){
  import_data favor_info "select id, user_id, sku_id, spu_id, is_cancel, create_time, cancel_time from favor_info where 1=1"
}

import_cart_info(){
  import_data cart_info "select id, user_id, sku_id, cart_price, sku_num, sku_name, create_time, operate_time, is_ordered, order_time, source_type, source_id from cart_info where 1=1"
}

import_coupon_info(){
  import_data coupon_info "select id, coupon_name, coupon_type, condition_amount, condition_num, activity_id, benefit_amount, benefit_discount, create_time, range_type, limit_num, taken_count, start_time, end_time, operate_time, expire_time from coupon_info where 1=1"
}

import_activity_info(){
  import_data activity_info "select id, activity_name, activity_type, start_time, end_time, create_time from activity_info where 1=1"
}

import_activity_rule(){
  import_data activity_rule "select id, activity_id, activity_type, condition_amount, condition_num, benefit_amount, benefit_discount, benefit_level from activity_rule where 1=1"
}

import_base_dic(){
  import_data base_dic "select dic_code, dic_name, parent_code, create_time, operate_time from base_dic where 1=1"
}

import_order_detail_activity(){
  import_data order_detail_activity "select id, order_id, order_detail_id, activity_id, activity_rule_id, sku_id, create_time from order_detail_activity where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_order_detail_coupon(){
  import_data order_detail_coupon "select id, order_id, order_detail_id, coupon_id, coupon_use_id, sku_id, create_time from order_detail_coupon where date_format(create_time,'%Y-%m-%d')='$do_date'"
}

import_refund_payment(){
  import_data refund_payment "select id, out_trade_no, order_id, sku_id, payment_type, trade_no, total_amount, subject, refund_status, create_time, callback_time from refund_payment where (DATE_FORMAT(create_time,'%Y-%m-%d')='$do_date' or DATE_FORMAT(callback_time,'%Y-%m-%d')='$do_date')"
}

import_sku_attr_value(){
  import_data sku_attr_value "select id, attr_id, value_id, sku_id, attr_name, value_name from sku_attr_value where 1=1"
}

import_sku_sale_attr_value(){
  import_data sku_sale_attr_value "select id, sku_id, spu_id, sale_attr_value_id, sale_attr_id, sale_attr_name, sale_attr_value_name from sku_sale_attr_value where 1=1"
}

case $1 in
"order_info") import_order_info ;;
"base_category1") import_base_category1 ;;
"base_category2") import_base_category2 ;;
"base_category3") import_base_category3 ;;
"order_detail") import_order_detail ;;
"sku_info") import_sku_info ;;
"user_info") import_user_info ;;
"payment_info") import_payment_info ;;
"base_province") import_base_province ;;
"activity_info") import_activity_info ;;
"cart_info") import_cart_info ;;
"comment_info") import_comment_info ;;
"coupon_info") import_coupon_info ;;
"coupon_use") import_coupon_use ;;
"favor_info") import_favor_info ;;
"order_refund_info") import_order_refund_info ;;
"order_status_log") import_order_status_log ;;
"spu_info") import_spu_info ;;
"activity_rule") import_activity_rule ;;
"base_dic") import_base_dic ;;
"order_detail_activity") import_order_detail_activity ;;
"order_detail_coupon") import_order_detail_coupon ;;
"refund_payment") import_refund_payment ;;
"sku_attr_value") import_sku_attr_value ;;
"sku_sale_attr_value") import_sku_sale_attr_value ;;
"all")
  import_base_category1
  import_base_category2
  import_base_category3
  import_order_info
  import_order_detail
  import_sku_info
  import_user_info
  import_payment_info
  import_base_trademark
  import_activity_info
  import_cart_info
  import_comment_info
  import_coupon_use
  import_coupon_info
  import_favor_info
  import_order_refund_info
  import_order_status_log
  import_spu_info
  import_activity_rule
  import_base_dic
  import_order_detail_activity
  import_order_detail_coupon
  import_refund_payment
  import_sku_attr_value
  import_sku_sale_attr_value
;;
esac
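A typical daily invocation (a sketch; the script path is assumed). With no date argument the script imports yesterday's data, so it can simply be run each morning:

[atguigu@hadoop102 ~]$ mysql_to_hdfs.sh all
[atguigu@hadoop102 ~]$ mysql_to_hdfs.sh all 2020-06-15   # or backfill an explicit date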
Installing Hive

1. Extract and install
1) Upload apache-hive-3.1.2-bin.tar.gz to /opt/software on the Linux host
2) Extract apache-hive-3.1.2-bin.tar.gz into /opt/module/:
[atguigu@hadoop102 software]$ tar -zxvf /opt/software/apache-hive-3.1.2-bin.tar.gz -C /opt/module/
3) Rename apache-hive-3.1.2-bin to hive:
[atguigu@hadoop102 software]$ mv /opt/module/apache-hive-3.1.2-bin/ /opt/module/hive
4) Edit /etc/profile.d/my_env.sh to add the environment variables:
[atguigu@hadoop102 software]$ sudo vim /etc/profile.d/my_env.sh
Add:
#HIVE_HOME
export HIVE_HOME=/opt/module/hive
export PATH=$PATH:$HIVE_HOME/bin
Restart the Xshell session, or source /etc/profile.d/my_env.sh, so the variables take effect:
[atguigu@hadoop102 software]$ source /etc/profile.d/my_env.sh
5) Resolve the logging jar conflict in /opt/module/hive/lib:
[atguigu@hadoop102 lib]$ mv log4j-slf4j-impl-2.10.0.jar log4j-slf4j-impl-2.10.0.jar.bak
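A quick check that the environment variables took effect:

[atguigu@hadoop102 ~]$ hive --version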
2. Point the Hive metastore at MySQL
#1 Copy the driver
Copy the MySQL JDBC driver into Hive's lib directory:
[atguigu@hadoop102 lib]$ cp /opt/software/mysql-connector-java-5.1.48.jar /opt/module/hive/lib/
#2 Configure the metastore to use MySQL
Create a hive-site.xml file in $HIVE_HOME/conf:
[atguigu@hadoop102 conf]$ vim hive-site.xml
# Add the following
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://hadoop102:3306/metastore?useSSL=false</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>123456</value>
    </property>
    <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
    </property>
    <property>
        <name>hive.metastore.schema.verification</name>
        <value>false</value>
    </property>
    <property>
        <name>hive.server2.thrift.port</name>
        <value>10000</value>
    </property>
    <property>
        <name>hive.server2.thrift.bind.host</name>
        <value>hadoop102</value>
    </property>
    <property>
        <name>hive.metastore.event.db.notification.api.auth</name>
        <value>false</value>
    </property>
    <property>
        <name>hive.cli.print.header</name>
        <value>true</value>
    </property>
    <property>
        <name>hive.cli.print.current.db</name>
        <value>true</value>
    </property>
</configuration>
3. Initialize the metastore
# Initialize the metastore database
1) Log in to MySQL:
[atguigu@hadoop102 conf]$ mysql -uroot -p123456
2) Create the Hive metastore database:
mysql> create database metastore;
mysql> quit;
3) Initialize the Hive metastore schema:
[atguigu@hadoop102 conf]$ schematool -initSchema -dbType mysql -verbose
4) Start the Hive client:
[atguigu@hadoop102 hive]$ hive
5) Check the databases:
hive (default)> show databases;
OK
database_name
default
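Two optional checks at this point (a sketch; DBS and TBLS are standard tables in Hive's metastore schema, and HiveServer2 uses the host/port set in hive-site.xml above):

# confirm schematool actually created the metastore tables
mysql -uroot -p123456 -e "use metastore; show tables;" | head
# start HiveServer2 in the background and connect with beeline
nohup hive --service hiveserver2 >/tmp/hiveserver2.log 2>&1 &
beeline -u jdbc:hive2://hadoop102:10000 -n atguigu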
With this, the entire data collection pipeline is in place and tested OK:
Log data: generated, then uploaded to HDFS automatically.
Business data: generated, then imported into HDFS by running the scripts by hand; this can be automated with a scheduled job, as sketched below.
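A minimal cron sketch for that scheduled job (the script location and the run time are assumptions):

# run the daily import at 00:30 for yesterday's data (the script defaults to yesterday)
30 0 * * * /home/atguigu/bin/mysql_to_hdfs.sh all >> /tmp/mysql_to_hdfs.log 2>&1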