Java Big Data with DataX: Reading MySQL Data into HDFS and Writing HDFS Data into MySQL


I. Reading MySQL Data into HDFS: the Official Template
[atguigu@hadoop102 ~]$ python /opt/module/datax/bin/datax.py -r mysqlreader -w hdfswriter
{
    "job": {
        "content": [
            {
                "reader": {
                    "name": "mysqlreader",
                    "parameter": {
                        "column": [],
                        "connection": [
                            {
                                "jdbcUrl": [],
                                "table": []
                            }
                        ],
                        "password": "",
                        "username": "",
                        "where": ""
                    }
                },
                "writer": {
                    "name": "hdfswriter",
                    "parameter": {
                        "column": [],
                        "compress": "",
                        "defaultFS": "",
                        "fieldDelimiter": "",
                        "fileName": "",
                        "fileType": "",
                        "path": "",
                        "writeMode": ""
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": ""
            }
        }
    }
}
mysqlreader parameter notes:
- column: the columns to read; "*" reads all columns
- connection.jdbcUrl / connection.table: the JDBC URL(s) of the source database and the table(s) to read
- username / password: MySQL credentials
- where: an optional filter condition applied to the extraction query

hdfswriter parameter notes:
- column: the name and type of each field to write
- compress: an optional compression codec for the output files
- defaultFS: the HDFS NameNode address, in the form hdfs://host:port
- fieldDelimiter: the field separator in the output file
- fileName: the file name prefix (a random suffix is appended at run time; see the note below)
- fileType: the output format, "text" or "orc"
- path: the target directory on HDFS
- writeMode: behavior toward existing files, e.g. "append" or "nonConflict"
II. Reading MySQL Data into HDFS

1. Prepare the data. Create the student table:
mysql> create database datax;
mysql> use datax;
mysql> create table student(id int,name varchar(20));

Insert data:

mysql> insert into student values(1001,'zhangsan'),(1002,'lisi'),(1003,'wangwu');
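Before running the job it is worth confirming the rows landed; a quick check against the three rows just inserted:

mysql> select * from student;
+------+----------+
| id   | name     |
+------+----------+
| 1001 | zhangsan |
| 1002 | lisi     |
| 1003 | wangwu   |
+------+----------+
3 rows in set (0.00 sec)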
2. Write the job configuration file:
[atguigu@hadoop102 datax]$ vim /opt/module/datax/job/mysql2hdfs.json
{
    "job": {
        "content": [
            {
                "reader": {
                    "name": "mysqlreader",
                    "parameter": {
                        "column": [
                            "id",
                            "name"
                        ],
                        "connection": [
                            {
                                "jdbcUrl": [
                                    "jdbc:mysql://hadoop102:3306/datax"
                                ],
                                "table": [
                                    "student"
                                ]
                            }
                        ],
                        "username": "root",
                        "password": "000000"
                    }
                },
                "writer": {
                    "name": "hdfswriter",
                    "parameter": {
                        "column": [
                            {
                                "name": "id",
                                "type": "int"
                            },
                            {
                                "name": "name",
                                "type": "string"
                            }
                        ],
                        "defaultFS": "hdfs://hadoop102:9000",
                        "fieldDelimiter": "\t",
                        "fileName": "student.txt",
                        "fileType": "text",
                        "path": "/",
                        "writeMode": "append"
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": "1"
            }
        }
    }
}
3. Run the job:
[atguigu@hadoop102 datax]$ bin/datax.py job/mysql2hdfs.json
2019-05-17 16:02:16.581 [job-0] INFO JobContainer -
Job start time       : 2019-05-17 16:02:04
Job end time         : 2019-05-17 16:02:16
Total elapsed time   : 12s
Average traffic      : 3B/s
Record write speed   : 0rec/s
Total records read   : 3
Read/write failures  : 0
4. Check the result on HDFS

Note: when HdfsWriter actually runs, it appends a random suffix to the configured fileName; each writing thread gets its own suffixed file.
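A quick way to verify from the shell, assuming the job wrote to the root path configured above (a glob matches the randomized file name):

[atguigu@hadoop102 datax]$ hadoop fs -ls /
[atguigu@hadoop102 datax]$ hadoop fs -cat /student.txt*
1001	zhangsan
1002	lisi
1003	wangwu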

5. HA support

When HDFS runs with NameNode high availability, the client-side HA settings are passed through the hadoopConfig parameter:

"hadoopConfig": {
    "dfs.nameservices": "ns",
    "dfs.ha.namenodes.ns": "nn1,nn2",
    "dfs.namenode.rpc-address.ns.nn1": "hostname:port",
    "dfs.namenode.rpc-address.ns.nn2": "hostname:port",
    "dfs.client.failover.proxy.provider.ns": "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
}
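A sketch of how this block nests in a job file, assuming a nameservice named ns and hypothetical NameNode hosts hadoop102/hadoop103: hadoopConfig sits next to the other keys inside the plugin's parameter object, and defaultFS then points at the nameservice instead of a single NameNode:

"writer": {
    "name": "hdfswriter",
    "parameter": {
        "defaultFS": "hdfs://ns",
        "hadoopConfig": {
            "dfs.nameservices": "ns",
            "dfs.ha.namenodes.ns": "nn1,nn2",
            "dfs.namenode.rpc-address.ns.nn1": "hadoop102:9000",
            "dfs.namenode.rpc-address.ns.nn2": "hadoop103:9000",
            "dfs.client.failover.proxy.provider.ns": "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
        },
        ...
    }
}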
III. Reading HDFS Data into MySQL: the Official Template

1. Rename the file uploaded in the previous case. This strips the random suffix that HdfsWriter added, so the reader can point at one fixed path:
[atguigu@hadoop102 datax]$ hadoop fs -mv /student.txt* /student.txt
2. View the official template:
[atguigu@hadoop102 datax]$ python bin/datax.py -r hdfsreader -w mysqlwriter
{
    "job": {
        "content": [
            {
                "reader": {
                    "name": "hdfsreader",
                    "parameter": {
                        "column": [],
                        "defaultFS": "",
                        "encoding": "UTF-8",
                        "fieldDelimiter": ",",
                        "fileType": "orc",
                        "path": ""
                    }
                },
                "writer": {
                    "name": "mysqlwriter",
                    "parameter": {
                        "column": [],
                        "connection": [
                            {
                                "jdbcUrl": "",
                                "table": []
                            }
                        ],
                        "password": "",
                        "preSql": [],
                        "session": [],
                        "username": "",
                        "writeMode": ""
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": ""
            }
        }
    }
}
IV. Reading HDFS Data into MySQL

1. Create the configuration file:
[atguigu@hadoop102 datax]$ vim job/hdfs2mysql.json
{
    "job": {
        "content": [
            {
                "reader": {
                    "name": "hdfsreader",
                    "parameter": {
                        "column": ["*"],
                        "defaultFS": "hdfs://hadoop102:9000",
                        "encoding": "UTF-8",
                        "fieldDelimiter": "\t",
                        "fileType": "text",
                        "path": "/student.txt"
                    }
                },
                "writer": {
                    "name": "mysqlwriter",
                    "parameter": {
                        "column": [
                            "id",
                            "name"
                        ],
                        "connection": [
                            {
                                "jdbcUrl": "jdbc:mysql://hadoop102:3306/datax",
                                "table": ["student2"]
                            }
                        ],
                        "password": "000000",
                        "username": "root",
                        "writeMode": "insert"
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": "1"
            }
        }
    }
}
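A note on writeMode for mysqlwriter, per the DataX documentation: "insert" generates plain insert into statements, "replace" generates replace into (rows that collide on a unique key are overwritten), and "update" generates insert ... on duplicate key update. Plain "insert" is fine here because student2 starts out empty.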
2. Create the student2 table in the datax database in MySQL:
mysql> use datax;
mysql> create table student2(id int,name varchar(20));
3. Run the job:
[atguigu@hadoop102 datax]$ bin/datax.py job/hdfs2mysql.json
2019-05-17 16:21:53.616 [job-0] INFO JobContainer -
Job start time       : 2019-05-17 16:21:41
Job end time         : 2019-05-17 16:21:53
Total elapsed time   : 11s
Average traffic      : 3B/s
Record write speed   : 0rec/s
Total records read   : 3
Read/write failures  : 0
4. Query the student2 table:
mysql> select * from student2;
+------+----------+
| id   | name     |
+------+----------+
| 1001 | zhangsan |
| 1002 | lisi     |
| 1003 | wangwu   |
+------+----------+
3 rows in set (0.00 sec)
