- --hive-home Override $HIVE_HOME.
- --hive-import Import tables into Hive (uses Hive's default delimiters if none are set).
- --hive-overwrite Overwrite existing data in the Hive table.
- --create-hive-table If set, the job will fail if the target Hive table already exists. This property is false by default.
- --hive-table Sets the table name to use when importing into Hive.
- --hive-drop-import-delims Drops \n, \r, and \01 from string fields when importing to Hive.
- --hive-delims-replacement Replaces \n, \r, and \01 in string fields with a user-defined string when importing to Hive.
- --hive-partition-key Name of the Hive field the partitions are sharded on.
- --hive-partition-value String value that serves as the partition key for the data imported into Hive in this job.
- --map-column-hive Override the default mapping from SQL type to Hive type for configured columns.
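As a sketch of the partition options above (the table name `browser_part`, the partition key `dt`, and its value are hypothetical; the connection details mirror the example below):

```shell
# Hypothetical example: import into a Hive table partitioned on dt.
# --hive-partition-key / --hive-partition-value place every imported row
# into the single partition dt='2024-01-01'.
sqoop import \
  --connect jdbc:mysql://node1:3306/result_db \
  --username root --password 123456 \
  --table dimension_browser \
  --hive-import \
  --hive-table browser_part \
  --hive-partition-key dt \
  --hive-partition-value '2024-01-01' \
  -m 1
```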
Contents of sqoop5.txt (a Sqoop options file takes one token per line):

import
--connect
jdbc:mysql://node1:3306/result_db
--username
root
--password
123456
--as-textfile
--target-dir
/sqoop/hive
--delete-target-dir
-m
1
-e
select id,browser_name,browser_version from dimension_browser where $CONDITIONS and id >20
--hive-import
--create-hive-table
--hive-table
browser
--fields-terminated-by
,
Start hiveserver2 on node2 and node3:
nohup hiveserver2 &
On the node3 command line:
sqoop --options-file sqoop5.txt
or
[root@node3 ~]# sqoop import --connect jdbc:mysql://node1:3306/result_db --username root --password 123456 --as-textfile --target-dir /sqoop/hive --delete-target-dir -m 1 -e 'select id,browser_name,browser_version from dimension_browser where $CONDITIONS and id >20' --hive-import --create-hive-table --hive-table browser --fields-terminated-by ','
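To sanity-check the import, one option is to query the new table through the running hiveserver2 instances. A minimal sketch, assuming hiveserver2 listens on its default port 10000 on node2:

```shell
# Hypothetical endpoint; hiveserver2 listens on port 10000 by default.
beeline -u jdbc:hive2://node2:10000 -e 'SELECT * FROM browser LIMIT 5;'
```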
node4:
[root@node3 ~]# cat -A part-m-00000
41,360,0$
42,360,1$
43,360,2$
44,360,3$
45,360,4$
46,360,5$
47,360,6$
48,360,7$
49,360,8$
50,360,all$
51,Chrome,0$
52,Chrome,1$
53,Chrome,2$
54,Chrome,3$
55,Chrome,4$
56,Chrome,5$
57,Chrome,6$
58,Chrome,7$
59,Chrome,8$
60,Chrome,all$
61,FireFox,0$
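In the `cat -A` output, the trailing `$` is only cat's end-of-line marker, not part of the data; each record is three fields split by the `,` chosen with --fields-terminated-by. A minimal sketch of reading such records back (the sample rows are copied from the output above; this parser is not part of the original tutorial):

```python
# Parse a few of the comma-delimited records Sqoop wrote.
# The "$" shown by `cat -A` is the end-of-line marker, so each
# real line is just "id,browser_name,browser_version".
sample = "41,360,0\n42,360,1\n51,Chrome,0\n"

rows = [line.split(",") for line in sample.strip().splitlines()]
for browser_id, name, version in rows:
    print(browser_id, name, version)
```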