Installing and Configuring Hive 3.1.2 on Linux


    Download the Hive binary package (or download the source and build it yourself).

    Extract the package.
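
    A minimal sketch for the download and extraction steps, assuming the binary release is fetched from the Apache archive and installed under /usr/local/hive (the path used throughout the rest of this guide):

    # Download the Hive 3.1.2 binary release (skip this if you are building from source).
    wget https://archive.apache.org/dist/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz
    # Extract it and move it to the install location used below.
    tar -zxvf apache-hive-3.1.2-bin.tar.gz
    mv apache-hive-3.1.2-bin /usr/local/hive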

    Configure the environment variables.

    #java
    export JAVA_HOME=/usr/local/jdk1.8.0_202
    export JRE_HOME=${JAVA_HOME}/jre
    export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib:$CLASSPATH
    export JAVA_PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin
    export PATH=$PATH:${JAVA_PATH}
    #hadoop
    export HADOOP_HOME=/usr/local/hadoop-3.3.1
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
    #hive
    export HIVE_HOME=/usr/local/hive
    export PATH=$PATH:$HIVE_HOME/bin
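
    A short follow-up sketch, assuming the exports above were added to /etc/profile, for applying them in the current shell and checking that each tool resolves:

    # Reload the profile so the new variables take effect in this shell.
    source /etc/profile
    # Verify that the JDK, Hadoop and Hive are all picked up from the updated PATH.
    java -version
    hadoop version
    hive --version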
    
    

    Configure the hive/conf/hive-env.sh file.

    # Set HADOOP_HOME to point to a specific hadoop install directory
    HADOOP_HOME=/usr/local/hadoop-3.3.1
    # Hive Configuration Directory can be controlled by:
    export HIVE_CONF_DIR=/usr/local/hive/conf
    # Folder containing extra libraries required for hive compilation/execution can be controlled by:
    export HIVE_AUX_JARS_PATH=/usr/local/hive/lib
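
    The Hive distribution only ships a template for this file, so hive-env.sh is normally created from it first; a small sketch assuming the install path above:

    cd /usr/local/hive/conf
    # Create hive-env.sh from the bundled template, then append the three lines above.
    cp hive-env.sh.template hive-env.sh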
    

    Configure the hive/conf/hive-site.xml file.

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <property>
            <name>hive.metastore.warehouse.dir</name>
            <value>/user/hive/warehouse</value>
            <description>location of default database for the warehouse</description>
        </property>
        <property>
            <name>hive.exec.scratchdir</name>
            <value>/user/hive/tmp</value>
            <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
        </property>
        <property>
            <name>hive.querylog.location</name>
            <value>/user/hive/log</value>
        </property>
        <property>
            <name>datanucleus.metadata.validate</name>
            <value>false</value>
        </property>
        <property>
            <name>hive.metastore.schema.verification</name>
            <value>false</value>
        </property>
        <property>
            <name>datanucleus.schema.autoCreateAll</name>
            <value>true</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionURL</name>
            <value>jdbc:mysql://127.0.0.1:3306/hive?createDatabaseIfNotExist=true</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionDriverName</name>
            <value>com.mysql.jdbc.Driver</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionUserName</name>
            <value>root</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionPassword</name>
            <value>1q2w#E$R</value>
        </property>
    </configuration>
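
    Before moving on, it can help to confirm that the JDBC account configured above can actually reach MySQL; a quick check, assuming the mysql command-line client is installed. The hive database itself does not need to be created by hand, since createDatabaseIfNotExist=true lets Hive create it on first use:

    # Quick connectivity check with the host, port and user from hive-site.xml.
    mysql -h 127.0.0.1 -P 3306 -uroot -p -e "SELECT VERSION();"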

    Copy mysql-connector-java.jar into hive/lib.
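
    A sketch for this step, assuming a Connector/J 5.1.x jar has already been downloaded into the current directory; the exact file name depends on the version you fetched:

    # Copy the MySQL JDBC driver into Hive's lib directory and confirm it is there.
    cp mysql-connector-java-5.1.49.jar $HIVE_HOME/lib/
    ls $HIVE_HOME/lib/ | grep mysql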

    Start the Hadoop cluster and create the Hive directories on HDFS.

    hdfs dfs -mkdir -p /user/hive/warehouse
    hdfs dfs -mkdir -p /user/hive/tmp
    hdfs dfs -mkdir -p /user/hive/log
    hdfs dfs -chmod g+w /user/hive/warehouse
    hdfs dfs -chmod g+w /user/hive/tmp
    hadoop fs -chmod -R 777 /user/hive/tmp
    hdfs dfs -chmod g+w /user/hive/log
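
    An optional check, assuming the cluster is up, that the directories exist with the expected permissions:

    hdfs dfs -ls /user/hive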
    

    Initialize the metastore schema (schematool -dbType mysql -initSchema).
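
    A quick way to confirm the schema was created, assuming $HIVE_HOME/bin is on the PATH:

    # Reports the metastore schema version found in the MySQL database.
    schematool -dbType mysql -info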

    Start the metastore service (nohup hive --service metastore > metastore.log 2>&1 &).

    Start HiveServer2 for remote access (nohup hive --service hiveserver2 > hiveserver2.log 2>&1 &).
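
    A minimal connectivity check, assuming the default ports (9083 for the metastore, 10000 for HiveServer2) and that the connecting user is permitted by Hadoop's proxyuser settings; the user name here is only an example:

    # Confirm both services are listening on their default ports.
    ss -lntp | grep -E '9083|10000'
    # Connect through HiveServer2 with beeline and run a trivial query.
    beeline -u jdbc:hive2://127.0.0.1:10000 -n root -e "show databases;"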
