Notes:
Hive depends on Hadoop and stores its metadata in MySQL; install MySQL yourself beforehand.
For Hadoop 2.0 installation, see my blog post:
https://www.cnblogs.com/654wangzai321/p/8603498.html
Source package download:
http://archive.apache.org/dist/hive/hive-2.3.2/
Cluster environment:
master  192.168.1.99
slave1  192.168.1.100
slave2  192.168.1.101
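If the three machines do not already resolve each other by name, a typical /etc/hosts mapping for this cluster (hostnames assumed to match the roles above) would be:

```
192.168.1.99   master
192.168.1.100  slave1
192.168.1.101  slave2
```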
Download the installation package:
# On master
wget http://archive.apache.org/dist/hive/hive-2.3.2/apache-hive-2.3.2-bin.tar.gz -P /usr/local/src
cd /usr/local/src
tar -zxvf apache-hive-2.3.2-bin.tar.gz
mv apache-hive-2.3.2-bin /usr/local/hive
Edit the configuration file:
cd /usr/local/hive/conf
vim hive-site.xml (here hive is the metastore database name, and root / hadoop are the MySQL username and password)
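The post does not show the file contents, so here is a minimal hive-site.xml sketch. It assumes MySQL runs on master, uses the database name (hive), username (root), and password (hadoop) mentioned above; the warehouse path is a conventional default, so adjust all of these to your setup:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- JDBC connection to the MySQL metastore; hostname assumed to be master -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://master:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <!-- Driver class provided by the Connector/J jar copied into hive/lib below -->
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hadoop</value>
  </property>
  <!-- HDFS directory for table data; a common default, not from the original post -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>
```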
Configure environment variables:
# On master, slave1 and slave2
vim ~/.bashrc
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
# Reload the environment variables
source ~/.bashrc
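After sourcing ~/.bashrc, a quick sanity check (a sketch, assuming the install path used in this guide) confirms the Hive bin directory actually landed on PATH:

```shell
# Same two lines that go into ~/.bashrc
export HIVE_HOME=/usr/local/hive
export PATH="$PATH:$HIVE_HOME/bin"

# Verify that $HIVE_HOME/bin appears as a PATH component
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "HIVE_HOME on PATH: ok" ;;
  *) echo "HIVE_HOME missing from PATH" ;;
esac
```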
Download the MySQL JDBC connector:
wget http://mirrors.ustc.edu.cn/mysql-ftp/Downloads/Connector-J/mysql-connector-java-5.1.46.tar.gz -P /usr/local/src
cd /usr/local/src
tar -zxvf mysql-connector-java-5.1.46.tar.gz
cp mysql-connector-java-5.1.46/mysql-connector-java-5.1.46-bin.jar /usr/local/hive/lib
Copy the installation directory to the slaves:
rsync -av /usr/local/hive/ slave1:/usr/local/hive/
rsync -av /usr/local/hive/ slave2:/usr/local/hive/
Start Hive:
# Hive 2.x requires the metastore schema to be initialized once before first use
schematool -dbType mysql -initSchema
hive
Original post: https://www.cnblogs.com/654wangzai321/p/9672028.html