Operating system: Linux (CentOS 7.0)
Downloads:
Java (jdk-8u111-linux-x64.rpm)
Hive 2.1.1 (apache-hive-2.1.1-bin.tar.gz)
Hadoop 2.7.3 (hadoop-2.7.3.tar.gz)
Download Java (JDK)
The latest versions of Hadoop and Hive can be downloaded from their official sites.
Place the downloaded files in a folder named Hadoop on the CentOS desktop.
[root@localhost Hadoop]# yum install -y jdk-8u111-linux-x64.rpm
Check the installed version:
[root@localhost Hadoop]# java -version
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)
[root@localhost Hadoop]# tar -xzf hadoop-2.7.3.tar.gz
[root@localhost Hadoop]# tar -xzf apache-hive-2.1.1-bin.tar.gz
List the extracted files:
[root@localhost Hadoop]# ls
apache-hive-2.1.1-bin  hadoop-2.7.3  jdk-8u111-linux-x64.rpm
apache-hive-2.1.1-bin.tar.gz  hadoop-2.7.3.tar.gz
Move the extracted directories into /usr:
[root@localhost Hadoop]# mv hadoop-2.7.3 /usr/hadoop
[root@localhost Hadoop]# mv apache-hive-2.1.1-bin /usr/hive
[root@localhost hadoop]# vim ~/.bashrc
Append:
# set hadoop/hive/jdk(java) path
export HADOOP_HOME=/usr/hadoop
export HIVE_HOME=/usr/hive
export JAVA_HOME=/usr/java/jdk1.8.0_111
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin:$JAVA_HOME/bin"
Reload the shell configuration:
[root@localhost hadoop]# source ~/.bashrc
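Running the append step twice would duplicate the export block in ~/.bashrc. A minimal sketch of a guard against that, written against a temporary file rather than the real ~/.bashrc (the `add_exports` helper name is an assumption, not part of the original steps):

```shell
# Append the export block only if it is not already present; the grep guard
# makes repeated runs no-ops. A temporary file stands in for ~/.bashrc here.
RC=$(mktemp)
add_exports() {
  grep -q 'HADOOP_HOME=' "$1" 2>/dev/null || cat >> "$1" <<'EOF'
# set hadoop/hive/jdk(java) path
export HADOOP_HOME=/usr/hadoop
export HIVE_HOME=/usr/hive
export JAVA_HOME=/usr/java/jdk1.8.0_111
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin:$JAVA_HOME/bin"
EOF
}
add_exports "$RC"
add_exports "$RC"   # second call does nothing thanks to the grep guard
grep -c 'HADOOP_HOME=' "$RC"   # prints 1, not 2
```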
[root@localhost hadoop]# cd /usr/hadoop
[root@localhost hadoop]# mkdir tmp
[root@localhost hadoop]# mkdir hdfs
[root@localhost hadoop]# mkdir hdfs/data
[root@localhost hadoop]# mkdir hdfs/name
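The four mkdir calls above can be collapsed into one idempotent command with `mkdir -p`, which creates parent directories and tolerates re-runs. A sketch using a temporary base directory in place of /usr/hadoop:

```shell
# One-line equivalent of the mkdir steps above; -p creates missing parents
# and does not fail if the directories already exist.
BASE=$(mktemp -d)    # stands in for /usr/hadoop in this sketch
mkdir -p "$BASE"/tmp "$BASE"/hdfs/data "$BASE"/hdfs/name
ls "$BASE"/hdfs      # prints: data  name
```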
[root@localhost hadoop]# cd /usr/hadoop/etc/hadoop
[root@localhost hadoop]# vim hadoop-env.sh
#export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=/usr/java/jdk1.8.0_111
[root@localhost hadoop]# vim yarn-env.sh
# export JAVA_HOME=/home/y/libexec/jdk1.6.0/
export JAVA_HOME=/usr/java/jdk1.8.0_111
[root@localhost hadoop]# vim core-site.xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
    <description>HDFS URL: filesystem://namenode-host:port</description>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/hadoop/tmp</value>
    <description>Local Hadoop temporary directory</description>
  </property>
</configuration>
[root@localhost hadoop]# vim hdfs-site.xml
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/usr/hadoop/hdfs/name</value>
    <description>Where the namenode stores HDFS namespace metadata</description>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/usr/hadoop/hdfs/data</value>
    <description>Physical storage location of data blocks on the datanode</description>
  </property>
  <!-- Set the number of HDFS replicas -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
    <description>Replica count (default 3); should not exceed the number of datanodes</description>
  </property>
</configuration>
[root@localhost hadoop]# vim yarn-site.xml
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>localhost:8099</value>
  </property>
</configuration>
[root@localhost hadoop]# mv mapred-site.xml.template mapred-site.xml
[root@localhost hadoop]# vim mapred-site.xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value> <!-- Clients submit jobs through YARN -->
  </property>
</configuration>
Set up passwordless SSH to localhost:
[root@localhost hive]# ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
[root@localhost hive]# cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
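A common pitfall with this step: sshd silently ignores authorized_keys when permissions are too open (~/.ssh must be mode 700, authorized_keys mode 600). A quick sketch of the required modes, simulated in a temporary directory rather than the real ~/.ssh:

```shell
# Demonstrate the permissions sshd expects for key-based login.
# A temp directory stands in for the real home directory here.
DIR=$(mktemp -d)
mkdir -p "$DIR/.ssh" && chmod 700 "$DIR/.ssh"
touch "$DIR/.ssh/authorized_keys" && chmod 600 "$DIR/.ssh/authorized_keys"
stat -c '%a' "$DIR/.ssh" "$DIR/.ssh/authorized_keys"   # prints 700 then 600
```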
[root@localhost hadoop]# bin/hdfs namenode -format
[root@localhost hadoop]# sbin/start-dfs.sh
[root@localhost hadoop]# sbin/start-yarn.sh
[root@localhost hadoop]# jps
26161 DataNode
26021 NameNode
26344 SecondaryNameNode
26890 Jps
26492 ResourceManager
26767 NodeManager
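If all five daemons from the jps listing above are present, the single-node cluster is up. A small sketch that checks a jps-style listing for the expected daemon names (the sample output above is inlined here in place of a live jps call):

```shell
# Verify that the five Hadoop daemons appear in a jps-style listing.
# The sample output from the transcript is used instead of calling jps live.
jps_out='26161 DataNode
26021 NameNode
26344 SecondaryNameNode
26890 Jps
26492 ResourceManager
26767 NodeManager'
missing=0
for d in DataNode NameNode SecondaryNameNode ResourceManager NodeManager; do
  echo "$jps_out" | grep -qw "$d" || { echo "missing: $d"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "all daemons running"
```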
Setting up Hadoop 2.7.3 + Hive 2.1.1 and MySQL (Configuring Hadoop), Part 1
Source: http://blog.csdn.net/roy_88/article/details/54944672