Installation

2017-07-07

Steps to configure the network interface:

Step 1:

ifconfig -a

Find the hardware (MAC) address of the network card.

Step 2:

cd /etc/sysconfig/network-scripts/
cp ifcfg-eth0 ifcfg-eth1
# modify DEVICE and HWADDR
vi ifcfg-eth1
service network restart
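
For reference, a minimal ifcfg-eth1 might look like the sketch below; the MAC address and gateway are placeholders and must be replaced with the value reported by ifconfig -a and your own network settings:

# /etc/sysconfig/network-scripts/ifcfg-eth1 (example values)
DEVICE=eth1
HWADDR=00:0C:29:XX:XX:XX
TYPE=Ethernet
ONBOOT=yes
BOOTPROTO=static
IPADDR=192.168.0.1
NETMASK=255.255.255.0
GATEWAY=192.168.0.254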

Change the yum repository
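
The notes leave this step empty; one common approach on CentOS 6 (a sketch, assuming internet access and using the Aliyun mirror as an example) is:

cd /etc/yum.repos.d/
# back up the stock repo file
mv CentOS-Base.repo CentOS-Base.repo.bak
# fetch a mirror repo file (Aliyun shown here; any reachable mirror works)
wget -O CentOS-Base.repo http://mirrors.aliyun.com/repo/Centos-6.repo
yum clean all
yum makecache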

Disable the firewall

1. Permanent (persists across reboots)

chkconfig iptables on    # enable iptables at boot
chkconfig iptables off   # disable iptables at boot

2. Temporary (current session only)

service iptables start   # start the firewall for the current session
service iptables stop    # stop the firewall for the current session
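
To confirm the current state and the boot-time setting:

service iptables status
chkconfig --list iptables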

Hadoop cluster installation

Compile the Hadoop source code

The native libraries bundled with the official Hadoop binary release are 32-bit, while the production servers are 64-bit, so Hadoop has to be compiled locally before use. Maven is used for the build.
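
A sketch of the build, assuming the hadoop-2.6.2 source tarball is unpacked and the prerequisites from Hadoop's BUILDING.txt (JDK, Maven, protobuf 2.5.0, CMake) are installed:

cd hadoop-2.6.2-src
mvn package -Pdist,native -DskipTests -Dtar
# the 64-bit build, including native libraries, ends up under hadoop-dist/target/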

Change the hostname

vi /etc/sysconfig/network

Set HOSTNAME=hadoop01

Do this on all three machines, naming them hadoop01, hadoop02, and hadoop03 respectively.

Reboot after the change:

reboot
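
On the first node, for example, /etc/sysconfig/network ends up containing (NETWORKING=yes is the stock default):

NETWORKING=yes
HOSTNAME=hadoop01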

Edit the hosts file

vi /etc/hosts

192.168.0.1 hadoop-node-01 hadoop01
192.168.0.2 hadoop-node-02 hadoop02
192.168.0.3 hadoop-node-03 hadoop03

Configure passwordless SSH login

# generate id_rsa (private key) and id_rsa.pub (public key)
ssh-keygen -t rsa
# copy the public key to the other machines
ssh-copy-id hadoop02
ssh-copy-id hadoop03
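
The start scripts also connect to the local node over SSH, so copying the key to hadoop01 itself avoids a password prompt there; a quick check that the copies worked:

ssh-copy-id hadoop01
# should print the remote hostname without asking for a password
ssh hadoop02 hostname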

Install Hadoop

Directory layout

# data
/export/data
# unpacked distributions
/export/servers
# downloaded install packages
/export/software
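
These directories are not created automatically:

mkdir -p /export/data /export/servers /export/software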

mv hadoop-2.6.2.tar.gz /export/software
cd /export/software
tar -zxvf hadoop-2.6.2.tar.gz -C /export/servers/
cd /export/servers
# create a symlink
ln -s hadoop-2.6.2 hadoop

Configure the environment variables

vi /etc/profile

# set hadoop env
export HADOOP_HOME=/export/servers/hadoop
export PATH=${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:$PATH
# reload the profile so the change takes effect in the current shell
source /etc/profile

Edit the Hadoop configuration files (under /export/servers/hadoop/etc/hadoop)

hadoop-env.sh

export JAVA_HOME=/export/servers/jdk

hdfs-site.xml

<property>
<name>dfs.replication</name>
<value>2</value> <!-- default is 3 -->
</property>

mapred-env.sh

export JAVA_HOME=/export/servers/jdk

mapred-site.xml
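
In the 2.6.x distribution this file does not exist by default; it is created from the shipped template:

cp mapred-site.xml.template mapred-site.xml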

<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>

yarn-site.xml

<property>
<name>yarn.resourcemanager.hostname</name>
<value>hadoop01</value>
</property>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>

core-site.xml

<property>
<name>fs.defaultFS</name>
<value>hdfs://hadoop01:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/export/data/hadoop/tmp</value>
</property>

slaves

hadoop02
hadoop03

After the configuration is complete, distribute it to the other nodes.
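
A sketch of the distribution step, assuming the same /export layout already exists on hadoop02 and hadoop03:

scp -r /export/servers/hadoop-2.6.2 hadoop02:/export/servers/
scp -r /export/servers/hadoop-2.6.2 hadoop03:/export/servers/
scp /etc/profile hadoop02:/etc/profile
scp /etc/profile hadoop03:/etc/profile
# recreate the symlink on each node
ssh hadoop02 "ln -s /export/servers/hadoop-2.6.2 /export/servers/hadoop"
ssh hadoop03 "ln -s /export/servers/hadoop-2.6.2 /export/servers/hadoop"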

Start Hadoop

Start HDFS
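
One step the notes skip: before the very first start, the NameNode has to be formatted (run once, on hadoop01 only):

hdfs namenode -format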

start-dfs.sh
jps

Start MapReduce, i.e. YARN

start-yarn.sh
jps
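
With the configuration above, jps is expected to show roughly the following (an inference from the slaves file and the ResourceManager setting, not output from the original notes):

# on hadoop01: NameNode, SecondaryNameNode, ResourceManager
# on hadoop02 and hadoop03: DataNode, NodeManager
# web UIs on the default ports: http://hadoop01:50070 (HDFS) and http://hadoop01:8088 (YARN)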

Using Hadoop

# put test.txt into the HDFS root directory
hadoop fs -put test.txt /
# list the HDFS root directory
hadoop fs -ls /
# run the bundled wordcount example (the examples jar version matches the installed 2.6.2 release)
hadoop jar /export/servers/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.2.jar wordcount /test.txt /result
hadoop fs -ls /result
hadoop fs -cat /result/part-r-00000

Summary
