Hadoop Development Environment
Hadoop Pseudo-Distributed Environment Setup
1. Set up passwordless SSH login
1. Generate the key pair
ssh-keygen -t rsa    (press Enter at each prompt)
2. Check the hidden .ssh directory
ll -a /root
cd /root/.ssh
3. Create authorized_keys
touch authorized_keys
4. Set permissions to -rw------- (owner read/write only)
chmod 600 authorized_keys
5. Append id_rsa.pub to authorized_keys
cat id_rsa.pub >> authorized_keys
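To confirm passwordless login works, a quick check (assuming you are testing against the local machine):
ssh localhost
# should open a shell without asking for a password; type exit to return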
2. Upload the JDK and configure the JDK environment variables
JAVA_HOME=/opt/jdk1.7.0_80
JRE_HOME=/opt/jdk1.7.0_80/jre
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib
PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:/usr/local/bin
export JAVA_HOME JRE_HOME CLASSPATH PATH
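These lines are typically appended to /etc/profile (or ~/.bash_profile; which file to use is up to you). Reload the profile and verify the JDK is picked up:
source /etc/profile
java -version    # should report java version "1.7.0_80"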
3. Upload Hadoop and configure the Hadoop environment variables
export HADOOP_HOME=/opt/hadoop-2.4.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/lib
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
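After reloading the profile, a quick sanity check that the Hadoop binaries are on the PATH:
source /etc/profile
hadoop version    # should report Hadoop 2.4.1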
4. Edit the Hadoop configuration files
1. Modify hadoop-env.sh
Replace the default export JAVA_HOME=${JAVA_HOME} line with the actual JDK installation path, as shown below.
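For example, with the JDK path used above:
export JAVA_HOME=/opt/jdk1.7.0_80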
2. core-site.xml (core/common parameters)
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop:9000/</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/hadoop-2.4.1/data/</value>
    </property>
</configuration>
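The hostname hadoop used in fs.defaultFS must resolve to this machine, so add an entry to /etc/hosts if it is not already there (the IP address below is only an example):
192.168.1.101   hadoop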
3. hdfs-site.xml (HDFS parameters)
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.permissions</name>
        <value>false</value>
    </property>
</configuration>
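dfs.replication is 1 because a pseudo-distributed cluster has only one DataNode, and dfs.permissions=false disables HDFS permission checks during development. Once the environment variables are loaded, the effective value can be double-checked:
hdfs getconf -confKey dfs.replication    # should print 1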
4. mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
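Note that Hadoop 2.x ships only a template for this file, so it usually has to be created first:
cd /opt/hadoop-2.4.1/etc/hadoop
cp mapred-site.xml.template mapred-site.xml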
5. yarn-site.xml
<configuration>
    <!-- Site specific YARN configuration properties -->
    <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>hadoop</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
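With all files configured, a typical first start looks like this (a sketch; format the NameNode only once, on a fresh data directory):
hdfs namenode -format    # initializes HDFS metadata under /opt/hadoop-2.4.1/data
start-dfs.sh             # starts NameNode, DataNode and SecondaryNameNode
start-yarn.sh            # starts ResourceManager and NodeManager
jps                      # lists the running Java processes to confirm the daemons are up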