
Automated Hadoop Pseudo-Distributed Setup

2018-11-22  john不哭

How to configure Hadoop in pseudo-distributed mode

Purpose of this article
A dead-simple, roughly 99%-automated Hadoop pseudo-distributed deployment. Save each script below as xx.sh, then run sh xx.sh; the shell script handles the environment configuration for you.

References:

  1. Official docs: http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Pseudo-Distributed_Operation
  2. China education network mirrors: see the list under the notes below.
Notes:
  1. All files are modified exactly as the official docs describe; every system file that gets changed is first backed up under the name xxx.backup.
  2. All download sources are switched to China education network mirrors:
    CNNIC: http://mirrors.cnnic.cn/apache/
    Huazhong University of Science and Technology: http://mirrors.hust.edu.cn/apache/
    Beijing Institute of Technology: http://mirror.bit.edu.cn/apache/
    Apache official China mirror list: http://www.apache.org/mirrors/#cn
    Source: https://www.cnblogs.com/jtlgb/p/5702713.html

Known pitfalls:


Environments configured: HDFS (step 1), YARN (step 2), and Hive (step 3).

Detailed steps

1. A fully automated bash script that initializes and starts the HDFS service. Configuration follows the official docs; the HDFS web UI is served at master:50070. (tested)

#Edited by john, if you have any problem contact me at 18689235591@163.com.
#install the java env (the wildcard also pulls in the -devel package)
yum install -y java-1.7*

#follow the /usr/bin/java symlink chain to the real binary
temp=$(which java)
temp=$(ls -l $temp|awk '{print $11}')
temp=$(ls -l $temp|awk '{print $11}')

#keep only the first five path components, yielding the JVM install root
JAVA_HOME=$(echo "export JAVA_HOME="$temp | awk -F '/' '{print $1"/"$2"/"$3"/"$4"/"$5}')

#back up /etc/profile, then append the export line
cp /etc/profile /etc/profile.temp-before-java
echo $JAVA_HOME >> /etc/profile
source /etc/profile

#download the hadoop tarball from a mirror and unpack it
#wget https://www-eu.apache.org/dist/hadoop/common/hadoop-2.8.5/hadoop-2.8.5.tar.gz
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.8.5/hadoop-2.8.5.tar.gz
tar -xzvf ./hadoop-2.8.5.tar.gz
cd ./hadoop-2.8.5

#configure the env in pseudo-distributed mode:
#replace the literal ${JAVA_HOME} in hadoop-env.sh with the resolved path
#(escape the slashes first so the path survives inside sed)
var=$(echo $JAVA_HOME |sed 's#\/#\\\/#g')
sed -i 's/${JAVA_HOME}/fxxxxxxx/g' ./etc/hadoop/hadoop-env.sh
sed -i "s/fxxxxxxx/$var/g" ./etc/hadoop/hadoop-env.sh

#config core-site.xml: back up, strip the empty <configuration> block, append ours
mv ./etc/hadoop/core-site.xml ./etc/hadoop/core-site.xml.backup
cat ./etc/hadoop/core-site.xml.backup|grep -v "configuration" > ./etc/hadoop/core-site.xml
echo "<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>">>./etc/hadoop/core-site.xml

#config hdfs-site.xml the same way
mv ./etc/hadoop/hdfs-site.xml ./etc/hadoop/hdfs-site.xml.backup
cat ./etc/hadoop/hdfs-site.xml.backup|grep -v "configuration" > ./etc/hadoop/hdfs-site.xml
echo "<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>">>./etc/hadoop/hdfs-site.xml

#set up passphraseless ssh to localhost
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys

#format the namenode and start hdfs
./bin/hdfs namenode -format
./sbin/start-dfs.sh
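The JAVA_HOME derivation in the script can look cryptic, so here is the awk path-prefix trick in isolation, run on a sample resolved path (the openjdk path below is made up for illustration):

```shell
# same trick as the script above: splitting on '/' makes field 1 the text before
# the first slash ("export JAVA_HOME="), so gluing fields 1-5 back together
# turns a resolved .../jre/bin/java path into an export line for the JVM root
# (the sample path is hypothetical)
temp=/usr/lib/jvm/java-1.7.0-openjdk/jre/bin/java
line=$(echo "export JAVA_HOME=$temp" | awk -F '/' '{print $1"/"$2"/"$3"/"$4"/"$5}')
echo "$line"
```

This prints `export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk`, which is exactly the line the script appends to /etc/profile.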


Open master:50070 in a browser and you should see:

HDFS-web-ui.png
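If the UI does not come up, a quick way to check whether the HDFS daemons actually started is jps. A minimal sketch, assuming the JDK's jps is on the PATH:

```shell
# look for the NameNode/DataNode JVMs; print a hint either way
if jps 2>/dev/null | grep -qE 'NameNode|DataNode'; then
    echo "HDFS daemons are running"
else
    echo "HDFS daemons not found - check the logs under ./hadoop-2.8.5/logs"
fi
```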

2. Start YARN; the YARN web UI is served at master:8088. (script not yet tested)

#Edited by john, if you have any problem contact me at 18689235591@163.com.
#mapred-site.xml does not ship with hadoop-2.8.5 (only mapred-site.xml.template),
#so appending here creates the file
echo "<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>">>./etc/hadoop/mapred-site.xml
#config yarn-site.xml: back up, strip the empty <configuration> block, append ours
mv ./etc/hadoop/yarn-site.xml ./etc/hadoop/yarn-site.xml.backup
cat ./etc/hadoop/yarn-site.xml.backup|grep -v "configuration" > ./etc/hadoop/yarn-site.xml
echo "<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>">>./etc/hadoop/yarn-site.xml
#start the resourcemanager and nodemanager
./sbin/start-yarn.sh

Then open master:8088:

Yarn-web-ui.png
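A web UI alone does not prove jobs can run. One way to smoke-test YARN is the example jar bundled with the Hadoop distribution; the sketch below assumes the hadoop-2.8.5 directory from step 1 and, like the script above, has not been tested against a live cluster:

```shell
# submit the bundled pi estimator to YARN as a smoke test;
# skip gracefully if the hadoop-2.8.5 directory is not present
if [ -d ./hadoop-2.8.5 ]; then
    cd ./hadoop-2.8.5
    ./bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.5.jar pi 2 10
else
    echo "hadoop-2.8.5 not found - run the step-1 script first"
fi
```

While the job runs it should show up as an application on the master:8088 page.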

3. Hive setup

#download and unpack hive
wget http://mirror.bit.edu.cn/apache/hive/hive-2.3.4/apache-hive-2.3.4-bin.tar.gz
tar -xzvf apache-hive-2.3.4-bin.tar.gz
cd ./apache-hive-2.3.4-bin
#append HIVE_HOME and HADOOP_HOME to /etc/profile (backing it up first)
var=$(pwd)
cp /etc/profile /etc/profile.temp_before_hive
echo "export HIVE_HOME=$var" >> /etc/profile
cd ../hadoop-2.8.5
var=$(pwd)
cp /etc/profile /etc/profile.temp_before_hadoop
echo "export HADOOP_HOME=$var" >> /etc/profile
cd ../apache-hive-2.3.4-bin
source /etc/profile
export PATH=$HIVE_HOME/bin:$PATH
#create the hdfs directories hive expects (-p creates parents and keeps this re-runnable)
$HADOOP_HOME/bin/hadoop fs -mkdir -p  /tmp
$HADOOP_HOME/bin/hadoop fs -mkdir -p  /user/hive/warehouse
$HADOOP_HOME/bin/hadoop fs -chmod g+w /tmp
$HADOOP_HOME/bin/hadoop fs -chmod g+w /user/hive/warehouse
#initialize the embedded derby metastore and open a beeline session
$HIVE_HOME/bin/schematool -dbType derby -initSchema
$HIVE_HOME/bin/beeline -u jdbc:hive2://
Tested successfully.
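Beyond the schematool run, a tiny end-to-end check is to create and query a table through the embedded HiveServer2. The table name here is hypothetical, and the snippet assumes HIVE_HOME was exported by the script above:

```shell
# create a throwaway table and count its rows via the embedded beeline;
# the smoke_test table name is made up for this check
if [ -x "$HIVE_HOME/bin/beeline" ]; then
    "$HIVE_HOME/bin/beeline" -u jdbc:hive2:// \
        -e "CREATE TABLE IF NOT EXISTS smoke_test (id INT); SELECT COUNT(*) FROM smoke_test;"
else
    echo "HIVE_HOME not set - source /etc/profile first"
fi
```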