Spark Standalone Configuration

2017-08-04  LuJingyuan

Install Scala:

After extracting Scala to /usr/local, add its bin directory to PATH and give the working user (mylinux here) ownership of the Hadoop and Scala directories:

export PATH=/usr/local/scala-2.12.3/bin:$PATH
sudo chown -R mylinux:mylinux hadoop-2.8.0/
sudo chown -R mylinux:mylinux ./scala-2.12.3/
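To confirm the PATH change is in effect, a minimal check (the install path is taken from the step above):

```shell
# Prepend the Scala bin directory, as in the install step above
export PATH=/usr/local/scala-2.12.3/bin:$PATH

# The Scala directory should now be the first PATH entry
echo "$PATH" | cut -d: -f1   # → /usr/local/scala-2.12.3/bin
```

In a live shell, `scala -version` should then report version 2.12.3.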

After installing Spark, configure the environment variables:

~$ vim .bashrc

# HADOOP PATH START
export HADOOP_HOME=/usr/local/hadoop-2.8.0
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
# HADOOP PATH END

# SPARK PATH START
export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin
# SPARK PATH END

# Scala Path Start
export SCALA_HOME=/usr/local/scala-2.12.3
# Scala Path End
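Once saved, the file must be re-read by the current shell (normally via `source ~/.bashrc`); a quick sketch with the same exports applied inline, mirroring the .bashrc entries above:

```shell
# Same exports as in .bashrc above, applied directly to this shell
export HADOOP_HOME=/usr/local/hadoop-2.8.0
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin
export SCALA_HOME=/usr/local/scala-2.12.3

# Confirm the variables resolved
echo "HADOOP_HOME=$HADOOP_HOME"
echo "SPARK_HOME=$SPARK_HOME"
```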

Configure conf/spark-env.sh:

cd /usr/local/spark/conf/
cp spark-env.sh.template spark-env.sh
vim spark-env.sh

export SPARK_MASTER_HOST=localhost
export SPARK_MASTER_PORT=7066
export SPARK_LOCAL_IP=127.0.0.10
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
export SPARK_WORKER_MEMORY=512M
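The host and port above determine the standalone master URL that workers and spark-submit connect to; a small sketch of how it is assembled (7066 is this article's choice; Spark's usual default is 7077):

```shell
# Values copied from spark-env.sh above
SPARK_MASTER_HOST=localhost
SPARK_MASTER_PORT=7066

# Resulting standalone master URL
echo "spark://${SPARK_MASTER_HOST}:${SPARK_MASTER_PORT}"   # → spark://localhost:7066
```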

Run Spark

Start the HDFS and YARN daemons, then verify with jps:

start-dfs.sh
start-yarn.sh
jps
3761 ResourceManager
3383 DataNode
3576 SecondaryNameNode
3883 NodeManager
3260 NameNode
3934 Jps
Then start the Spark master and worker (Spark's own start-all.sh under sbin/, not Hadoop's):

cd /usr/local/spark/
./sbin/start-all.sh
jps
3761 ResourceManager
4417 Jps
3383 DataNode
4344 Worker
3576 SecondaryNameNode
4250 Master
3883 NodeManager
3260 NameNode
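A scripted check that both Spark daemons appear in the process list; the jps output is inlined here for illustration (in a live session, pipe jps itself into grep):

```shell
# Sample jps output, as in the listing above
jps_output="3761 ResourceManager
4417 Jps
3383 DataNode
4344 Worker
3576 SecondaryNameNode
4250 Master
3883 NodeManager
3260 NameNode"

# Count the Spark daemons; 2 means Master and Worker are both up
echo "$jps_output" | grep -cE 'Master|Worker'   # → 2
```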

Use the netstat -nlt command to check the node's listening ports.

The master's web UI defaults to: http://localhost:8080/
