Configuring a local Spark history server on Ubuntu 16.04

2018-09-07  WJXZ
1. Install Spark
2. Configure spark-defaults.conf
cd /usr/share/spark/spark-2.2.2-bin-hadoop2.7/conf
sudo cp spark-defaults.conf.template  spark-defaults.conf
sudo vim spark-defaults.conf
# Append at the end of the file
spark.eventLog.enabled           true
spark.eventLog.dir               file:/usr/share/spark/spark-2.2.2-bin-hadoop2.7/spark-logs
spark.history.fs.logDirectory    file:/usr/share/spark/spark-2.2.2-bin-hadoop2.7/spark-logs
# Save and quit with :wq
3. Create the spark-logs directory
cd /usr/share/spark/spark-2.2.2-bin-hadoop2.7
sudo mkdir spark-logs
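One caveat, assuming Spark jobs are run by a regular (non-root) user: the directory created with sudo above is owned by root, so the event-log writer may fail with a permission error. A sketch of one way to fix this, handing the directory to the current login user:

```shell
# spark-logs was created with sudo and is therefore owned by root.
# Give it to the account that will actually run Spark jobs
# ($USER is the current login; adjust if Spark runs as another user).
sudo chown -R "$USER":"$USER" /usr/share/spark/spark-2.2.2-bin-hadoop2.7/spark-logs
```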
4. Test
# Run
pyspark

Visit http://localhost:4040. If the Spark web UI loads, the configuration works. If several SparkContexts are running at the same time, the ports increment upward starting from 4040. Note that port 4040 is the web UI of the currently running application; the history server, which shows finished applications, is a separate process.
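The configuration above only enables event logging; to browse the logs of completed applications you also need to start the history-server daemon that ships with the Spark distribution. A sketch, assuming the same install path as above (the history server listens on port 18080 by default):

```shell
cd /usr/share/spark/spark-2.2.2-bin-hadoop2.7
# Start the daemon; it reads spark.history.fs.logDirectory from
# conf/spark-defaults.conf and serves its UI on port 18080.
sudo ./sbin/start-history-server.sh
# Stop it later with:
# sudo ./sbin/stop-history-server.sh
```

Then visit http://localhost:18080 to see the applications whose event logs were written into spark-logs.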
