Setting up a Spark environment on macOS

2022-03-29  allenhaozi
  1. Install the JDK
  2. Install Scala
  3. Install Spark
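The three steps above can be done by hand (download the tarballs and unpack them under a local directory, as the paths below suggest) or via Homebrew. This is a sketch only; the formula names (`openjdk@11`, `scala`, `apache-spark`) are assumptions about your setup, not part of the original post:

```shell
# Sketch: install the three prerequisites with Homebrew (assumes brew is present).
# Spark 3.1.x runs on Java 8 or 11; openjdk@11 is one workable choice.
brew install openjdk@11
brew install scala
brew install apache-spark

# Verify each tool is reachable.
java -version
scala -version
spark-shell --version
```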
Add the following to ~/.zshrc (or ~/.bash_profile), then reload with `source ~/.zshrc`:

# scala
export SCALA_HOME="/Users/mahao/local/scala/scala3-3.1.1"

# spark
export SPARK_HOME="/Users/mahao/local/spark-3.1.3-bin-hadoop2.7"

export PATH="${SCALA_HOME}/bin:${SPARK_HOME}/bin:${PATH}"

Test the installation by launching the Spark shell:

$ spark-shell
22/03/29 14:19:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://bogon:4040
Spark context available as 'sc' (master = local[*], app id = local-1648534779439).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.1.3
      /_/
         
Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_231)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 
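Once the shell is up, a quick smoke test confirms the local cluster actually runs jobs. This is a sketch to type at the `scala>` prompt; `sc` is the SparkContext the shell creates for you:

```scala
// Type at the scala> prompt; `sc` is provided by spark-shell.
// Distribute 1..100 across the local cores and sum the partitions.
sc.parallelize(1 to 100).reduce(_ + _)   // res0: Int = 5050
```

Note that the banner reports Scala 2.12.10 even though Scala 3 was installed above: Spark 3.1.x ships with its own bundled Scala 2.12, independent of the standalone Scala on your PATH.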
