Download scala-2.12.3.tgz
tar -zvxf scala-2.12.3.tgz -C /usr/local/
su root
cd /usr/local/
mv scala-2.12.3/ scala/
Edit the environment variables in /etc/profile:
SCALA_HOME=/usr/local/scala
PATH=$PATH:$SCALA_HOME/bin
export SCALA_HOME PATH
Source /etc/profile (or open a new shell), then run scala to check that the REPL starts.
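As a quick sanity check, the REPL should report the installed version and evaluate a simple expression, for example (any expression will do):

println(scala.util.Properties.versionString)   // prints "version 2.12.3"
val squares = (1 to 5).map(n => n * n)         // Vector(1, 4, 9, 16, 25)
println(squares.sum)                           // 55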
Preferences -> Plugins -> Scala
Install it.
Close IntelliJ IDEA and restart it.
Go to View -> Tool Window -> Scala
You can also run scala directly in IntelliJ's terminal.
To configure Scala in IntelliJ, first create a new Scala project.
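For example, a minimal object in the new project (the file and object names here are just placeholders) confirms that the project's Scala SDK is wired up correctly:

// src/main/scala/HelloScala.scala (hypothetical path)
object HelloScala {
  def main(args: Array[String]): Unit = {
    println(s"Hello from Scala ${scala.util.Properties.versionNumberString}")
  }
}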
Download sbt-0.13.13.zip (1.1 MB)
tar -zxvf /Users/linkedme/Downloads/sbt-0.13.13.zip -C /usr/local/
su root
cd /usr/local/
mv sbt-launcher-packaging-0.13.13/ sbt/
Edit the environment variables in /etc/profile:
SBT_HOME=/usr/local/sbt
PATH=$PATH:$SBT_HOME/bin
export SBT_HOME PATH
Run sbt to verify.
Note: there were quite a few pitfalls here. For example, I initially installed sbt-0.13.16.zip (60.5 MB), which produced:
[warn] No sbt.version set in project/build.properties, base directory: /usr/local
java.lang.IllegalArgumentException: requirement failed: Source file '/usr/local/Homebrew/Library/Homebrew/os/mac/pkgconfig/fuse/fuse.pc' does not exist. This could never be resolved, so I switched versions…
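Also note that the "No sbt.version set" warning above simply means sbt was launched outside a project directory. Running it inside a project that has project/build.properties (containing sbt.version=0.13.13) and a minimal build.sbt avoids it; a sketch of such a build.sbt, with an illustrative project name:

// build.sbt (minimal sketch)
name := "sbt-demo"

version := "0.1"

scalaVersion := "2.12.3"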
Preferences -> Plugins -> SBT
Install it.
Close IntelliJ IDEA and restart it.
Go to View -> Tool Window -> SBT
You can also run sbt directly in IntelliJ's terminal.
Download spark-2.1.0-bin-hadoop2.7.tgz
tar -zvxf /Users/linkedme/Downloads/spark-2.1.0-bin-hadoop2.7.tgz -C /usr/local/
su root
cd /usr/local/
mv spark-2.1.0-bin-hadoop2.7/ spark/
Edit spark-env.sh
mv spark-env.sh.template spark-env.sh
vi spark-env.sh
spark-env.sh is as follows:
export SCALA_HOME=/usr/local/scala/
export SPARK_MASTER_IP=192.168.31.104
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home
export SPARK_WORKER_MEMORY=512m
export master=spark://192.168.31.104:7077
Edit the slaves file (rename the template first) and add the worker host:
mv slaves.template slaves
vi slaves
Add the following entry:
master
Edit spark-defaults.conf
mv spark-defaults.conf.template spark-defaults.conf
vi spark-defaults.conf
spark-defaults.conf is as follows:
spark.master spark://localhost:7077
#spark.eventLog.enabled true
#spark.eventLog.dir hdfs://namenode:8021/directory
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.driver.memory 5g
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
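For reference, the same settings can also be applied programmatically in application code through SparkConf; this is only a sketch and is not needed for the spark-shell steps below:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Mirrors spark-defaults.conf: master URL and Kryo serializer
val conf = new SparkConf()
  .setMaster("spark://localhost:7077")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

val spark = SparkSession.builder().config(conf).getOrCreate()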
Edit the environment variables in /etc/profile:
export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin
In spark/sbin/, run ./start-all.sh
Use jps to check that the Master and Worker processes started correctly.
Open the Spark Master web UI to check the cluster status:
http://localhost:8080/
cd ../bin/
Run ./spark-shell
You can now run Spark shell commands normally!
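For example, a small word count pasted into the shell (sc is the SparkContext that spark-shell creates automatically; the input lines are made up):

val lines = sc.parallelize(Seq("hello spark", "hello scala"))
val counts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
counts.collect().foreach(println)   // contains (hello,2), (spark,1), (scala,1)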
Stop Spark
Go to spark's sbin directory and run:
./stop-all.sh