25. Spark: download, installation, and usage
Install Scala
Upload the installation package
Extract it
Configure the Scala environment variables:
export SCALA_HOME=/opt/modules/scala-2.11.4
export PATH=$PATH:$SCALA_HOME/bin
Verify that Scala was installed successfully
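The steps above can be sketched as follows. This is a deployment fragment, not runnable outside the cluster; the archive name and the use of /etc/profile for the exports are assumptions about this setup:

```shell
# Assumptions: package uploaded to /opt/modules, exports kept in /etc/profile
cd /opt/modules
tar -zxvf scala-2.11.4.tgz      # extract the uploaded package
# append the two export lines above to /etc/profile, then reload it:
source /etc/profile
scala -version                  # should report Scala 2.11.4
```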
Distribute Scala to node2, node3, and node4:
scp -r scala-2.11.4/ node2:/opt/modules/
scp -r scala-2.11.4/ node3:/opt/modules/
scp -r scala-2.11.4/ node4:/opt/modules/
On node2, node3, and node4, configure the Scala environment variables and make them take effect:
#scala
export SCALA_HOME=/opt/modules/scala-2.11.4
export PATH=$PATH:$SCALA_HOME/bin
Spark download URL: https://archive.apache.org/dist/spark/spark-1.5.1/
Upload the installation package to the cluster
Extract the installation package
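A sketch of the upload-and-extract step; the archive name follows from the download link above, and the target directory matches the SPARK_HOME used below:

```shell
# Deployment fragment; assumes the .tgz was uploaded to /opt/modules
cd /opt/modules
tar -zxvf spark-1.5.1-bin-hadoop2.6.tgz
```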
Configure the Spark environment variables:
#spark
export SPARK_HOME=/opt/modules/spark-1.5.1-bin-hadoop2.6
export PATH=$PATH:$SPARK_HOME/bin
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
Make the environment variables take effect
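Assuming the exports were added to /etc/profile, reloading it makes them visible in the current shell:

```shell
source /etc/profile   # assumption: the exports live in /etc/profile
echo $SPARK_HOME      # should print /opt/modules/spark-1.5.1-bin-hadoop2.6
```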
Modify the Spark configuration files
Edit the spark-env.sh file:
vim spark-env.sh
Add the following lines:
export JAVA_HOME=/opt/modules/jdk1.8.0_65
export SCALA_HOME=/opt/modules/scala-2.11.4
export SPARK_MASTER_IP=node1
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/opt/modules/hadoop-2.6.0/etc/hadoop
Edit the slaves file:
vim slaves
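The slaves file lists the worker hostnames, one per line. Given that the package is distributed to node2, node3, and node4 below, the file presumably contains:

```
node2
node3
node4
```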
Distribute the Spark package to node2, node3, and node4:
scp -r spark-1.5.1-bin-hadoop2.6/ node2:/opt/modules/
scp -r spark-1.5.1-bin-hadoop2.6/ node3:/opt/modules/
scp -r spark-1.5.1-bin-hadoop2.6/ node4:/opt/modules/
Then configure the Spark environment variables on node2, node3, and node4:
#spark
export SPARK_HOME=/opt/modules/spark-1.5.1-bin-hadoop2.6
export PATH=$PATH:$SPARK_HOME/bin
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
In the sbin directory under the Spark install, run ./start-all.sh
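A sketch of the startup step; `jps` as a follow-up check is my addition, not part of the original notes:

```shell
cd $SPARK_HOME/sbin
./start-all.sh   # starts the Master on node1 and a Worker on each host in slaves
jps              # on node1 this should now list a Master process
```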
Open http://node1:8080/ in a browser to view the Spark master web UI
Start spark-shell
Startup succeeded!
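To run the shell against the standalone cluster rather than in local mode, it can be pointed at the master (7077 is the default standalone master port; the exact invocation is an assumption about this setup):

```shell
# Hypothetical invocation: attach spark-shell to the standalone master
spark-shell --master spark://node1:7077
```

At the scala> prompt, a one-liner such as `sc.parallelize(1 to 100).sum` (which returns 5050.0) is a quick way to confirm the cluster actually executes jobs.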