Spark Standalone Deployment (Single Machine)
Another quick note, this time documenting a single-machine deployment of Spark.
By the end you should be able to run one of Spark's bundled examples, though some parameters may still need tuning.
This guide assumes you already have Hadoop deployed.
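A quick sanity check before continuing, as a minimal sketch (assuming the HDFS daemons are already running on this machine):

jps                # should list NameNode and DataNode, among other daemons
hdfs dfs -ls /     # confirms HDFS is reachable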
First, download Scala and Spark:
Scala: http://www.scala-lang.org/download/
Spark: http://spark.apache.org/downloads.html
Install Scala first:
Installing Scala is straightforward: unpack the tarball, set a couple of environment variables, and it is ready to use. (Note: Spark 1.3.1, used below, is built against Scala 2.10.x, so a 2.10 release is a safer choice than 2.9.3.)

Edit /etc/profile:

export SCALA_HOME=/usr/lib/scala-2.9.3
export PATH=$PATH:$SCALA_HOME/bin

Test it:

source /etc/profile
scala -version
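For completeness, the unpack step itself might look like the sketch below; the tarball name and target directory are assumptions, so adjust them to the release you actually downloaded:

tar -zxf scala-2.9.3.tgz -C /usr/lib    # unpack into /usr/lib/scala-2.9.3
ls /usr/lib/scala-2.9.3/bin             # should contain scala, scalac, etc.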
Once Scala works, move on to Spark. It likewise needs no real installation; the key part is the configuration:
Edit /etc/profile (note that SPARK_HOME must be defined before SPARK_EXAMPLES_JAR, since the latter references it):

export SPARK_HOME=/usr/local/spark
export SPARK_EXAMPLES_JAR=${SPARK_HOME}/lib/spark-examples-1.3.1-hadoop2.6.0.jar
export PATH=$PATH:${SPARK_HOME}/bin

Create spark-env.sh:

cp spark-env.sh.template spark-env.sh

Add the following to it:

export SPARK_LAUNCH_WITH_SCALA=0
export SPARK_LIBRARY_PATH=${SPARK_HOME}/lib
export SCALA_LIBRARY_PATH=${SPARK_HOME}/lib
export SPARK_MASTER_WEBUI_PORT=18080
export SPARK_MASTER_IP=localhost    # must be set, and must stay consistent with spark.master in spark-defaults.conf
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_PORT=7078
export SPARK_WORKER_WEBUI_PORT=18081
export SPARK_WORKER_DIR=${SPARK_HOME}/work
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export SPARK_LOCAL_IP=2.2.2.2       # replace with this machine's actual address
export SPARK_LOCAL_DIRS=${SPARK_HOME}/data

Create spark-defaults.conf:

cp spark-defaults.conf.template spark-defaults.conf

Add the following to it:

spark.master=spark://localhost:7077
spark.eventLog.enabled=true
spark.eventLog.dir=/user/spark/applicationHistory
spark.yarn.historyServer.address=http://localhost:19888
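One easy pitfall: with event logging enabled, the directory behind spark.eventLog.dir must already exist, or applications will fail at startup. Assuming the path above resolves to HDFS (which it will when HADOOP_CONF_DIR points at a cluster whose default filesystem is HDFS), creating it might look like:

hdfs dfs -mkdir -p /user/spark/applicationHistory
hdfs dfs -chown -R spark /user/spark    # assumption: jobs run as a "spark" user; adjust to your setup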
With everything in place, try starting the Spark services:
${SPARK_HOME}/sbin/start-master.sh
${SPARK_HOME}/sbin/start-slaves.sh

If they fail to start, some settings used by spark-class may not be set up correctly. Once both are running, try a bundled example:

${SPARK_HOME}/bin/run-example SparkPi 10

It should end with output like:

Pi is roughly 3.14366
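To verify the job really ran against the standalone master (run-example defaults to local mode unless MASTER is set), a hedged alternative is to submit the same example explicitly; the class name and jar path below match the Spark 1.3.1 layout configured earlier, but double-check them against your unpacked distribution:

${SPARK_HOME}/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://localhost:7077 \
  ${SPARK_EXAMPLES_JAR} 10

You can also confirm the master and worker are up via jps (look for Master and Worker processes), or by opening the master web UI at http://localhost:18080, the port set by SPARK_MASTER_WEBUI_PORT above.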
That covers the basic Spark configuration and running a bundled example; more examples and tests will follow in later posts.