There has never been much material online about installing Spark, unfortunately. The language-environment setup below will vary by system; the focus here is the Spark-specific part.
1. Install JDK 1.7

$ yum search openjdk-devel
$ sudo yum install java-1.7.0-openjdk-devel.x86_64
$ /usr/sbin/alternatives --config java
$ /usr/sbin/alternatives --config javac
$ sudo vim /etc/profile

# add the following lines at the end (adjust the JDK path to your installed version)
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.19.x86_64
export JRE_HOME=$JAVA_HOME/jre
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

# save and exit vim
# make the profile take effect immediately
$ source /etc/profile

# test
$ java -version
2. Install Scala 2.9.3

Spark 0.7.2 depends on Scala 2.9.3, so we must install Scala 2.9.3. Download scala-2.9.3.tgz and save it to your home directory.

$ tar -zxf scala-2.9.3.tgz
$ sudo mv scala-2.9.3 /usr/lib
$ sudo vim /etc/profile

# add the following lines at the end
export SCALA_HOME=/usr/lib/scala-2.9.3
export PATH=$PATH:$SCALA_HOME/bin

# save and exit vim
# make the profile take effect immediately
$ source /etc/profile

# test
$ scala -version
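Beyond `scala -version`, a one-file smoke test confirms that both the compiler and the runtime work. This is just an illustrative sketch; the file name and object name are arbitrary:

```scala
// HelloScala.scala -- minimal smoke test for the Scala install
object HelloScala {
  // the running Scala library's version string, e.g. "2.9.3"
  def version: String = scala.util.Properties.versionNumberString

  def main(args: Array[String]): Unit =
    println("Scala " + version + " is working")
}
```

Compile and run it with `scalac HelloScala.scala && scala HelloScala`; if both steps succeed, the toolchain is set up correctly.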
3. Download the prebuilt Spark

Download the prebuilt package spark-0.7.2-prebuilt-hadoop1.tgz and save it to your home directory.

4. Local mode

4.1 Extract

$ tar -zxf spark-0.7.2-prebuilt-hadoop1.tgz

4.2 Set the SPARK_EXAMPLES_JAR environment variable

$ vim ~/.bash_profile

# add the following line at the end
export SPARK_EXAMPLES_JAR=$HOME/spark-0.7.2/examples/target/scala-2.9.3/spark-examples_2.9.3-0.7.2.jar

# save and exit vim
# make the bash profile take effect immediately
$ source ~/.bash_profile

4.3 (Optional) Set the SPARK_HOME environment variable and add SPARK_HOME/bin to PATH

$ vim ~/.bash_profile

# add the following lines at the end
export SPARK_HOME=$HOME/spark-0.7.2
export PATH=$PATH:$SPARK_HOME/bin

# save and exit vim
# make the bash profile take effect immediately
$ source ~/.bash_profile

4.4 Now you can run SparkPi

$ cd ~/spark-0.7.2
$ ./run spark.examples.SparkPi local
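SparkPi estimates π by Monte Carlo sampling: it throws random points into the unit square and counts how many land inside the quarter circle, so the inside ratio approaches π/4. A plain-Scala sketch of the same computation, with no Spark involved (the object name and the fixed seed are just choices for this example):

```scala
object PiSketch {
  // estimate pi from `samples` random points; a fixed seed keeps the run reproducible
  def estimatePi(samples: Int, seed: Long = 42L): Double = {
    val rnd = new scala.util.Random(seed)
    val inside = (1 to samples).count { _ =>
      val x = rnd.nextDouble()
      val y = rnd.nextDouble()
      x * x + y * y < 1.0   // the point falls inside the quarter circle
    }
    4.0 * inside.toDouble / samples
  }

  def main(args: Array[String]): Unit =
    println("Pi is roughly " + estimatePi(1000000))
}
```

With a million samples the estimate lands within about 0.01 of π; the Spark version does the same thing, but spreads the sampling across workers.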
5. Cluster mode

5.1 Install Hadoop

Use VMware Workstation to create three CentOS virtual machines with the hostnames master, slave01, and slave02. Set up passwordless SSH login, install Hadoop, then start the Hadoop cluster. Reference: a fully distributed, high-reliability hadoop2.2 installation guide.

5.2 Scala

Install Scala 2.9.3 on all three machines, following the steps in Section 2. The JDK was already installed along with Hadoop.
5.3 Install and configure Spark on the master

Extract:

$ tar -zxf spark-0.7.2-prebuilt-hadoop1.tgz

Set the SPARK_EXAMPLES_JAR environment variable:

$ vim ~/.bash_profile

# add the following line at the end
export SPARK_EXAMPLES_JAR=$HOME/spark-0.7.2/examples/target/scala-2.9.3/spark-examples_2.9.3-0.7.2.jar

# save and exit vim
# make the bash profile take effect immediately
$ source ~/.bash_profile

Set SCALA_HOME in conf/spark-env.sh:

$ cd ~/spark-0.7.2/conf
$ mv spark-env.sh.template spark-env.sh
$ vim spark-env.sh

# add the following line
export SCALA_HOME=/usr/lib/scala-2.9.3

# save and exit
In conf/slaves, add the hostname of each Spark worker, one per line:

$ vim slaves

slave01
slave02

# save and exit

(Optional) Set the SPARK_HOME environment variable and add SPARK_HOME/bin to PATH:

$ vim ~/.bash_profile

# add the following lines at the end
export SPARK_HOME=$HOME/spark-0.7.2
export PATH=$PATH:$SPARK_HOME/bin

# save and exit vim
# make the bash profile take effect immediately
$ source ~/.bash_profile
5.4 Install and configure Spark on all workers

Since the Spark directory on the master is already configured, simply copy it to every worker. Note that Spark must live at the same path on all three machines, because the master logs in to each worker to run commands and assumes the worker's Spark path is identical to its own.

$ cd
$ scp -r spark-0.7.2 dev@slave01:~
$ scp -r spark-0.7.2 dev@slave02:~

On each worker, set the SPARK_EXAMPLES_JAR environment variable as in Section 5.3. The configuration files need no further changes, since they were copied from the master already configured.
5.5 Start the Spark cluster

On the master, run:

$ cd ~/spark-0.7.2
$ bin/start-all.sh

Check that the processes started:

$ jps
11055 Jps
2313 SecondaryNameNode
2409 JobTracker
2152 NameNode
4822 Master

5.6 Run SparkPi on the cluster

$ cd ~/spark-0.7.2
$ ./run spark.examples.SparkPi spark://master:7077
(Optional) Run the bundled examples SparkLR and SparkKMeans:

# Logistic Regression
$ ./run spark.examples.SparkLR spark://master:7077

# k-means
$ ./run spark.examples.SparkKMeans spark://master:7077 ./kmeans_data.txt 2 1
5.7 Read a file from HDFS and run WordCount

$ cd ~/spark-0.7.2
$ hadoop fs -put README.md .
$ MASTER=spark://master:7077 ./spark-shell

scala> val file = sc.textFile("hdfs://master:9000/user/dev/README.md")
scala> val count = file.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_+_)
scala> count.collect()
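The spark-shell pipeline above is ordinary map-reduce over words. The same logic on a plain Scala collection makes each step visible without a cluster; here `groupBy` plus a per-group sum stands in for Spark's `reduceByKey` (the object name and sample lines are just for illustration):

```scala
object WordCountSketch {
  // count word occurrences across lines, mirroring flatMap/map/reduceByKey
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))                          // split every line into words
      .map(word => (word, 1))                         // pair each word with a count of 1
      .groupBy(_._1)                                  // group the (word, 1) pairs by word
      .map { case (w, ps) => (w, ps.map(_._2).sum) }  // sum the 1s for each word

  def main(args: Array[String]): Unit =
    println(wordCount(Seq("spark is fast", "spark is fun")))
}
```

In Spark the grouping and summing happen in a distributed shuffle, but the per-word result is the same.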
5.8 Stop the Spark cluster

$ cd ~/spark-0.7.2
$ bin/stop-all.sh