Spark Installation and Deployment
Published: 2019-02-28


I. Download Scala and Spark
[root@master opt]# wget http://downloads.lightbend.com/scala/2.11.8/scala-2.11.8.tgz
[root@master opt]# wget http://d3kbcqa49mib13.cloudfront.net/spark-2.0.0-bin-hadoop2.7.tgz
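The cloudfront mirror above served old Spark releases at the time of writing and may have been retired since; as an assumption based on the standard Apache archive layout, the same package should also be available from the release archive:
[root@master opt]# wget https://archive.apache.org/dist/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.7.tgz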
II. Install Scala
1. Extract
[root@master opt]# tar -zxvf scala-2.11.8.tgz
2. Configure environment variables
export SCALA_HOME=/opt/scala-2.11.8
export PATH=$PATH:$SCALA_HOME/bin
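The new variables only take effect after the profile is reloaded. A minimal sketch, assuming the two export lines were appended to /etc/profile:
[root@master opt]# source /etc/profile
[root@master opt]# scala -version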
3. Test
[root@master opt]# scala
Welcome to Scala 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_152).
Type in expressions for evaluation. Or try :help.

scala>
III. Install Spark
1. Extract
[root@master opt]# tar -zxvf spark-2.0.0-bin-hadoop2.7.tgz
2. Configure environment variables
export SPARK_HOME=/opt/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
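As with Scala, reload the profile and check that the variable is visible (again assuming the exports went into /etc/profile):
[root@master opt]# source /etc/profile
[root@master opt]# echo $SPARK_HOME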
3. Configure spark-env.sh
export JAVA_HOME=/opt/jdk1.8
export PATH=$PATH:$JAVA_HOME/bin
export SCALA_HOME=/opt/scala-2.11.8
export PATH=$PATH:$SCALA_HOME/bin
export SPARK_HOME=/opt/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
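A fresh distribution ships only a template for this file, so it is normally created from the template first and then edited to contain the exports above (a sketch assuming the stock 2.0.0 directory layout):
[root@master opt]# cd /opt/spark-2.0.0-bin-hadoop2.7/conf
[root@master conf]# cp spark-env.sh.template spark-env.sh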
IV. Start
[root@master sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/spark-2.0.0-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
localhost: \S
localhost: Kernel \r on an \m
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark-2.0.0-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
[root@master sbin]# jps
4128 Jps
4049 Worker
3992 Master
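Besides jps, the standalone Master also serves a web UI, by default at http://master:8080 (the port is the stock default, not taken from this log), and the cluster can be shut down later with the matching script in sbin:
[root@master sbin]# ./stop-all.sh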
V. Test
[root@master ~]# cat test.log
hello go
java
c mysql
[root@master sbin]# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
18/02/03 22:25:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/02/03 22:25:08 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://192.168.0.110:4040
Spark context available as 'sc' (master = local[*], app id = local-1517667907847).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_152)
Type in expressions to have them evaluated.
Type :help for more information.

scala> var file = sc.textFile("/root/test.log");
file: org.apache.spark.rdd.RDD[String] = /root/test.log MapPartitionsRDD[1] at textFile at <console>:24

scala> file.collect
res1: Array[String] = Array(hello go, java, c mysql, "", "")

scala> var file = sc.textFile("hdfs://master/test.log");
file: org.apache.spark.rdd.RDD[String] = hdfs://master/test.log MapPartitionsRDD[3] at textFile at <console>:24

scala> file.collect
res2: Array[String] = Array(hello go, java, c mysql, "", "")
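Beyond collecting the raw lines, a quick word count exercises map and reduce on the same RDD; a minimal sketch typed at the same prompt (output omitted):
scala> file.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _).collect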