[Experience Share] Building, Installing, and Running Spark

Posted on 2017-02-28 10:19:32
  1. Environment



Mac OSX 10.10.3
Java 1.7.0_71
Spark 1.4.0
  2. Build and Install



tar -zxvf spark-1.4.0.tgz
cd spark-1.4.0
./sbt/sbt assembly
  Note: if you have built this source tree before, you must run ./sbt/sbt clean first; only after cleaning can you rebuild.
  3. Run



adeMacBook-Pro:spark-1.4.0 apple$ ./bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/14 11:32:25 INFO SecurityManager: Changing view acls to: apple
15/06/14 11:32:25 INFO SecurityManager: Changing modify acls to: apple
15/06/14 11:32:25 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(apple); users with modify permissions: Set(apple)
15/06/14 11:32:25 INFO HttpServer: Starting HTTP Server
15/06/14 11:32:26 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/14 11:32:26 INFO AbstractConnector: Started SocketConnector@0.0.0.0:61566
15/06/14 11:32:26 INFO Utils: Successfully started service 'HTTP class server' on port 61566.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_71)
Type in expressions to have them evaluated.
Type :help for more information.
15/06/14 11:32:31 INFO SparkContext: Running Spark version 1.4.0
15/06/14 11:32:31 INFO SecurityManager: Changing view acls to: apple
15/06/14 11:32:31 INFO SecurityManager: Changing modify acls to: apple
15/06/14 11:32:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(apple); users with modify permissions: Set(apple)
15/06/14 11:32:31 INFO Slf4jLogger: Slf4jLogger started
15/06/14 11:32:31 INFO Remoting: Starting remoting
15/06/14 11:32:32 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.106:61567]
15/06/14 11:32:32 INFO Utils: Successfully started service 'sparkDriver' on port 61567.
15/06/14 11:32:32 INFO SparkEnv: Registering MapOutputTracker
15/06/14 11:32:32 INFO SparkEnv: Registering BlockManagerMaster
15/06/14 11:32:32 INFO DiskBlockManager: Created local directory at /private/var/folders/s3/llfgz_mx47572r5b4pbk7xm80000gp/T/spark-cf6feb6b-1464-4d54-89f3-8d97bf15205f/blockmgr-b8410cda-aa29-4069-9406-d6155512cd53
15/06/14 11:32:32 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/06/14 11:32:32 INFO HttpFileServer: HTTP File server directory is /private/var/folders/s3/llfgz_mx47572r5b4pbk7xm80000gp/T/spark-cf6feb6b-1464-4d54-89f3-8d97bf15205f/httpd-a1838f08-2ccd-42d2-9419-6e91cb6fdfad
15/06/14 11:32:32 INFO HttpServer: Starting HTTP Server
15/06/14 11:32:32 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/14 11:32:32 INFO AbstractConnector: Started SocketConnector@0.0.0.0:61568
15/06/14 11:32:32 INFO Utils: Successfully started service 'HTTP file server' on port 61568.
15/06/14 11:32:32 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/14 11:32:32 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/14 11:32:32 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/06/14 11:32:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/14 11:32:32 INFO SparkUI: Started SparkUI at http://192.168.1.106:4040
15/06/14 11:32:32 INFO Executor: Starting executor ID driver on host localhost
15/06/14 11:32:32 INFO Executor: Using REPL class URI: http://192.168.1.106:61566
15/06/14 11:32:32 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 61569.
15/06/14 11:32:32 INFO NettyBlockTransferService: Server created on 61569
15/06/14 11:32:32 INFO BlockManagerMaster: Trying to register BlockManager
15/06/14 11:32:32 INFO BlockManagerMasterEndpoint: Registering block manager localhost:61569 with 265.4 MB RAM, BlockManagerId(driver, localhost, 61569)
15/06/14 11:32:32 INFO BlockManagerMaster: Registered BlockManager
15/06/14 11:32:33 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/06/14 11:32:33 INFO SparkILoop: Created sql context..
SQL context available as sqlContext.
scala>
  Reference:
  https://spark.apache.org/docs/latest/
  4. Using the Spark Interactive Shell



1. Run ./bin/spark-shell
2. scala> val data = Array(1, 2, 3, 4, 5) // create the source array

data: Array[Int] = Array(1, 2, 3, 4, 5)
3. scala> val distData = sc.parallelize(data) // distribute data as an RDD

distData: spark.RDD[Int] = spark.ParallelCollection@7a0ec850 (the reported type is an RDD)
4. scala> distData.reduce(_+_) // run a computation on the RDD: sum its elements
12/05/10 09:36:20 INFO spark.SparkContext: Starting job...
5. The final output:
12/05/10 09:36:20 INFO spark.SparkContext: Job finished in 0.076729174 s
res2: Int = 15
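The reduce(_+_) in step 4 is the same associative fold that plain Scala collections provide, so you can try it locally before involving a cluster. A minimal sketch (no Spark required; the object name is illustrative):

```scala
object ReduceDemo {
  def main(args: Array[String]): Unit = {
    val data = Array(1, 2, 3, 4, 5)

    // Same result as distData.reduce(_ + _) in the spark-shell session above:
    // the elements are folded pairwise with +.
    val sum = data.reduce(_ + _)
    println(sum) // prints 15
  }
}
```

In Spark the operation passed to reduce should be associative and commutative, because each partition is reduced independently and the per-partition results are then combined on the driver.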
