Reposted from: http://www.micmiu.com/bigdata/hadoop/hadoop-build-native-library-on-mac-os-x/
The official guide at http://wiki.apache.org/hadoop/HowToContribute notes that the Hadoop native libraries are only supported on *nix platforms: they are widely used on GNU/Linux, but Cygwin and Mac OS X are not supported. A search turned up a patch that makes the native libraries build on Mac OS X; the steps below walk through building them on that platform.
[1] Environment:
Hadoop 2.2.0
Mac OS X 10.9.1
For the detailed dependencies (protoc, cmake, etc.), see the Hadoop 2.2.0 source build notes (http://www.micmiu.com/opensource/hadoop/hadoop-build-source-2-2-0/).
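Before starting, it helps to confirm the build toolchain is actually on the PATH. A minimal sketch (the exact tool list is an assumption based on the dependencies named above; install anything missing, e.g. via Homebrew):

```shell
# Sanity-check the build toolchain. Tool names are assumed from the
# dependency list above (Java, Maven, CMake, protoc).
for tool in java mvn cmake protoc; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "MISSING: $tool"
  fi
done
```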
[2] Building the native libraries on Mac OS X:
1. Check out the Hadoop 2.2.0 source
$ svn co https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0/
2. Apply the patch
The patch is discussed in detail at: https://issues.apache.org/jira/browse/HADOOP-9648
Patch download link: https://issues.apache.org/jira/secure/attachment/12617363/HADOOP-9648.v2.patch
# switch to the Hadoop source root first
$ wget https://issues.apache.org/jira/secure/attachment/12617363/HADOOP-9648.v2.patch
$ patch -p1 < HADOOP-9648.v2.patch
PS: to roll the patch back, run: patch -RE -p1 < HADOOP-9648.v2.patch
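Before applying a patch for real, `patch --dry-run` rehearses it without modifying any files (for Hadoop you would run it from the source root against HADOOP-9648.v2.patch). A small self-contained demonstration of the same rehearse-then-apply workflow, using throwaway files in a temp directory:

```shell
# Rehearse-then-apply pattern for patches, shown on a throwaway file.
tmp=$(mktemp -d)
cd "$tmp"
printf 'hello\n'       > a.txt
printf 'hello world\n' > b.txt
diff -u a.txt b.txt > fix.patch || true  # diff exits non-zero when files differ
patch --dry-run a.txt < fix.patch        # rehearsal: reports the result, changes nothing
patch a.txt < fix.patch                  # real application
cat a.txt                                # a.txt now reads "hello world"
```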
3. Build the native libraries
Run the native build from the Hadoop source root:
$ mvn package -Pdist,native -DskipTests -Dtar
A successful build ends with log output like the following:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.511s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.493s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [0.823s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.561s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.245s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [2.465s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [0.749s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [0.832s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.070s]
[INFO] Apache Hadoop Common .............................. SUCCESS [1:00.030s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [0.285s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.049s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:13.339s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [20.259s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [0.767s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [0.279s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.046s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.239s]
[INFO] hadoop-yarn-api ................................... SUCCESS [7.641s]
[INFO] hadoop-yarn-common ................................ SUCCESS [5.479s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.114s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [1.743s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [6.381s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [0.259s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [0.578s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.303s]
[INFO] hadoop-yarn-client ................................ SUCCESS [0.233s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.062s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.253s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.074s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1.504s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.242s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.172s]
[INFO] hadoop-yarn-project ............................... SUCCESS [1.235s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [3.664s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.183s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [0.495s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [1.296s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [0.580s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.213s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.344s]
[INFO] hadoop-mapreduce .................................. SUCCESS [1.303s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.257s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [9.925s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [0.282s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [0.403s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.283s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [0.197s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [0.241s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [8.249s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [0.492s]
[INFO] Apache Hadoop Client .............................. SUCCESS [0.373s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.133s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [0.439s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [0.596s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.044s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [0.194s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3:44.266s
[INFO] Finished at: Fri Jan 17 10:06:17 CST 2014
[INFO] Final Memory: 66M/123M
[INFO] ------------------------------------------------------------------------
micmiu-mbp:trunk micmiu$
After the build succeeds, the following can be found under <HADOOP source root>/hadoop-dist/target/hadoop-2.2.0/lib/:
micmiu-mbp:lib micmiu$ tree
.
|____.DS_Store
|____native
| |____libhadoop.1.0.0.dylib
| |____libhadoop.a
| |____libhadoop.dylib
| |____libhadooppipes.a
| |____libhadooputils.a
| |____libhdfs.0.0.0.dylib
| |____libhdfs.a
| |____libhdfs.dylib
Then copy the generated native libraries to the corresponding location in your deployment environment and create symlinks:
$ ln -s libhadoop.1.0.0.dylib libhadoop.so
$ ln -s libhdfs.0.0.0.dylib libhdfs.so
Download: http://yun.baidu.com/s/1c0jBZDQ
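The symlink step can be rehearsed safely in a scratch directory. This sketch uses empty stand-in files in place of the real dylibs (the filenames come from the build output above; the deployment path itself is site-specific):

```shell
# Demonstrate the symlink layout with stand-in files in a temp directory.
tmp=$(mktemp -d)
cd "$tmp"
touch libhadoop.1.0.0.dylib libhdfs.0.0.0.dylib  # stand-ins for the built dylibs
ln -s libhadoop.1.0.0.dylib libhadoop.so
ln -s libhdfs.0.0.0.dylib libhdfs.so
ls -l libhadoop.so libhdfs.so                    # each entry shows "-> ....dylib"
```

On a real deployment you would run the two `ln -s` commands inside the directory that holds the copied dylibs.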
[3] References:
http://wiki.apache.org/hadoop/HowToContribute
http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
https://issues.apache.org/jira/browse/HADOOP-9648
https://issues.apache.org/jira/browse/HADOOP-3659