Compiling Hadoop 2.6.1 from Source for 64-bit
I. The problem
The native Hadoop libraries distributed on the Apache website are 32-bit, which causes a problem on 64-bit Linux servers.
Running any hadoop command on a 64-bit server prints the following warning:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
The reason is that the hadoop-2.6.0.tar.gz binary package was compiled on a 32-bit machine, so a 64-bit machine fails to load its native .so libraries. Hadoop falls back to the built-in Java implementations and still works, but to fix this properly we need to compile a 64-bit build ourselves.
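Before rebuilding, you can confirm the bitness mismatch yourself. The sketch below reads byte 5 of the library's ELF header (01 = 32-bit, 02 = 64-bit); the library path is an assumption based on the default tarball layout and your HADOOP_HOME.

```shell
# Hypothetical path: adjust HADOOP_HOME to match your installation.
so="${HADOOP_HOME:-/opt/hadoop-2.6.0}/lib/native/libhadoop.so.1.0.0"
# Byte 5 (offset 4) of an ELF header encodes the class: 01 = 32-bit, 02 = 64-bit.
cls=$(od -An -tx1 -j4 -N1 "$so" 2>/dev/null | tr -d ' ')
case "$cls" in
  01) echo "$so is 32-bit" ;;
  02) echo "$so is 64-bit" ;;
  *)  echo "$so not found or not an ELF file" ;;
esac
```

On Hadoop 2.x you can also run `hadoop checknative -a` to list which native libraries actually load.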
II. Software required to build hadoop 2.6.1
[*] jdk 1.7
[*] gcc 4.4.5 | gcc-c++
[*] maven 3.3.3
[*] protobuf 2.5.0 (Google's serialization library)
[*] cmake 2.8.12.2
[*] make
[*] ant 1.9.6
[*] FindBugs (optional)
Note:
FindBugs is not required for the build and can be skipped.
III. Preparing the build tools
1. Installing the JDK
[*] Unpack: tar -zxvf jdk-7u79-linux-x64.tar.gz
[*] Configure environment variables by editing /etc/profile:
[*] export JAVA_HOME=/opt/jdk1.7.0_79
[*] export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
[*] export PATH=$PATH:$JAVA_HOME/bin
[*] source /etc/profile
[*] Run java -version to verify the JDK installed correctly.
2. Installing gcc
Most Linux distributions ship with gcc, so before installing, check whether it is already present:
gcc -v
If that prints version information, gcc is already installed. Otherwise install both compilers:
yum install gcc
yum install gcc-c++
3. Installing maven
[*] Unpack: tar -zxvf apache-maven-3.3.3-bin.tar.gz
[*] Configure environment variables by editing /etc/profile:
[*] export MAVEN_HOME=/opt/apache-maven-3.3.3
[*] export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin
[*] source /etc/profile
[*] Run mvn -version to verify maven installed correctly.
4. Installing protobuf
[*] Unpack: tar -zxvf protobuf-2.5.0.tar.gz
[*] Enter the unpacked directory, e.g. cd /opt/protobuf-2.5.0/
[*] From that directory, run:
[*] ./configure --prefix=/opt/protobuf
[*] make
[*] make check
[*] make install
[*] After the build succeeds: export PATH=/opt/protobuf/bin:$PATH
[*] Run protoc --version to verify protoc installed correctly.
5. Installing cmake
Install from the package manager:
yum install cmake
yum install openssl-devel
yum install ncurses-devel
or build from source:
[*] tar -zxvf cmake-2.8.12.2.tar.gz
[*] Enter the unpacked directory, e.g. cd /opt/cmake-2.8.12.2/
[*] From that directory, run:
[*] ./bootstrap
[*] make
[*] make install
[*] Run cmake -version to verify cmake installed correctly.
6. Installing make
yum install make
Verify: make --version
7. Installing ant
[*] Unpack: tar -zxvf apache-ant-1.9.6-bin.tar.gz
[*] Configure environment variables by editing /etc/profile:
[*] export ANT_HOME=/opt/apache-ant-1.9.6
[*] export PATH=$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin
[*] source /etc/profile to reload the modified environment
[*] Run ant -version to verify ant installed correctly.
8. Installing other required packages
[*] Install the autotools suite:
[*] yum install autoconf automake libtool
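With everything above installed, a quick loop can confirm each build tool is actually on the PATH before you kick off a long build (a convenience sketch, not part of the official procedure; FindBugs is optional and omitted):

```shell
# Report which required build tools are present on PATH.
for t in java mvn protoc cmake make ant gcc g++ autoconf automake libtool; do
  if command -v "$t" >/dev/null 2>&1; then
    echo "ok:      $t"
  else
    echo "MISSING: $t"
  fi
done
```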
9. Add a dependency to the pom.xml in hadoop-common-project (required for hadoop-2.2.0; hadoop 2.6.x does not need this change)
<dependency>
<groupId>org.mortbay.jetty</groupId>
<artifactId>jetty-util</artifactId>
<scope>test</scope>
</dependency>
10. Prevent java.lang.OutOfMemoryError: Java heap space before compiling
Run the following command:
export MAVEN_OPTS="-Xms256m -Xmx512m"
IV. Compiling hadoop 2.6.1
[*] Download the source package hadoop-2.6.1-src.tar.gz from the Apache website.
[*] Unpack it: tar -zxvf hadoop-2.6.1-src.tar.gz
[*] Enter the unpacked source directory: cd /opt/hadoop-2.6.1-src/
[*] Run mvn clean package -Pdist,native -DskipTests -Dtar to start the build.
[*] The build downloads many dependencies, so it takes quite a while. When every Hadoop module shows SUCCESS in the reactor summary, the build has succeeded.
[*] The finished distribution hadoop-2.6.1.tar.gz can be found under hadoop-2.6.1-src/hadoop-dist/target/.
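Once the build finishes, it is worth confirming that the freshly built native library really is 64-bit. A small check, assuming the source tree location used above:

```shell
# Path inside the build output tree; adjust if you unpacked the source elsewhere.
so=/opt/hadoop-2.6.1-src/hadoop-dist/target/hadoop-2.6.1/lib/native/libhadoop.so.1.0.0
if [ -f "$so" ]; then
  file "$so"
  # A successful 64-bit build reports something like:
  #   ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked ...
else
  echo "native library not found at $so"
fi
```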
V. Notes
The build downloads many packages over the network; an unstable connection can leave a download incomplete and cause a compile error.
Error 1:
Remote host closed connection during handshake: SSL peer shut down incorrectly.......
Solution: just re-run the build a few more times; dependencies that already downloaded successfully are cached in the local Maven repository, so the build eventually gets through.
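Rather than restarting from the first module after such a failure, Maven can resume the reactor from the module that failed using -rf. The module name below is only a placeholder; take the real one from the "mvn <goals> -rf :<module>" hint Maven prints after a failed build:

```shell
# Resume the build from a failed module instead of rebuilding everything.
# ':hadoop-hdfs' is a hypothetical example; use the module Maven's error names.
mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-hdfs
```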
=====================================================================
Build log:
Reactor Summary:
Apache Hadoop Main ................................. SUCCESS [ 13.582 s]
Apache Hadoop Project POM .......................... SUCCESS [ 9.846 s]
Apache Hadoop Annotations .......................... SUCCESS [ 24.408 s]
Apache Hadoop Assemblies ........................... SUCCESS [ 1.967 s]
Apache Hadoop Project Dist POM ..................... SUCCESS [ 6.443 s]
Apache Hadoop Maven Plugins ........................ SUCCESS [ 20.692 s]
Apache Hadoop MiniKDC .............................. SUCCESS [ 14.250 s]
Apache Hadoop Auth ................................. SUCCESS [ 23.716 s]
Apache Hadoop Auth Examples ........................ SUCCESS [ 13.714 s]
Apache Hadoop Common ............................... SUCCESS
Apache Hadoop NFS .................................. SUCCESS [ 47.127 s]
Apache Hadoop KMS .................................. SUCCESS [ 48.790 s]
Apache Hadoop Common Project ....................... SUCCESS [ 0.316 s]
Apache Hadoop HDFS ................................. SUCCESS
Apache Hadoop HttpFS ............................... SUCCESS
Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS
Apache Hadoop HDFS-NFS ............................. SUCCESS [ 27.438 s]
Apache Hadoop HDFS Project ......................... SUCCESS [ 0.146 s]
hadoop-yarn ........................................ SUCCESS [ 0.165 s]
hadoop-yarn-api .................................... SUCCESS
hadoop-yarn-common ................................. SUCCESS
hadoop-yarn-server ................................. SUCCESS [ 0.827 s]
hadoop-yarn-server-common .......................... SUCCESS
hadoop-yarn-server-nodemanager ..................... SUCCESS
hadoop-yarn-server-web-proxy ....................... SUCCESS [ 17.129 s]
hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 39.350 s]
hadoop-yarn-server-resourcemanager ................. SUCCESS
hadoop-yarn-server-tests ........................... SUCCESS [ 32.941 s]
hadoop-yarn-client ................................. SUCCESS [ 44.664 s]
hadoop-yarn-applications ........................... SUCCESS [ 0.197 s]
hadoop-yarn-applications-distributedshell .......... SUCCESS [ 15.165 s]
hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 9.604 s]
hadoop-yarn-site ................................... SUCCESS [ 0.149 s]
hadoop-yarn-registry ............................... SUCCESS [ 31.971 s]
hadoop-yarn-project ................................ SUCCESS [ 22.195 s]
hadoop-mapreduce-client ............................ SUCCESS [ 0.673 s]
hadoop-mapreduce-client-core ....................... SUCCESS
hadoop-mapreduce-client-common ..................... SUCCESS
hadoop-mapreduce-client-shuffle .................... SUCCESS [ 24.796 s]
hadoop-mapreduce-client-app ........................ SUCCESS
hadoop-mapreduce-client-hs ......................... SUCCESS [ 43.043 s]
hadoop-mapreduce-client-jobclient .................. SUCCESS
hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 9.662 s]
Apache Hadoop MapReduce Examples ................... SUCCESS [ 40.439 s]
hadoop-mapreduce ................................... SUCCESS [ 13.894 s]
Apache Hadoop MapReduce Streaming .................. SUCCESS [ 32.797 s]
Apache Hadoop Distributed Copy ..................... SUCCESS
Apache Hadoop Archives ............................. SUCCESS [ 11.333 s]
Apache Hadoop Rumen ................................ SUCCESS [ 35.122 s]
Apache Hadoop Gridmix .............................. SUCCESS [ 22.939 s]
Apache Hadoop Data Join ............................ SUCCESS [ 17.568 s]
Apache Hadoop Ant Tasks ............................ SUCCESS [ 12.339 s]
Apache Hadoop Extras ............................... SUCCESS [ 18.325 s]
Apache Hadoop Pipes ................................ SUCCESS [ 27.889 s]
Apache Hadoop OpenStack support .................... SUCCESS [ 30.148 s]
Apache Hadoop Amazon Web Services support .......... SUCCESS
Apache Hadoop Client ............................... SUCCESS [ 25.086 s]
Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.657 s]
Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 25.302 s]
Apache Hadoop Tools Dist ........................... SUCCESS [ 23.268 s]
Apache Hadoop Tools ................................ SUCCESS [ 0.156 s]
Apache Hadoop Distribution ......................... SUCCESS
------------------------------------------------------------------------
BUILD SUCCESS
------------------------------------------------------------------------
Total time: 01:17 h
Finished at: 2014-12-29T20:45:54-08:00
Final Memory: 94M/193M
------------------------------------------------------------------------