[Experience Share] Modifying Hadoop Source Code and Applying the Modified Source to a Deployed Hadoop Cluster

Build instructions for Hadoop  

  
----------------------------------------------------------------------------------
  
Requirements:
  

  
* Unix System
  
* JDK 1.7+
  
* Maven 3.0 or later
  
* Findbugs 1.3.9 (if running findbugs)
  
* ProtocolBuffer 2.5.0
  
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
  
* Zlib devel (if compiling native code)
  
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
  
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
  
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
  

  
----------------------------------------------------------------------------------
  
Installing required packages for clean install of Ubuntu 14.04 LTS Desktop:
  

  
* Oracle JDK 1.7 (preferred)
  $ sudo apt-get purge openjdk*
  $ sudo apt-get install software-properties-common
  $ sudo add-apt-repository ppa:webupd8team/java
  $ sudo apt-get update
  $ sudo apt-get install oracle-java7-installer
  
* Maven
  $ sudo apt-get -y install maven
  
* Native libraries
  $ sudo apt-get -y install build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev
  
* ProtocolBuffer 2.5.0 (required)
  $ sudo apt-get -y install libprotobuf-dev protobuf-compiler
  

  
Optional packages:
  

  
* Snappy compression
  $ sudo apt-get install snappy libsnappy-dev
  
* Bzip2
  $ sudo apt-get install bzip2 libbz2-dev
  
* Jansson (C Library for JSON)
  $ sudo apt-get install libjansson-dev
  
* Linux FUSE
  $ sudo apt-get install fuse libfuse-dev
  

  
----------------------------------------------------------------------------------
  
Maven main modules:
  

  hadoop                            (Main Hadoop project)
  - hadoop-project           (Parent POM for all Hadoop Maven modules;
                              all plugin & dependency versions are defined here.)
  - hadoop-project-dist      (Parent POM for modules that generate distributions.)
  - hadoop-annotations       (Generates the Hadoop doclet used to generate the Javadocs)
  - hadoop-assemblies        (Maven assemblies used by the different modules)
  - hadoop-common-project    (Hadoop Common)
  - hadoop-hdfs-project      (Hadoop HDFS)
  - hadoop-mapreduce-project (Hadoop MapReduce)
  - hadoop-tools             (Hadoop tools like Streaming, Distcp, etc.)
  - hadoop-dist              (Hadoop distribution assembler)
  

  
----------------------------------------------------------------------------------
  
Where to run Maven from?
  

  It can be run from any module. The only catch is that if not run from trunk,
  all modules that are not part of the build run must be installed in the local
  Maven cache or available in a Maven repository.
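
  For instance, a minimal sketch assuming the hadoop-hdfs submodule is the one
  of interest (the module path is illustrative):

  $ cd hadoop-hdfs-project/hadoop-hdfs
  $ mvn compile    # works only if the sibling modules are already in the local cache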
  

  
----------------------------------------------------------------------------------
  
Maven build goals:
  

  * Clean                     : mvn clean
  * Compile                   : mvn compile [-Pnative]
  * Run tests                 : mvn test [-Pnative]
  * Create JAR                : mvn package
  * Run findbugs              : mvn compile findbugs:findbugs
  * Run checkstyle            : mvn compile checkstyle:checkstyle
  * Install JAR in M2 cache   : mvn install
  * Deploy JAR to Maven repo  : mvn deploy
  * Run clover                : mvn test -Pclover [-DcloverLicenseLocation=${user.name}/.clover.license]
  * Run Rat                   : mvn apache-rat:check
  * Build javadocs            : mvn javadoc:javadoc
  * Build distribution        : mvn package [-Pdist][-Pdocs][-Psrc][-Pnative][-Dtar]
  * Change Hadoop version     : mvn versions:set -DnewVersion=NEWVERSION
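
  For instance, to stamp a custom version across all modules (the version
  string below is purely illustrative):

  $ mvn versions:set -DnewVersion=2.7.4-custom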
  

  Build options:
  

  * Use -Pnative to compile/bundle native code
  * Use -Pdocs to generate & bundle the documentation in the distribution (using -Pdist)
  * Use -Psrc to create a project source TAR.GZ
  * Use -Dtar to create a TAR with the distribution (using -Pdist)
  

  Snappy build options:
  

  Snappy is a compression library that can be utilized by the native code.
  It is currently an optional component, meaning that Hadoop can be built with
  or without this dependency.
  

  * Use -Drequire.snappy to fail the build if libsnappy.so is not found.
  If this option is not specified and the snappy library is missing,
  we silently build a version of libhadoop.so that cannot make use of snappy.
  This option is recommended if you plan on making use of snappy and want
  to get more repeatable builds.
  

  * Use -Dsnappy.prefix to specify a nonstandard location for the libsnappy
  header files and library files. You do not need this option if you have
  installed snappy using a package manager.
  * Use -Dsnappy.lib to specify a nonstandard location for the libsnappy library
  files.  Similarly to snappy.prefix, you do not need this option if you have
  installed snappy using a package manager.
  * Use -Dbundle.snappy to copy the contents of the snappy.lib directory into
  the final tar file. This option requires that -Dsnappy.lib is also given,
  and it ignores the -Dsnappy.prefix option.
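
  As a sketch, a distribution build that insists on Snappy and bundles it from
  a hypothetical /usr/local/snappy prefix might look like:

  $ mvn package -Pdist,native -DskipTests -Dtar \
      -Drequire.snappy -Dsnappy.lib=/usr/local/snappy/lib -Dbundle.snappy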
  

  OpenSSL build options:
  

  OpenSSL includes a crypto library that can be utilized by the native code.
  It is currently an optional component, meaning that Hadoop can be built with
  or without this dependency.
  

  * Use -Drequire.openssl to fail the build if libcrypto.so is not found.
  If this option is not specified and the openssl library is missing,
  we silently build a version of libhadoop.so that cannot make use of
  openssl. This option is recommended if you plan on making use of openssl
  and want to get more repeatable builds.
  * Use -Dopenssl.prefix to specify a nonstandard location for the libcrypto
  header files and library files. You do not need this option if you have
  installed openssl using a package manager.
  * Use -Dopenssl.lib to specify a nonstandard location for the libcrypto library
  files. Similarly to openssl.prefix, you do not need this option if you have
  installed openssl using a package manager.
  * Use -Dbundle.openssl to copy the contents of the openssl.lib directory into
  the final tar file. This option requires that -Dopenssl.lib is also given,
  and it ignores the -Dopenssl.prefix option.
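
  Likewise for OpenSSL, a sketch using a hypothetical /usr/local/openssl
  prefix:

  $ mvn package -Pdist,native -DskipTests -Dtar \
      -Drequire.openssl -Dopenssl.lib=/usr/local/openssl/lib -Dbundle.openssl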
  

  Tests options:
  

  * Use -DskipTests to skip tests when running the following Maven goals:
  'package',  'install', 'deploy' or 'verify'
  * -Dtest=<TESTCLASSNAME>,<TESTCLASSNAME#METHODNAME>,....
  * -Dtest.exclude=<TESTCLASSNAME>
  * -Dtest.exclude.pattern=**/<TESTCLASSNAME1>.java,**/<TESTCLASSNAME2>.java
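
  For example (the test class and method names are illustrative):

  $ mvn test -Dtest=TestFileUtil                      # run one test class
  $ mvn test -Dtest=TestFileUtil#testGetDU            # run one test method
  $ mvn test -Dtest.exclude.pattern=**/TestFileUtil.java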
  

  
----------------------------------------------------------------------------------
  
Building components separately
  

  
If you are building a submodule directory, all the Hadoop dependencies this
submodule has will be resolved like all other 3rd party dependencies: that is,
from the Maven cache or from a Maven repository (if not available in the cache
or the SNAPSHOT 'timed out').

An alternative is to run 'mvn install -DskipTests' from the Hadoop source top
level once, and then work from the submodule. Keep in mind that SNAPSHOTs
time out after a while; using the Maven '-nsu' option will stop Maven from
trying to update SNAPSHOTs from external repos.
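
A minimal sketch of that workflow (the submodule path is illustrative):

  $ mvn install -DskipTests                  # once, from the source tree top level
  $ cd hadoop-common-project/hadoop-common   # then iterate inside the submodule
  $ mvn test -nsu                            # -nsu: don't re-fetch SNAPSHOTs remotely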
  

  
----------------------------------------------------------------------------------
  
Protocol Buffer compiler
  

  
The version of Protocol Buffer compiler, protoc, must match the version of the
  
protobuf JAR.
  

  
If you have multiple versions of protoc in your system, you can set in your
  
build shell the HADOOP_PROTOC_PATH environment variable to point to the one you
  
want to use for the Hadoop build. If you don't define this environment variable,
  
protoc is looked up in the PATH.
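
For example, assuming protoc 2.5.0 is installed under a hypothetical
/opt/protobuf-2.5.0 prefix:

  $ export HADOOP_PROTOC_PATH=/opt/protobuf-2.5.0/bin/protoc
  $ "$HADOOP_PROTOC_PATH" --version    # should report: libprotoc 2.5.0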
  
----------------------------------------------------------------------------------
  
Importing projects to eclipse
  

  
When importing the project into Eclipse, first install hadoop-maven-plugins.
  

  $ cd hadoop-maven-plugins
  $ mvn install
  

  
Then generate the Eclipse project files.
  

  $ mvn eclipse:eclipse -DskipTests
  

  
Finally, import into Eclipse by specifying the root directory of the project via
  
[File] > [Import] > [Existing Projects into Workspace].
  

  
----------------------------------------------------------------------------------
  
Building distributions:
  

  
Create binary distribution without native code and without documentation:
  

  $ mvn package -Pdist -DskipTests -Dtar
  

  
Create binary distribution with native code and with documentation:
  

  $ mvn package -Pdist,native,docs -DskipTests -Dtar
  

  
Create source distribution:
  

  $ mvn package -Psrc -DskipTests
  

  
Create source and binary distributions with native code and documentation:
  

  $ mvn package -Pdist,native,docs,src -DskipTests -Dtar
  

  
Create a local staging version of the website (in /tmp/hadoop-site)
  

  $ mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site
  

  
----------------------------------------------------------------------------------
  
Installing Hadoop
  

  
After building the documentation with the commands above, look for these HTML files:
  

  * Single Node Setup:
  hadoop-project-dist/hadoop-common/SingleCluster.html
  

  * Cluster Setup:
  hadoop-project-dist/hadoop-common/ClusterSetup.html
  

  
----------------------------------------------------------------------------------
  

  
Handling out of memory errors in builds
  

  
----------------------------------------------------------------------------------
  

  
If the build process fails with an out-of-memory error, you should be able to fix
  
it by increasing the memory used by Maven, which can be done via the environment
  
variable MAVEN_OPTS.
  

  
Here is an example setting that allocates between 256 MB and 512 MB of heap
  
space to Maven:
  

  
export MAVEN_OPTS="-Xms256m -Xmx512m"
  

  
----------------------------------------------------------------------------------
  

  
Building on Windows
  

  
----------------------------------------------------------------------------------
  
Requirements:
  

  
* Windows System
  
* JDK 1.7+
  
* Maven 3.0 or later
  
* Findbugs 1.3.9 (if running findbugs)
  
* ProtocolBuffer 2.5.0
  
* CMake 2.6 or newer
  
* Windows SDK 7.1 or Visual Studio 2010 Professional
  
* Windows SDK 8.1 (if building CPU rate control for the container executor)
  
* zlib headers (if building native code bindings for zlib)
  
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
  
* Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
  tools must be present on your PATH.
  

  
Unix command-line tools are also included with the Windows Git package which
  
can be downloaded from http://git-scm.com/download/win.
  

  
If using Visual Studio, it must be Visual Studio 2010 Professional (not 2012).
  
Do not use Visual Studio Express.  It does not support compiling for 64-bit,
  
which is problematic if running a 64-bit system.  The Windows SDK 7.1 is free to
  
download here:
  

  
http://www.microsoft.com/en-us/download/details.aspx?id=8279
  

  
The Windows SDK 8.1 is available to download at:
  

  
http://msdn.microsoft.com/en-us/windows/bg162891.aspx
  

  
Cygwin is neither required nor supported.
  

  
----------------------------------------------------------------------------------
  
Building:
  


  
Keep the source code tree in a short path to avoid running into problems related
to the Windows maximum path length limitation.  (For example, C:\hdc).
  

  
Run builds from a Windows SDK Command Prompt.  (Start, All Programs,
  
Microsoft Windows SDK v7.1, Windows SDK 7.1 Command Prompt.)
  

  
JAVA_HOME must be set, and the path must not contain spaces.  If the full path
  
would contain spaces, then use the Windows short path instead.
  

  
You must set the Platform environment variable to either x64 or Win32 depending
  
on whether you're running a 64-bit or 32-bit system.  Note that this is
  
case-sensitive.  It must be "Platform", not "PLATFORM" or "platform".
  
Environment variables on Windows are usually case-insensitive, but Maven treats
  
them as case-sensitive.  Failure to set this environment variable correctly will
  
cause msbuild to fail while building the native code in hadoop-common.
  

  
set Platform=x64 (when building on a 64-bit system)
  
set Platform=Win32 (when building on a 32-bit system)
  

  
Several tests require that the user have the Create Symbolic Links privilege.
  

  
All Maven goals are the same as described above with the exception that
  
native code is built by enabling the 'native-win' Maven profile. -Pnative-win
  
is enabled by default when building on Windows since the native components
  
are required (not optional) on Windows.
  

  
If native code bindings for zlib are required, then the zlib headers must be
  
deployed on the build machine.  Set the ZLIB_HOME environment variable to the
  
directory containing the headers.
  

  
set ZLIB_HOME=C:\zlib-1.2.7
  

  
At runtime, zlib1.dll must be accessible on the PATH.  Hadoop has been tested
  
with zlib 1.2.7, built using Visual Studio 2010 out of contrib\vstudio\vc10 in
  
the zlib 1.2.7 source tree.
  

  
http://www.zlib.net/
  

  
----------------------------------------------------------------------------------
  
Building distributions:
  

  * Build distribution with native code    : mvn package [-Pdist][-Pdocs][-Psrc][-Dtar]
