Downloading Hadoop with Git into a Local Eclipse Development Environment

Blog categories: Hadoop, *n*x, MacBook Air
Tags: hadoop, git, maven, eclipse, java
Problem
According to the official guide at http://wiki.apache.org/hadoop/EclipseEnvironment, downloading Hadoop to a local machine and setting up an Eclipse development environment takes only three commands:



Shell:
$ git clone git://git.apache.org/hadoop-common.git
$ mvn install -DskipTests
$ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true

That should be all, but when I ran the second command locally it failed with the following error log:



Console output:

 --- maven-antrun-plugin:1.6:run (compile-proto) @ hadoop-common ---
 Executing tasks

main:
    target/compile-proto.sh: line 17: protoc: command not found
    target/compile-proto.sh: line 17: protoc: command not found
 ------------------------------------------------------------------------
 Reactor Summary:

 Apache Hadoop Main ................................ SUCCESS
 Apache Hadoop Project POM ......................... SUCCESS
 Apache Hadoop Annotations ......................... SUCCESS
 Apache Hadoop Project Dist POM .................... SUCCESS
 Apache Hadoop Assemblies .......................... SUCCESS
 Apache Hadoop Auth ................................ SUCCESS
 Apache Hadoop Auth Examples ....................... SUCCESS
 Apache Hadoop Common .............................. FAILURE
 Apache Hadoop Common Project ...................... SKIPPED
 Apache Hadoop HDFS ................................ SKIPPED
 Apache Hadoop HttpFS .............................. SKIPPED
 Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
 Apache Hadoop HDFS Project ........................ SKIPPED
 hadoop-yarn ....................................... SKIPPED
 hadoop-yarn-api ................................... SKIPPED
 hadoop-yarn-common ................................ SKIPPED
 hadoop-yarn-server ................................ SKIPPED
 hadoop-yarn-server-common ......................... SKIPPED
 hadoop-yarn-server-nodemanager .................... SKIPPED
 hadoop-yarn-server-web-proxy ...................... SKIPPED
 hadoop-yarn-server-resourcemanager ................ SKIPPED
 hadoop-yarn-server-tests .......................... SKIPPED
 hadoop-mapreduce-client ........................... SKIPPED
 hadoop-mapreduce-client-core ...................... SKIPPED
 hadoop-yarn-applications .......................... SKIPPED
 hadoop-yarn-applications-distributedshell ......... SKIPPED
 hadoop-yarn-site .................................. SKIPPED
 hadoop-mapreduce-client-common .................... SKIPPED
 hadoop-mapreduce-client-shuffle ................... SKIPPED
 hadoop-mapreduce-client-app ....................... SKIPPED
 hadoop-mapreduce-client-hs ........................ SKIPPED
 hadoop-mapreduce-client-jobclient ................. SKIPPED
 Apache Hadoop MapReduce Examples .................. SKIPPED
 hadoop-mapreduce .................................. SKIPPED
 Apache Hadoop MapReduce Streaming ................. SKIPPED
 Apache Hadoop Distributed Copy .................... SKIPPED
 Apache Hadoop Archives ............................ SKIPPED
 Apache Hadoop Rumen ............................... SKIPPED
 Apache Hadoop Extras .............................. SKIPPED
 Apache Hadoop Tools Dist .......................... SKIPPED
 Apache Hadoop Tools ............................... SKIPPED
 Apache Hadoop Distribution ........................ SKIPPED
 ------------------------------------------------------------------------
 BUILD FAILURE
 ------------------------------------------------------------------------
 Total time: 12.483s
 Finished at: Mon Jan 30 22:57:23 GMT+08:00 2012
 Final Memory: 24M/81M
 ------------------------------------------------------------------------
 Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 ->

 To see the full stack trace of the errors, re-run Maven with the -e switch.
 Re-run Maven using the -X switch to enable full debug logging.

 For more information about the errors and possible solutions, please read the following articles:
 http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

 After correcting the problems, you can resume the build with the command
   mvn -rf :hadoop-common

I have not yet worked out the root cause or a fix for this; I am just recording it here for now.
Analysis
Running the command again with the -e flag to print more detail on the error:



Shell:
$ mvn install -DskipTests -e

This produces the following detailed error:



Console output:

 ------------------------------------------------------------------------
 BUILD FAILURE
 ------------------------------------------------------------------------
 Total time: 9.387s
 Finished at: Mon Jan 30 23:11:07 GMT+08:00 2012
 Final Memory: 19M/81M
 ------------------------------------------------------------------------
 Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 ->
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 127
    at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:283)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
    ... 19 more
Caused by: /Users/apple/Documents/Hadoop-common-dev/hadoop-common/hadoop-common-project/hadoop-common/target/antrun/build-main.xml:23: exec returned: 127
    at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:650)
    at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
    at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
    at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
    at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
    at org.apache.tools.ant.Task.perform(Task.java:348)
    at org.apache.tools.ant.Target.execute(Target.java:390)
    at org.apache.tools.ant.Target.performTasks(Target.java:411)
    at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
    at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
    at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:270)
    ... 21 more

 Re-run Maven using the -X switch to enable full debug logging.

 For more information about the errors and possible solutions, please read the following articles:
 http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
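A side note on the exit status: "exec returned: 127" means the shell could not find the command it was asked to execute, which lines up with the "protoc: command not found" lines in the first log. A quick check in a terminal (my own verification on a machine without protoc, not taken from the build output above):

Shell:
$ protoc --version
-bash: protoc: command not found
$ echo $?
127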

The log also prints a link that points toward a solution. Following that hint and visiting https://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException gives the following explanation:
      Unlike many other errors, this exception is not generated by the Maven core itself but by a plugin. As a rule of thumb, plugins use this error to signal a problem in their configuration or the information they retrieved from the POM.
      In other words, the error does not come from Maven itself; as a rule of thumb, a plugin raises it to signal a problem with its configuration or with information it expected to find in the POM.
The next step is to check which plugin is involved when Maven builds Hadoop.
      The error log shows that the build uses the maven-antrun-plugin, and since the failure happens while building Apache Hadoop Common, the place to look is the pom.xml of the hadoop-common module.
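One way to locate that section from the repository root (a sketch: the module path is taken from the stack-trace path above, and the amount of grep context is an arbitrary choice):

Shell:
$ cd hadoop-common-project/hadoop-common
$ grep -n -B2 -A40 "maven-antrun-plugin" pom.xml

The antrun plugin configuration found there is as follows: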



Xml:
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <executions>
          <execution>
            <id>compile-proto</id>
            <phase>generate-sources</phase>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <target>
                <echo file="target/compile-proto.sh">
                  PROTO_DIR=src/main/proto
                  JAVA_DIR=target/generated-sources/java
                  which cygpath 2> /dev/null
                  if [ $? = 1 ]; then
                      IS_WIN=false
                  else
                      IS_WIN=true
                      WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
                      WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
                  fi
                  mkdir -p $JAVA_DIR 2> /dev/null
                  for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
                  do
                        if [ "$IS_WIN" = "true" ]; then
                        protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
                        else
                        protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
                        fi
                  done
                </echo>
                <exec executable="sh" dir="${basedir}" failonerror="true">
                  <arg line="target/compile-proto.sh"/>
                </exec>
              </target>
            </configuration>
          </execution>
          <execution>
            <id>compile-test-proto</id>
            <phase>generate-test-sources</phase>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <target>
                <echo file="target/compile-test-proto.sh">
                  PROTO_DIR=src/test/proto
                  JAVA_DIR=target/generated-test-sources/java
                  which cygpath 2> /dev/null
                  if [ $? = 1 ]; then
                      IS_WIN=false
                  else
                      IS_WIN=true
                      WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
                      WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
                  fi
                  mkdir -p $JAVA_DIR 2> /dev/null
                  for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
                  do
                        if [ "$IS_WIN" = "true" ]; then
                        protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
                        else
                        protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
                        fi
                  done
                </echo>
                <exec executable="sh" dir="${basedir}" failonerror="true">
                  <arg line="target/compile-test-proto.sh"/>
                </exec>
              </target>
            </configuration>
          </execution>
          <execution>
            <id>save-version</id>
            <phase>generate-sources</phase>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <target>
                <!-- ... -->
              </target>
            </configuration>
          </execution>
          <execution>
            <id>generate-test-sources</id>
            <phase>generate-test-sources</phase>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <target>
                <!-- ... -->
              </target>
            </configuration>
          </execution>
          <execution>
            <id>create-log-dir</id>
            <phase>process-test-resources</phase>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <target>
                <!-- ... -->
              </target>
            </configuration>
          </execution>
          <execution>
            <id>pre-site</id>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <target>
                <!-- ... -->
              </target>
            </configuration>
          </execution>
        </executions>
      </plugin>
The above is the Ant plugin (maven-antrun-plugin) being driven from Maven in the pom.xml, and the script it generates and runs keeps invoking a protoc command. That was puzzling at first, but connecting it with the HowToContribute article at http://wiki.apache.org/hadoop/HowToContribute suggests the likely cause: ProtocolBuffers is not installed locally. The article states explicitly:

Quote:
Hadoop 0.23+ must have Google's ProtocolBuffers for compilation to work.

So the plan is to install ProtocolBuffers locally and then build again.
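For reference, a minimal sketch of checking for and installing protoc on this kind of setup. Installing via Homebrew is my assumption for a MacBook Air; the post does not say how it was actually installed, and the protobuf version has to match what the Hadoop build expects:

Shell:
# Check whether protoc is already on the PATH
$ which protoc
$ protoc --version

# One way to install Protocol Buffers on OS X (assumes Homebrew is available);
# building the official source release with ./configure && make && sudo make install also works
$ brew install protobuf

# Resume the failed build from the broken module, as the Maven hint at the end of the log suggests
$ mvn install -DskipTests -rf :hadoop-common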

As expected, after installing Protocol Buffers locally, the remaining two commands ran to completion. All that is left is to import the projects in the directory into Eclipse as described on the official wiki page, after which the source code can be read and debugged in Eclipse.