Solving the Hadoop native-library version mismatch

Posted on 2015-11-11 11:13:13

Running any Hadoop command prints this warning:
15/06/25 00:14:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Solution

First, turn on debug logging to see why the native library fails to load:

[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console
[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz
15/06/25 00:44:05 DEBUG util.Shell: setsid exited with exit code 0
15/06/25 00:44:05 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml
15/06/25 00:44:05 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5
15/06/25 00:44:05 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml
15/06/25 00:44:05 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986
15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
15/06/25 00:44:06 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
15/06/25 00:44:06 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
15/06/25 00:44:06 DEBUG security.Groups:  Creating new Groups object
15/06/25 00:44:06 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/06/25 00:44:06 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /usr/hadoop/lib/native/libhadoop.so.1.0.0)
15/06/25 00:44:06 DEBUG util.NativeCodeLoader: java.library.path=/usr/hadoop/lib/native
15/06/25 00:44:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/06/25 00:44:06 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
15/06/25 00:44:06 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
15/06/25 00:44:06 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
15/06/25 00:44:06 DEBUG security.UserGroupInformation: hadoop login
15/06/25 00:44:06 DEBUG security.UserGroupInformation: hadoop login commit
15/06/25 00:44:06 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
15/06/25 00:44:06 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
15/06/25 00:44:06 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
15/06/25 00:44:06 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
15/06/25 00:44:06 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@78dd667e
15/06/25 00:44:07 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
15/06/25 00:44:07 DEBUG ipc.Client: The ping interval is 60000 ms.
15/06/25 00:44:07 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop sending #0
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop got value #0
15/06/25 00:44:07 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 71ms
text: `/test/data/origz/access.log.gz': No such file or directory
15/06/25 00:44:07 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@60dcc9fe
15/06/25 00:44:07 DEBUG ipc.Client: Stopping client
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: closed
15/06/25 00:44:07 DEBUG ipc.Client: IPC Client (905735620) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0

The UnsatisfiedLinkError above is the root cause: libhadoop.so.1.0.0 requires GLIBC_2.14, which the system libc does not provide. Check the system's libc version:

[hadoop@master001 native]$ ll /lib64/libc.so.6
lrwxrwxrwx. 1 root root 12 Apr 14 16:14 /lib64/libc.so.6 -> libc-2.12.so

The symlink shows glibc 2.12, older than the 2.14 the native library requires.
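The same information can be gathered with standard tools; a quick sketch (the exact output varies by distribution, and the libhadoop.so path mentioned in the comment comes from the error message above):

```shell
# Ask the running C library for its version
getconf GNU_LIBC_VERSION

# List the GLIBC symbol versions a shared object references. Shown here
# against the system libc so the command runs anywhere; point it at
# /usr/hadoop/lib/native/libhadoop.so.1.0.0 to see its GLIBC_2.14 requirement.
libc=$(ldd /bin/ls | awk '/libc\.so/ {print $3}')
grep -ao 'GLIBC_2\.[0-9]*' "$libc" | sort -u
```

If binutils is installed, `objdump -T` gives the same list together with the symbol names.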

From http://ftp.gnu.org/gnu/glibc/ download glibc-2.14.tar.bz2 and glibc-linuxthreads-2.5.tar.bz2, then build and install:

[hadoop@master001 native]$ tar -jxvf /home/hadoop/software/glibc-2.14.tar.bz2
[hadoop@master001 native]$ cd glibc-2.14/
[hadoop@master001 glibc-2.14]$ tar -jxvf /home/hadoop/software/glibc-linuxthreads-2.5.tar.bz2
[hadoop@master001 glibc-2.14]$ cd ..   # must return to the parent directory; glibc refuses to build inside its own source tree
[hadoop@master001 native]$ export CFLAGS="-g -O2"   # keep the optimization flag on, otherwise the build fails
[hadoop@master001 native]$ ./glibc-2.14/configure --prefix=/usr --disable-profile --enable-add-ons --with-headers=/usr/include --with-binutils=/usr/bin
[hadoop@master001 native]$ make   # compiles for a long time; if it fails, re-run it
[hadoop@master001 native]$ sudo make install   # install; must run as root
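Before re-running Hadoop, it is worth a quick sanity check that the dynamic linker actually resolves the freshly installed libc (a sketch; any dynamically linked binary works in place of /bin/ls):

```shell
# ldd shows which libc.so.6 a binary will load at run time; after the
# install above it should resolve to the new libc-2.14.so
ldd /bin/ls | grep 'libc\.so'
```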

Verify that the version was upgraded:

[hadoop@master001 native]$ ll /lib64/libc.so.6
lrwxrwxrwx 1 root root 12 Jun 25 02:07 /lib64/libc.so.6 -> libc-2.14.so   # now points to 2.14

Turn debug logging on again and repeat the test:

[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console
# The "Loaded the native-hadoop library" line below confirms the native library now loads cleanly
[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz
15/06/25 02:10:01 DEBUG util.Shell: setsid exited with exit code 0
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
15/06/25 02:10:02 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
15/06/25 02:10:02 DEBUG security.Groups:  Creating new Groups object
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
15/06/25 02:10:02 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login commit
15/06/25 02:10:02 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
15/06/25 02:10:02 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
15/06/25 02:10:03 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
15/06/25 02:10:03 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@501edcf1
15/06/25 02:10:03 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$1@7e499e08: starting with interruptCheckPeriodMs = 60000
15/06/25 02:10:04 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
15/06/25 02:10:04 DEBUG ipc.Client: The ping interval is 60000 ms.
15/06/25 02:10:04 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop sending #0
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop got value #0
15/06/25 02:10:04 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 122ms
text: `/test/data/origz/access.log.gz': No such file or directory
15/06/25 02:10:04 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: Stopping client
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: closed
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0

After the upgrade, restart the cluster and confirm HDFS works:

[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-dfs.sh
[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-yarn.sh
[hadoop@master001 ~]$ hadoop fs -ls /
[hadoop@master001 ~]$ hadoop fs -mkdir /usr
[hadoop@master001 ~]$ hadoop fs -ls /
Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2015-06-25 02:27 /usr
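Hadoop 2.x also ships a dedicated self-check, `hadoop checknative`, which reports whether each native component (hadoop, zlib, snappy, etc.) was loaded. A guarded sketch (assumes the hadoop CLI is on PATH of a cluster node; elsewhere it just prints a hint):

```shell
# Run Hadoop's built-in native-library check when the CLI is available.
if command -v hadoop >/dev/null 2>&1; then
    # -a exits non-zero when any native component is missing
    hadoop checknative -a || true
else
    echo "hadoop CLI not found; run 'hadoop checknative -a' on a cluster node"
fi
```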
         
Copyright notice: this is an original post by the author; do not reproduce without permission.

Original thread: https://www.yunweiku.com/thread-137835-1-1.html