[Experience Sharing] Installing Hadoop on CentOS 6 (hadoop-hades)

  Hadoop is a distributed computing infrastructure developed by the Apache Software Foundation.
  It lets users write distributed programs without having to understand the low-level details of distribution, harnessing the power of a cluster for high-speed computation and storage.
  Hadoop implements a distributed file system, the Hadoop Distributed File System (HDFS). HDFS is highly fault-tolerant and designed to run on low-cost hardware; it provides high-throughput access to application data, which suits applications with very large data sets. HDFS relaxes some POSIX requirements so that data in the file system can be read as a stream (streaming access).
  The core of the Hadoop framework is the pair HDFS and MapReduce: HDFS provides storage for massive data sets, and MapReduce provides the computation over them.
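  To make that division of labor concrete, here is a minimal sketch of what the finished cluster will do: HDFS stores the input and MapReduce computes over it, using the wordcount example jar bundled with Hadoop 1.0.0 (the /wordcount paths are only illustrative names, not part of this install):
  # store input in HDFS, then let MapReduce run a job over it
  hadoop fs -mkdir /wordcount/input
  hadoop fs -put /etc/profile /wordcount/input/
  hadoop jar $HADOOP_HOME/hadoop-examples-1.0.0.jar wordcount /wordcount/input /wordcount/output
  hadoop fs -cat /wordcount/output/part-r-00000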
  namenode  192.168.31.243
  datanode  192.168.31.165
  Test environment
  centos6_x64
  Software used
  jdk-6u31-linux-i586.bin
  hadoop-1.0.0.tar.gz
  Software installation
  yum install -y rsync* openssh*
  yum install -y ld-linux.so.2
  groupadd hadoop
  useradd hadoop -g hadoop
  mkdir /usr/local/hadoop
  mkdir -p /usr/local/java
  service iptables stop
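  service iptables stop only turns the firewall off until the next reboot; if you want it to stay off on both nodes (rather than opening the Hadoop ports one by one), a small follow-up, assuming CentOS 6's chkconfig:
  # keep iptables from coming back after a reboot
  chkconfig iptables off
  chkconfig --list iptables      # every runlevel should now read "off"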
  ssh-keygen -t rsa            run on 192.168.31.243 (the same steps apply on 192.168.31.165)
  Enter file in which to save the key (/root/.ssh/id_rsa):
  Enter passphrase (empty for no passphrase):
  Enter same passphrase again:

  Your identification has been saved in /root/.ssh/id_rsa.
  Your public key has been saved in /root/.ssh/id_rsa.pub.
  scp -r /root/.ssh/id_rsa.pub 192.168.31.165:/root/.ssh/authorized_keys
  scp -r /root/.ssh/id_rsa.pub 192.168.31.243:/root/.ssh/authorized_keys
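  Note that scp-ing id_rsa.pub directly onto authorized_keys overwrites whatever was there, which is fine on a fresh machine. A quick check that passwordless SSH actually works (run from 192.168.31.243; repeat in the other direction from 192.168.31.165):
  # should print the remote hostname without asking for a password
  ssh 192.168.31.165 hostname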
  scp -r jdk-6u31-linux-i586.bin hadoop-1.0.0.tar.gz 192.168.31.165:/root/
  mv jdk-6u31-linux-i586.bin /usr/local/java/
  cd /usr/local/java/
  chmod +x jdk-6u31-linux-i586.bin
  ./jdk-6u31-linux-i586.bin
  vim /etc/profile     append the following at the end of the file
  # set java environment
  export JAVA_HOME=/usr/local/java/jdk1.6.0_31

  export PATH=$PATH:$JAVA_HOME/bin:$JAVA_HOME/jre/bin
  # set hadoop path
  export HADOOP_HOME=/usr/local/hadoop
  export PATH=$PATH:$HADOOP_HOME/bin
  source /etc/profile
  java -version
  java version "1.6.0_31"
  Java(TM) SE Runtime Environment (build 1.6.0_31-b04)
  Java HotSpot(TM) Client VM (build 20.6-b01, mixed mode, sharing)
  tar zxvf hadoop-1.0.0.tar.gz
  mv hadoop-1.0.0 /usr/local/hadoop
  chown -R hadoop:hadoop /usr/local/hadoop
  ll /usr/local/hadoop/
  drwxr-xr-x 14 hadoop hadoop 4096 Dec 16  2011 hadoop-1.0.0
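  The rest of this walkthrough expects conf/ and bin/ directly under /usr/local/hadoop, so if the mv above left the tree nested as /usr/local/hadoop/hadoop-1.0.0 (as the listing suggests), move its contents up one level first, then confirm Hadoop starts and finds the JDK. A sketch, the flattening line being an assumption about your layout:
  # flatten the layout if the tarball ended up nested, then check the version
  mv /usr/local/hadoop/hadoop-1.0.0/* /usr/local/hadoop/
  /usr/local/hadoop/bin/hadoop version      # should report Hadoop 1.0.0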
  cp /usr/local/hadoop/conf/hadoop-env.sh  /usr/local/hadoop/conf/hadoop-env.sh.bak
  vim /usr/local/hadoop/conf/hadoop-env.sh
  # export JAVA_HOME=/usr/lib/j2sdk1.5-sun       change this line to:
  export JAVA_HOME=/usr/local/java/jdk1.6.0_31
  cd /usr/local/hadoop/conf
  for f in core-site.xml hdfs-site.xml mapred-site.xml; do cp $f $f.bak; done      back up each of these files before editing
  vim core-site.xml
  <?xml version="1.0"?>
  <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
  <!-- the hadoop.tmp.dir and fs.default.name properties are the ones that need to be added -->
  <configuration>
    <property>
      <name>hadoop.tmp.dir</name>
      <value>/usr/local/hadoop/tmp</value>
      <description>A base for other temporary directories.</description>
    </property>
    <property>
      <name>fs.default.name</name>
      <value>hdfs://192.168.31.243:9000</value>
    </property>
  </configuration>
  vim hdfs-site.xml
  <?xml version="1.0"?>
  <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
  <configuration>
    <property>
      <name>dfs.replication</name>
      <value>3</value>
    </property>
  </configuration>
  vim mapred-site.xml
  <?xml version="1.0"?>
  <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
  <configuration>
    <property>
      <name>mapred.job.tracker</name>
      <value>http://192.168.31.243:9001</value>
    </property>
  </configuration>
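  Before going further it is worth catching any typo in the three files; a quick check, assuming xmllint from libxml2 is installed (it usually is on CentOS 6):
  # prints nothing if the XML is well-formed, otherwise shows the parse error
  xmllint --noout core-site.xml hdfs-site.xml mapred-site.xml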
  
  
  cp masters masters.bak
  vim masters                            change localhost to the namenode address:
  192.168.31.243
  cp slaves slaves.bak
  vim /usr/local/hadoop/conf/slaves      change localhost to the datanode address (192.168.31.165):
  192.168.31.165
  scp -r  core-site.xml hdfs-site.xml mapred-site.xml 192.168.31.165:/usr/local/hadoop/conf/
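  Configuration drift between the two nodes is a common source of trouble; a quick way to confirm the copy landed intact (a sketch, assuming Hadoop sits at the same path on 192.168.31.165):
  # the two checksums printed for each file should be identical
  for f in core-site.xml hdfs-site.xml mapred-site.xml; do
    md5sum /usr/local/hadoop/conf/$f
    ssh 192.168.31.165 md5sum /usr/local/hadoop/conf/$f
  done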
  /usr/local/hadoop/bin/hadoop namenode -format
  Warning: $HADOOP_HOME is deprecated.
  16/09/21 22:51:13 INFO namenode.NameNode: STARTUP_MSG:
  /************************************************************
  STARTUP_MSG: Starting NameNode
  STARTUP_MSG:   host = java.net.UnknownHostException: centos6: centos6
  STARTUP_MSG:   args = [-format]
  STARTUP_MSG:   version = 1.0.0
  STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1214675; compiled by 'hortonfo' on Thu Dec 15 16:36:35 UTC 2011
  ************************************************************/
  16/09/21 22:51:14 INFO util.GSet: VM type       = 32-bit
  16/09/21 22:51:14 INFO util.GSet: 2% max memory = 19.33375 MB
  16/09/21 22:51:14 INFO util.GSet: capacity      = 2^22 = 4194304 entries
  16/09/21 22:51:14 INFO util.GSet: recommended=4194304, actual=4194304
  16/09/21 22:51:14 INFO namenode.FSNamesystem: fsOwner=root
  16/09/21 22:51:14 INFO namenode.FSNamesystem: supergroup=supergroup
  16/09/21 22:51:14 INFO namenode.FSNamesystem: isPermissionEnabled=true
  16/09/21 22:51:14 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
  16/09/21 22:51:14 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
  16/09/21 22:51:14 INFO namenode.NameNode: Caching file names occuring more than 10 times

  16/09/21 22:51:14 INFO common.Storage: Image file of size ... saved in ... seconds.
  16/09/21 22:51:14 INFO common.Storage: Storage directory /usr/local/hadoop/tmp/dfs/name has been successfully formatted.
  16/09/21 22:51:14 INFO namenode.NameNode: SHUTDOWN_MSG:
  /************************************************************
  SHUTDOWN_MSG: Shutting down NameNode at java.net.UnknownHostException: centos6: centos6
  ************************************************************/
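  The java.net.UnknownHostException: centos6 in the banners simply means the machine's hostname does not resolve; the format still succeeded, and the warning disappears if the hostname is mapped in /etc/hosts (assuming the hostname really is centos6, as the log shows):
  # map the local hostname so Hadoop can resolve it; do the equivalent on the datanode
  echo "192.168.31.243  centos6" >> /etc/hosts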
  /usr/local/hadoop/bin/start-all.sh
  Warning: $HADOOP_HOME is deprecated.
  starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-namenode-centos6.out
  The authenticity of host 'localhost (::1)' can't be established.
  RSA key fingerprint is 81:d9:c6:54:a9:99:27:c0:f7:5f:c3:15:d5:84:a0:99.
  Are you sure you want to continue connecting (yes/no)? yes
  localhost: Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
  root@localhost's password:
  localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-centos6.out
  The authenticity of host '192.168.31.243 (192.168.31.243)' can't be established.
  RSA key fingerprint is 81:d9:c6:54:a9:99:27:c0:f7:5f:c3:15:d5:84:a0:99.
  Are you sure you want to continue connecting (yes/no)? yes
  192.168.31.243: Warning: Permanently added '192.168.31.243' (RSA) to the list of known hosts.
  root@192.168.31.243's password:
  192.168.31.243: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-secondarynamenode-centos6.out
  starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-jobtracker-centos6.out
  root@localhost's password:
  localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-centos6.out
  ll /usr/local/hadoop/tmp/
  drwxr-xr-x 5 root root 4096 Sep 21 22:53 dfs
  drwxr-xr-x 3 root root 4096 Sep 21 22:53 mapred    seeing these two directories confirms the daemons came up without errors
  jps
  3237 SecondaryNameNode
  3011 NameNode
  3467 Jps
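  jps on the namenode shows the NameNode and SecondaryNameNode; the DataNode and TaskTracker run on whichever host the slaves file points at (192.168.31.165 here) and can be checked the same way, assuming the same JDK path on that node:
  # expect DataNode and TaskTracker in the output
  ssh 192.168.31.165 /usr/local/java/jdk1.6.0_31/bin/jps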
  netstat -tuplna | grep 500
  tcp        0      0 :::50070                    :::*                        LISTEN      3011/java
  tcp        0      0 :::50090                    :::*                        LISTEN      3237/java
  http://192.168.31.243:50070/dfshealth.jsp
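  Beyond the web UI, a short smoke test from the command line confirms that HDFS really accepts data (a minimal sketch; the /test path is just an example):
  /usr/local/hadoop/bin/hadoop dfsadmin -report           # live datanodes and capacity
  /usr/local/hadoop/bin/hadoop fs -mkdir /test
  /usr/local/hadoop/bin/hadoop fs -put /etc/hosts /test/
  /usr/local/hadoop/bin/hadoop fs -ls /test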