An In-Depth Look at Installing Hadoop in Single-Node Mode
In this section we walk through installing Hadoop in single-node mode. The installation consists of nine main steps; after working through them you should have a much clearer picture of how Hadoop is installed, so let's get started.
Hadoop Single-Node Installation
Test platform: Ubuntu 9.04, Hadoop 0.20, JDK 1.6
Step 1. Install and configure SSH
Hadoop uses SSH for communication, so first set up passwordless login:
$ sudo apt-get install ssh
$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ ssh localhost
When this is done, log in and confirm that no password is requested. (The first login prompts you to accept the host key, so press Enter; the second login should take you straight into the system.)
~$ ssh localhost
~$ exit
~$ ssh localhost
~$ exit
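If passwordless login still asks for a password, the usual culprit is over-permissive modes on the SSH files; tightening them is a standard fix (not specific to Hadoop) and is safe to run in any case:
~$ chmod 700 ~/.ssh
~$ chmod 600 ~/.ssh/authorized_keys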
Step 2. Install Java
The Sun Java Runtime is required to run Hadoop, so we need to install either the JRE or the JDK. Here we install the JDK directly, since later on we will still need the compiler it provides for writing programs. The newest JDK package offered by Ubuntu 9.04 is Sun Java(TM) Development Kit (JDK) 6, with the package name sun-java6-jdk. It is also recommended to remove the pre-installed "gcj".
~$ sudo apt-get purge java-gcj-compat
~$ sudo apt-get install sun-java6-bin sun-java6-jdk sun-java6-jre
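To confirm the install and to locate the JDK directory that Step 4 refers to (treat /usr/lib/jvm/java-6-sun as the expected default rather than a guarantee, since the exact path can vary), a quick check is:
~$ java -version
~$ ls /usr/lib/jvm/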
Step 3. Download and install Hadoop
For the single-node setup, download Hadoop 0.20 and unpack the archive under the /opt path.
$ tar zxvf hadoop-0.20.0.tar.gz
$ sudo mv hadoop-0.20.0 /opt/
$ sudo chown -R hadoop:hadoop /opt/hadoop-0.20.0
$ sudo ln -sf /opt/hadoop-0.20.0 /opt/hadoop
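A quick listing confirms the ownership change and the /opt/hadoop symlink created above (this is just a directory listing, nothing Hadoop-specific):
$ ls -ld /opt/hadoop /opt/hadoop-0.20.0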
Step 4. Configure hadoop-env.sh
Go into the hadoop directory for further configuration. Two groups of files need editing. The first is hadoop-env.sh, where we set the JAVA_HOME, HADOOP_HOME, and PATH environment variables; the XML configuration files follow in Step 5.
/opt$ cd hadoop/
Append the following lines (the quoted 'EOF' delimiter keeps $PATH from being expanded while the file is written):
/opt/hadoop$ cat >> conf/hadoop-env.sh << 'EOF'
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:/opt/hadoop/bin
EOF
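To check the settings by hand in the current shell (Hadoop's own scripts source conf/hadoop-env.sh themselves, so this is only a manual sanity check; hadoop version is a standard subcommand in the 0.20 release):
/opt/hadoop$ source conf/hadoop-env.sh
/opt/hadoop$ echo $JAVA_HOME
/opt/hadoop$ hadoop version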
Step 5. Edit the Hadoop configuration files
/opt/hadoop/conf/core-site.xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop/hadoop-${user.name}</value>
  </property>
</configuration>

/opt/hadoop/conf/hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

/opt/hadoop/conf/mapred-site.xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
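After editing, a quick well-formedness check of the three files can catch a stray tag early. This is just a convenience, not part of the Hadoop setup; on Ubuntu, xmllint ships in the libxml2-utils package:
/opt/hadoop$ xmllint --noout conf/core-site.xml conf/hdfs-site.xml conf/mapred-site.xml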
Step 6. Format HDFS
HDFS has to be formatted as part of the single-node installation. With the single-node test environment configured above, we can now bring up the Hadoop services; the first step is to format the namenode:
$ cd /opt/hadoop
$ source /opt/hadoop/conf/hadoop-env.sh
$ hadoop namenode -format
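With hadoop.tmp.dir set to /tmp/hadoop/hadoop-${user.name} in core-site.xml, Hadoop 0.20 keeps the namenode metadata under dfs/name inside that directory by default, so a listing such as the following (the exact path depends on your username) shows that the format succeeded:
$ ls /tmp/hadoop/hadoop-$USER/dfs/name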
Step 7. Start Hadoop
Next, use start-all.sh to start all the services, including the namenode, datanode, secondarynamenode, jobtracker, and tasktracker.
/opt/hadoop$ bin/start-all.sh
The output looks like this:
starting namenode, logging to /opt/hadoop/logs/hadoop-hadooper-namenode-vPro.out
localhost: starting datanode, logging to /opt/hadoop/logs/hadoop-hadooper-datanode-vPro.out
localhost: starting secondarynamenode, logging to /opt/hadoop/logs/hadoop-hadooper-secondarynamenode-vPro.out
starting jobtracker, logging to /opt/hadoop/logs/hadoop-hadooper-jobtracker-vPro.out
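Once the daemons are up, jps (part of the Sun JDK) should list NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker, and the web interfaces give a further check. The port numbers below are the stock Hadoop 0.20 defaults, so adjust them if you have changed the configuration:
/opt/hadoop$ jps
NameNode web UI:   http://localhost:50070
JobTracker web UI: http://localhost:50030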