Hadoop 0.20.2 pseudo-distributed setup on Mac OS 10.9

1. Download the Hadoop 0.20.2 release and unpack it:
$ tar -xvzf hadoop-0.20.2.tar.gz

2. Edit conf/hadoop-env.sh to define at least JAVA_HOME as the root of your Java installation.

Add this line: export JAVA_HOME=/Library/Java/Home
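On newer macOS releases /Library/Java/Home may not exist; an alternative (an assumption, verify on your own machine) is to let the standard macOS helper /usr/libexec/java_home locate the active JDK in hadoop-env.sh:

```shell
# conf/hadoop-env.sh: ask macOS where the active JDK lives.
# /usr/libexec/java_home is the macOS helper tool; check it exists before relying on it.
export JAVA_HOME=$(/usr/libexec/java_home)
```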

3. Try the following command:
$ bin/hadoop
This will display the usage documentation for the hadoop script.

4. Edit the configuration files in the conf directory.

conf/core-site.xml:

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

conf/hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

conf/mapred-site.xml:

<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9001</value>
    </property>
</configuration>
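If you prefer to script these edits, the same settings can be written from the shell with a heredoc. This is just a sketch for one of the three files; it assumes you run it from the Hadoop root directory and it overwrites any existing core-site.xml:

```shell
# Write conf/core-site.xml from the shell (overwrites any existing file).
mkdir -p conf
cat > conf/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
EOF
# Quick sanity check that the value landed in the file.
grep 'hdfs://localhost:9000' conf/core-site.xml
```

The other two files (hdfs-site.xml, mapred-site.xml) can be generated the same way.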

5. Configure ssh

Now check that you can ssh to the localhost without a passphrase:
$ ssh localhost

If you cannot ssh to localhost without a passphrase, execute the following commands:
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
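If you do not want to touch ~/.ssh right away, you can exercise the same key-generation steps with a throwaway key in a scratch directory first. A sketch, with two assumptions: the scratch path comes from mktemp, and -t rsa is used instead of -t dsa because recent OpenSSH versions reject DSA keys:

```shell
# Generate a throwaway passphrase-less key pair in a scratch directory.
DEMO_DIR=$(mktemp -d)
ssh-keygen -t rsa -P '' -f "$DEMO_DIR/id_rsa" >/dev/null
# Append the public key to an authorized_keys file, as in the real setup.
cat "$DEMO_DIR/id_rsa.pub" >> "$DEMO_DIR/authorized_keys"
# authorized_keys must not be group/world-writable or sshd will ignore it.
chmod 600 "$DEMO_DIR/authorized_keys"
ls -l "$DEMO_DIR"
```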

6. Run Hadoop

Format a new distributed-filesystem:
$ bin/hadoop namenode -format

Start the hadoop daemons:
$ bin/start-all.sh

The hadoop daemon log output is written to the ${HADOOP_LOG_DIR} directory (defaults to ${HADOOP_HOME}/logs).

Browse the web interface for the NameNode and the JobTracker; by default they are available at:

NameNode - http://localhost:50070/
JobTracker - http://localhost:50030/

7. Run Hadoop's "hello world" program: WordCount

mkdir input

Place the text files you want to count inside it.

Put the directory into HDFS:

bin/hadoop dfs -put input input

Run the example WordCount program; the input is the HDFS directory named input, and the output goes to the HDFS directory output:

bin/hadoop jar hadoop-0.20.2-examples.jar wordcount input output

Fetch the output directory from HDFS to a local directory named output:
bin/hadoop dfs -get output output

View the word-frequency results:

cat output/*
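The WordCount job computes per-word counts, which for a small file you can approximate locally with standard Unix tools. A rough local equivalent (an illustration only; it ignores the exact tokenization the Hadoop example uses, and input/sample.txt is a made-up file name):

```shell
# Build a tiny sample input file.
mkdir -p input
printf 'hello world\nhello hadoop\n' > input/sample.txt
# Split on whitespace so each word is on its own line, then count
# occurrences of each word: roughly what WordCount does via MapReduce.
tr -s ' \t' '\n' < input/sample.txt | sort | uniq -c | sort -rn
```

For this sample, "hello" is counted twice and the other words once, which matches the kind of lines you should see in output/part-00000.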

8. Helpful link
http://www.cs.brandeis.edu/~rshaull/cs147a-fall-2008/hadoop-troubleshooting/

It lists some common installation problems; for example, when I ran the examples I hit:

java.io.IOException: Not a file:
  hdfs://localhost:9000/user/ross/input/conf

The page explains that this happens because the old input directory in HDFS was not deleted; we need to run:

bin/hadoop dfs -rmr input
bin/hadoop dfs -put conf input

9. References

The installation steps follow https://hadoop.apache.org/docs/r1.2.1/single_node_setup.html
