Hadoop Distributed Configuration
1. Configure environment variables (the JDK must already be installed)
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
2. Configure conf/hadoop-env.sh
export JAVA_HOME=/usr/local/java/jdk1.7.0_45   # required
export HADOOP_HEAPSIZE=512
export HADOOP_PID_DIR=/home/$USER/pids
3. Set the hostname (master on the master node; slave1, slave2, slave3 on the respective slaves, matching the /etc/hosts entries in step 4)
sudo vi /etc/hostname
4. Configure /etc/hosts (the same entries on every node)
192.168.1.110 master
192.168.1.101 slave1
192.168.1.109 slave2
192.168.1.108 slave3
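A quick way to confirm that each name in /etc/hosts resolves and the nodes can reach one another (a sketch, run from the master, assuming the machines are already on the network):

```shell
# Ping each cluster node once; flag any that do not respond
for h in master slave1 slave2 slave3; do
  ping -c 1 -W 2 "$h" >/dev/null 2>&1 && echo "$h reachable" || echo "$h NOT reachable"
done
```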
5. Edit conf/core-site.xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/${user.name}/tmp</value>
  </property>
</configuration>
6. Edit conf/hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/${user.name}/dfs_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/${user.name}/dfs_data</value>
  </property>
</configuration>
7. Edit conf/mapred-site.xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>master:9001</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/home/${user.name}/mapred_system</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/home/${user.name}/mapred_local</value>
  </property>
</configuration>
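Hadoop expects identical configuration on every node, so after editing these files on the master they should be copied to each slave. A minimal sketch, assuming the same $HADOOP_HOME path on every machine and passwordless SSH from the master:

```shell
# Push the edited configuration files from the master to every slave
for h in slave1 slave2 slave3; do
  scp "$HADOOP_HOME"/conf/core-site.xml \
      "$HADOOP_HOME"/conf/hdfs-site.xml \
      "$HADOOP_HOME"/conf/mapred-site.xml \
      "$HADOOP_HOME"/conf/hadoop-env.sh \
      "$h:$HADOOP_HOME/conf/"
done
```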
8. Edit conf/masters
master
9. Edit conf/slaves
slave1
slave2
slave3
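With all nine steps done (start-all.sh also requires passwordless SSH from the master to each slave), the cluster can be formatted and started from the master. Note that formatting erases any existing HDFS metadata, so run it only on first setup:

```shell
# On the master node, with HADOOP_HOME set as in step 1
"$HADOOP_HOME"/bin/hadoop namenode -format   # first run only; wipes HDFS metadata
"$HADOOP_HOME"/bin/start-all.sh              # start the HDFS and MapReduce daemons
jps   # the master should list NameNode, SecondaryNameNode, and JobTracker
```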