[Single-Node Setup] Ubuntu Hadoop

1. Install SSH on Ubuntu. The openssh-client bundled with Ubuntu 16 LTS is too new for openssh-server to install against, so remove the client first and then reinstall both:
sudo apt-get remove openssh-client
sudo apt-get install openssh-client
sudo apt-get install openssh-server
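
Before moving on, it is worth checking that the SSH server is actually installed and running (the service is named ssh on Ubuntu 16.04):
sudo systemctl status ssh
# or, equivalently, via the older service wrapper:
sudo service ssh status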

 

2. Disable the firewall
sudo ufw disable

Configure passwordless SSH login:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

Remember to add the private key to the ssh agent as well, otherwise you may get the error "Agent admitted failure to sign using the key":
ssh-add ~/.ssh/id_rsa
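
To confirm passwordless login works, connect to localhost; it should not ask for a password:
ssh localhost
# the first connection asks to confirm the host key, but there should be no password prompt
exit
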
3. Extract the Hadoop tarball
$ tar zxvf hadoop-2.7.3.tar.gz -C /home/hadoop/
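
A quick sanity check that the archive landed where expected (the path matches the -C target above):
ls /home/hadoop/hadoop-2.7.3
# expect bin/, etc/, sbin/ and share/ among the entries
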
4. Configure Hadoop
Edit hadoop-env.sh and set JAVA_HOME:
export JAVA_HOME=/usr/local/jdk1.7.0_79
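
The JDK path above is only an example; confirm your JDK really lives there (or substitute your own path) before saving:
ls /usr/local/jdk1.7.0_79/bin/java
/usr/local/jdk1.7.0_79/bin/java -version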

Edit core-site.xml; use the following as a reference:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://127.0.0.1:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/hadoop/tmp/</value>
    <description>A base for other temporary directories.</description>
  </property>
</configuration>
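
The directory named in hadoop.tmp.dir is not created automatically; create it and hand it to the user that will run Hadoop (the user name hadoop here is an assumption, substitute your own):
sudo mkdir -p /usr/hadoop/tmp
sudo chown -R hadoop:hadoop /usr/hadoop/tmp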

Edit hdfs-site.xml; use the following as a reference:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

Configure the environment variables with vi ~/.profile; use the following as a reference:
##############
### Env - Hadoop
##############
export HADOOP_HOME=/home/hadoop/hadoop-2.7.3
export PATH=$HADOOP_HOME/bin:$PATH
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop

##############
### Hive
##############
export HIVE_HOME=/home/hadoop/BigData/apache-hive-2.1.1-bin
export PATH=${HIVE_HOME}/bin:$PATH
export hive_dependency=${HIVE_HOME}/conf:${HIVE_HOME}/lib/*:${HIVE_HOME}/hcatalog/share/hcatalog/hive-hcatalog-core-2.1.1.jar

export HBASE_HOME=/home/hadoop/BigData/hbase-1.2.4
export PATH=${HBASE_HOME}/bin:$PATH
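
After saving ~/.profile, reload it in the current shell and confirm that the hadoop command resolves and picks up the configuration:
source ~/.profile
which hadoop
hadoop version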

 

5. Then format HDFS as follows:
hdfs namenode -format
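
If the format succeeds, the NameNode metadata is written under the hadoop.tmp.dir configured earlier (dfs/name is the Hadoop 2.x default subpath); you can then bring up HDFS and check the daemons with jps, roughly as follows:
ls /usr/hadoop/tmp/dfs/name/current
$HADOOP_HOME/sbin/start-dfs.sh
jps
# NameNode, DataNode and SecondaryNameNode should all appear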
