Compiling Hadoop 2.6 from Source
1. Install RHEL6 (CentOS 6.5)
My work test environment is RHEL6 (my PC runs CentOS 6.5); any 6.x release should work, not necessarily 6.5. I downloaded CentOS-6.5-x86_64-bin-DVD1.iso (about 4GB, so allow some download time) from http://mirror.neu.edu.cn/centos/6.5/isos/x86_64/. Note: make sure you take the 64-bit image; production environments are generally 64-bit, and this build environment is 64-bit Linux.
My test machine has 4GB of RAM and a 500GB disk (a VMware VM with 2GB of RAM and 20GB of disk also works); adjust to your own hardware. Note: with too little memory the build is slow, and with too little disk it can run out of space during compilation. Also keep the network connection up and stable; on an unstable network the Maven build may fail repeatedly.
The steps below install the software under /usr/local, and all of the following commands are executed from the /usr/local directory.
Requirements:
* Unix System
* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* zlib headers (if building native code bindings for zlib)
* openssl devel (if compiling native code)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
2. Install the JDK
The build requires a JDK; I used JDK 1.7. Unpack it with:
tar -zxvf jdk-7u45-linux-x64.tar.gz
This creates a directory jdk1.7.0_45. The environment variables below expect /usr/local/jdk, so rename the directory (mv jdk1.7.0_45 jdk) or adjust JAVA_HOME to match.
Run vi /etc/profile and add the following to the file:
JAVA_HOME=/usr/local/jdk
HADOOP_HOME=/usr/local/hadoop
ZOOKEEPER_HOME=/usr/local/zookeeper
HBASE_HOME=/usr/local/hbase
PIG_HOME=/usr/local/pig
SQOOP_HOME=/usr/local/sqoop
HIVE_HOME=/usr/local/hive
FLUME_HOME=/usr/local/flume
MAVEN_HOME=/usr/local/maven
PROTOC_HOME=/usr/local/protoc
ANT_HOME=/usr/local/ant
PATH=.:$JAVA_HOME/bin:$HADOOP_HOME/bin:$ZOOKEEPER_HOME/bin:$HBASE_HOME/bin:$PIG_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$FLUME_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:$PROTOC_HOME/bin:$PATH
CLASSPATH=.:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/jre/lib/rt.jar:$CLASSPATH
export JAVA_HOME
export HADOOP_HOME
export ZOOKEEPER_HOME
export PATH
export CLASSPATH
export HBASE_HOME
export PIG_HOME
export HIVE_HOME
export SQOOP_HOME
export FLUME_HOME
export ANT_HOME
export MAVEN_HOME
export PROTOC_HOME
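For the build itself you mainly need JAVA_HOME, MAVEN_HOME, PROTOC_HOME, and ANT_HOME; the other variables are for components installed later. To apply the changes and confirm the JDK is picked up (Maven, Ant, and protoc are installed in the steps that follow, so their checks come later):
source /etc/profile
java -version
The second command should report java version "1.7.0_45".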
3. Install Maven
The Hadoop source is organized and built with Maven, so Maven must be installed; be sure to pick version 3.0 or later. Some build steps also run Ant tasks, so install Ant yourself as well.
Unpack Maven with:
tar -zxvf apache-maven-3.2.5-bin.tar.gz
This creates a directory apache-maven-3.2.5; rename it to /usr/local/maven to match MAVEN_HOME, then set the environment variables: run vi /etc/profile and edit as in step 2 (Install the JDK).
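If the network is slow or unstable (see the note in step 1), Maven can be pointed at a closer mirror of Maven Central in $MAVEN_HOME/conf/settings.xml or ~/.m2/settings.xml. A minimal sketch; the URL below is a placeholder, substitute a real mirror near you:
<settings>
  <mirrors>
    <mirror>
      <id>nearby-central</id>
      <mirrorOf>central</mirrorOf>
      <name>Hypothetical nearby mirror of Maven Central</name>
      <url>http://maven.example.com/repository/central</url>
    </mirror>
  </mirrors>
</settings>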
4. Install findbugs (optional step)
findbugs is a static-analysis tool that the build runs when generating the documentation. If you do not need to build the documentation, skip this step.
Unpack findbugs with:
tar -zxvf findbugs-3.0.0-dev-20131204-e3cbbd5.tar.gz
This creates a directory findbugs-3.0.0-dev-20131204-e3cbbd5; then set the environment variables by running vi /etc/profile and editing as before.
Note: I did not generate the documentation myself; set this up if you need it.
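If you do build the documentation, the Hadoop build locates findbugs through the FINDBUGS_HOME environment variable. A minimal sketch of the extra /etc/profile entries, assuming you rename the unpacked directory to /usr/local/findbugs:
FINDBUGS_HOME=/usr/local/findbugs
PATH=$FINDBUGS_HOME/bin:$PATH
export FINDBUGS_HOME PATH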
5. Install protoc
Hadoop uses Protocol Buffers for its RPC serialization and requires version 2.5.0, so use protobuf-2.5.0.tar.gz.
Compiling and installing protoc needs a few build tools first; run the following commands in order:
yum install gcc
yum install gcc-c++
yum install make
If the operating system is CentOS 6.5, you must configure a network yum repository first. On RHEL6 you can use the CentOS repositories for the matching version number; other operating systems are untested, so experiment on your own. yum will prompt you to type "y" several times while these commands run.
Then unpack protobuf:
tar -zxvf protobuf-2.5.0.tar.gz
This creates a directory protobuf-2.5.0. Compile and install protobuf with:
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc/
make && make install
When this finishes, the compiled files are under /usr/local/protoc/. Set the environment variables: run vi /etc/profile and edit as in step 2 (Install the JDK).
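Before moving on, it is worth confirming that the right protoc is on the PATH; the Hadoop 2.6 build requires exactly 2.5.0 and aborts on a version mismatch:
protoc --version
This should print libprotoc 2.5.0.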
6. Install the other dependencies
Run the following commands in order:
yum install cmake
yum install openssl-devel
yum install ncurses-devel
Once installed, you are done with this step. Note: if these packages are missing, the Hadoop 2.6 build (in particular the native code) will fail.
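For reference, everything yum installs in steps 5 and 6 can be pulled in with a single command; zlib-devel is added here because the requirements list zlib headers for native builds:
yum -y install gcc gcc-c++ make cmake zlib-devel openssl-devel ncurses-devel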
7. Compile the Hadoop 2.6 source
Download the stable 2.6 release source, hadoop-2.6.0-src.tar.gz, from the Hadoop website.
Unpack it with:
tar -zxvf hadoop-2.6.0-src.tar.gz
This creates a directory hadoop-2.6.0-src. Older source trees (2.2.0) had a bug here that required a manual fix: a missing test dependency in hadoop-auth. To check or apply the fix, go to /usr/local/hadoop-2.6.0-src/hadoop-common-project/hadoop-auth and open pom.xml:
vim pom.xml
and add the following below line 55 (next to the existing jetty dependency):
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
Then save and exit.
On my test machine the pom.xml already contained this dependency, so the bug has likely been fixed in 2.6; check your own copy before editing.
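A quick way to check whether your copy already declares the dependency, run from the hadoop-auth directory:
grep -n "jetty-util" pom.xml
If this prints a matching line, no edit is needed.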
Now change into the directory /usr/local/hadoop-2.6.0-src and run the build.
To create the binary distribution without native code and without documentation:
$ mvn package -Pdist -DskipTests -Dtar
If you want the documentation as well (and configured findbugs in step 4), create the binary distribution with native code and with documentation instead:
$ mvn package -Pdist,native,docs -DskipTests -Dtar
Either command downloads the dependency jars from the internet and then compiles the Hadoop source. If you need anything beyond this, see BUILDING.txt in the source tree.
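If the build fails with an out-of-memory error, BUILDING.txt suggests giving Maven more heap via the MAVEN_OPTS environment variable before running mvn, for example:
export MAVEN_OPTS="-Xms256m -Xmx512m"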
A successful build ends like this (timings are from my machine and will vary):
[INFO] Apache Hadoop Main ................................ SUCCESS [6.936s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [4.928s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [9.399s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.871s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [7.981s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.965s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [39.748s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [11.081s]
[INFO] Apache Hadoop Common .............................. SUCCESS [10:41.466s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [26.346s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.061s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [12:49.368s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [41.896s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [41.043s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [9.650s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.051s]
[INFO] hadoop-yarn ....................................... SUCCESS [1:22.693s]
[INFO] hadoop-yarn-api ................................... SUCCESS [1:20.262s]
[INFO] hadoop-yarn-common ................................ SUCCESS [1:30.530s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.177s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.781s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [40.800s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [6.099s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [37.639s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [4.516s]
[INFO] hadoop-yarn-client ................................ SUCCESS [25.594s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.286s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [10.143s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.119s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [55.812s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [8.749s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.524s]
[INFO] hadoop-yarn-project ............................... SUCCESS [16.641s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [40.796s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [7.628s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [24.066s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [13.243s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [16.670s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [3.787s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [17.012s]
[INFO] hadoop-mapreduce .................................. SUCCESS [6.459s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [12.149s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [15.968s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [5.851s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [18.364s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [14.943s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.648s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [5.763s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [16.289s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.261s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.043s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [56.188s]
[INFO] Apache Hadoop Client .............................. SUCCESS [10.910s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.321s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 40:00.444s
[INFO] Finished at: Thu Dec 26 12:42:24 CST 2013
[INFO] Final Memory: 109M/362M
[INFO] ------------------------------------------------------------------------
When every module above reports SUCCESS, the build succeeded.
The built distribution is under /usr/local/hadoop-2.6.0-src/hadoop-dist/target, as hadoop-2.6.0.tar.gz.
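As a final sanity check, unpack the result and ask it for its version:
tar -zxvf hadoop-2.6.0.tar.gz
cd hadoop-2.6.0
bin/hadoop version
The first line of output should read Hadoop 2.6.0.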