This article walks through configuring LZO compression on Hadoop 2.6.2, a setup that trips up many people in practice. The steps below cover the full process from building the libraries to running a job on LZO-compressed input.
The cluster has three hosts: bi10, bi12, and bi13. All of the following steps are performed on bi10.
Building lzo requires a few dependency packages; if they are already installed, skip this step. First switch to the root user, then run:
yum install gcc gcc-c++ kernel-devel
yum install git
Besides the packages above, you also need a maven environment. Download it, extract the archive, and set the environment variables; it is then ready to use:
wget http://apache.fayea.com/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
tar -xzf apache-maven-3.3.9-bin.tar.gz
Configure the maven environment variables; the maven package is placed under /home/hadoop/work/apache-maven-3.3.9:
[hadoop@bi10 hadoop-2.6.2]$ vim ~/.bash_profile

#init maven environment
export MAVEN_HOME=/home/hadoop/work/apache-maven-3.3.9
export PATH=$PATH:$MAVEN_HOME/bin
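As a quick sanity check (a minimal sketch using the same paths as above), you can replay the two exports in the current shell and confirm that maven's bin directory landed at the end of PATH:

```shell
# Re-apply the ~/.bash_profile additions in the current shell
export MAVEN_HOME=/home/hadoop/work/apache-maven-3.3.9
export PATH=$PATH:$MAVEN_HOME/bin

# The last PATH entry should now be maven's bin directory
echo "$PATH" | tr ':' '\n' | tail -n 1
```

If this prints the maven bin directory, sourcing `~/.bash_profile` worked and `mvn -version` should resolve.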
Download the lzo source package:
[hadoop@bi10 apps]$ wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.09.tar.gz
Extract, build, and install lzo into /usr/local/hadoop/lzo/; switch to root for the install:
[hadoop@bi10 apps]$ tar -xzf lzo-2.09.tar.gz
[hadoop@bi10 apps]$ cd lzo-2.09
[hadoop@bi10 lzo-2.09]$ su root
[root@bi10 lzo-2.09]$ ./configure --enable-shared --prefix=/usr/local/hadoop/lzo/
[root@bi10 lzo-2.09]$ make && make test && make install
Check the install directory:
[hadoop@bi10 lzo-2.09]$ ls /usr/local/hadoop/lzo/
include  lib  share
Download hadoop-lzo:
git clone https://github.com/twitter/hadoop-lzo.git
Set the build environment variables and compile with maven:
[hadoop@bi10 hadoop-lzo]$ export CFLAGS=-m64
[hadoop@bi10 hadoop-lzo]$ export CXXFLAGS=-m64
[hadoop@bi10 hadoop-lzo]$ export C_INCLUDE_PATH=/usr/local/hadoop/lzo/include
[hadoop@bi10 hadoop-lzo]$ export LIBRARY_PATH=/usr/local/hadoop/lzo/lib
[hadoop@bi10 hadoop-lzo]$ mvn clean package -Dmaven.test.skip=true
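Before launching the maven build, it can help to verify that lzo was actually installed where the exports point. This is a hedged helper sketch; the `check_lzo` name is mine, and the paths are the ones used above:

```shell
# Hypothetical preflight helper: confirm the lzo headers and libraries
# exist under the install prefix before running `mvn clean package`.
LZO_HOME=${LZO_HOME:-/usr/local/hadoop/lzo}
check_lzo() {
  if [ -d "$LZO_HOME/include" ] && [ -d "$LZO_HOME/lib" ]; then
    echo "lzo found under $LZO_HOME"
  else
    echo "lzo missing under $LZO_HOME; build and install it first"
    return 1
  fi
}
check_lzo || true
```

If the check fails, revisit the `./configure --prefix=...` step; the maven build's native compilation will otherwise fail with missing `lzo/lzo1x.h` errors.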
Copy the build artifacts into the hadoop installation directory:
[hadoop@bi10 hadoop-lzo]$ tar -cBf - -C target/native/Linux-amd64-64/lib . | tar -xBvf - -C $HADOOP_HOME/lib/native/
[hadoop@bi10 hadoop-lzo]$ cp target/hadoop-lzo-0.4.20-SNAPSHOT.jar $HADOOP_HOME/share/hadoop/common/
[hadoop@bi10 hadoop-lzo]$ scp target/hadoop-lzo-0.4.20-SNAPSHOT.jar bi12:$HADOOP_HOME/share/hadoop/common/
[hadoop@bi10 hadoop-lzo]$ scp target/hadoop-lzo-0.4.20-SNAPSHOT.jar bi13:$HADOOP_HOME/share/hadoop/common/
Copy the build artifacts to the corresponding directories on the other cluster machines. The native directory must be packed into a tarball first, copied to each machine, and then extracted there:
tar -czf hadoop-native.tar.gz $HADOOP_HOME/lib/native/
scp hadoop-native.tar.gz bi12:$HADOOP_HOME/lib
scp hadoop-native.tar.gz bi13:$HADOOP_HOME/lib
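The pack-copy-extract sequence can be wrapped in a loop over the worker hosts. This is a dry-run sketch: it only echoes the commands, and the `ssh ... tar` extraction step is my reading of the "then extract" instruction above (the `distribute_native` name is mine). Remove the `echo`s to actually execute:

```shell
# Dry run: print the distribution commands for each worker host.
HADOOP_HOME=${HADOOP_HOME:-/home/hadoop/work/hadoop-2.6.2}
distribute_native() {
  for host in bi12 bi13; do
    echo "scp hadoop-native.tar.gz $host:$HADOOP_HOME/lib/"
    echo "ssh $host tar -xzf $HADOOP_HOME/lib/hadoop-native.tar.gz -C /"
  done
}
distribute_native
```

Extracting with `-C /` assumes the tarball was created from the absolute path as above (GNU tar strips the leading `/` on create and this restores the members to their original locations); adjust if your tar invocation differs.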
Edit hadoop-env.sh and add one line:
# The lzo library
export LD_LIBRARY_PATH=/usr/local/hadoop/lzo/lib
Edit core-site.xml:
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.BZip2Codec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
Edit mapred-site.xml:
<!-- lzo compression -->
<property>
  <name>mapred.compress.map.output</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compression.codec</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>
<property>
  <name>mapred.child.env</name>
  <value>LD_LIBRARY_PATH=/usr/local/hadoop/lzo/lib</value>
</property>
Copy the three configuration files to the other cluster machines:
scp etc/hadoop/hadoop-env.sh bi12:/home/hadoop/work/hadoop-2.6.2/etc/hadoop/
scp etc/hadoop/hadoop-env.sh bi13:/home/hadoop/work/hadoop-2.6.2/etc/hadoop/
scp etc/hadoop/core-site.xml bi12:/home/hadoop/work/hadoop-2.6.2/etc/hadoop/
scp etc/hadoop/core-site.xml bi13:/home/hadoop/work/hadoop-2.6.2/etc/hadoop/
scp etc/hadoop/mapred-site.xml bi12:/home/hadoop/work/hadoop-2.6.2/etc/hadoop/
scp etc/hadoop/mapred-site.xml bi13:/home/hadoop/work/hadoop-2.6.2/etc/hadoop/
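The six scp commands can equivalently be generated by a nested loop over hosts and files. This is a dry-run sketch (it only echoes the commands; the `push_conf` name is mine) so you can inspect the commands before executing them:

```shell
# Dry run: print one scp command per (host, config file) pair.
CONF_DIR=/home/hadoop/work/hadoop-2.6.2/etc/hadoop
push_conf() {
  for host in bi12 bi13; do
    for f in hadoop-env.sh core-site.xml mapred-site.xml; do
      echo "scp etc/hadoop/$f $host:$CONF_DIR/"
    done
  done
}
push_conf
```

Dropping the `echo` runs the copies for real; the loop form also makes it easy to add hosts later.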
Install lzop (as root):
yum install lzop
Change into the hadoop installation directory and lzo-compress LICENSE.txt; this produces the compressed file LICENSE.txt.lzo:
lzop LICENSE.txt
Upload the compressed file to HDFS:
[hadoop@bi10 hadoop-2.6.2]$ hdfs dfs -mkdir /user/hadoop/wordcount/lzoinput
[hadoop@bi10 hadoop-2.6.2]$ hdfs dfs -put LICENSE.txt.lzo /user/hadoop/wordcount/lzoinput
[hadoop@bi10 hadoop-2.6.2]$ hdfs dfs -ls /user/hadoop/wordcount/lzoinput
Found 1 items
-rw-r--r--   2 hadoop supergroup       7773 2016-02-16 20:59 /user/hadoop/wordcount/lzoinput/LICENSE.txt.lzo
Build an index for the lzo file. The index records the offsets of the compressed blocks, which is what allows MapReduce to split a .lzo file across tasks:
[hadoop@bi10 hadoop-2.6.2]$ hadoop jar ./share/hadoop/common/hadoop-lzo-0.4.20-SNAPSHOT.jar com.hadoop.compression.lzo.DistributedLzoIndexer /user/hadoop/wordcount/lzoinput/
[hadoop@bi10 hadoop-2.6.2]$ hdfs dfs -ls /user/hadoop/wordcount/lzoinput/
Found 2 items
-rw-r--r--   2 hadoop supergroup       7773 2016-02-16 20:59 /user/hadoop/wordcount/lzoinput/LICENSE.txt.lzo
-rw-r--r--   2 hadoop supergroup          8 2016-02-16 21:02 /user/hadoop/wordcount/lzoinput/LICENSE.txt.lzo.index
Run wordcount over the lzo-compressed input. Note that the stock wordcount example uses TextInputFormat, which will decompress the .lzo file via the registered codec but treat it as a single split; to actually split on the index, a job needs hadoop-lzo's LzoTextInputFormat.
hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.2.jar wordcount /user/hadoop/wordcount/lzoinput/ /user/hadoop/wordcount/output2
That concludes the configuration of LZO on hadoop-2.6.2. Thanks for reading.