HBase Snappy Compression - Failed to Create Table, CompressionTest Succeeded
I have been trying to troubleshoot a weird HBase problem related to Snappy compression. Below is a detailed description of everything relevant to this issue:
- Problem description:
When I try to create a table with Snappy compression from the HBase shell (see the attached shell log, captured in debug mode, for details):
hbase(main):001:0> create 't3', { NAME => 'cf1', COMPRESSION => 'SNAPPY' }
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/local/solono/package/local_1/Linux-2.6c2.5-x86_64/Hadoop/Hadoop-2140.0-0/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/local/solono/package/local_1/Linux-2.6c2.5-x86_64/Slf4j-log4j12/Slf4j-log4j12-37.0-0/lib/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1772)
at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1765)
at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1747)
at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1782)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService.callBlockingMethod(MasterProtos.java:40470)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2012)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
at org.apache.hadoop.hbase.ipc.FifoRpcScheduler.run(FifoRpcScheduler.java:73)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
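The key phrase in this error is "previously failed test": the HMaster runs its own Snappy probe once (via CompressionTest) and caches the result statically inside the master JVM, so one failure at startup keeps resurfacing on every create until the master process is restarted, even after the underlying cause is fixed. A minimal sketch of clearing that cached flag and retrying, assuming the stock hbase-daemon.sh script works alongside this solono wrapper:

# A sketch, not verified in this environment: restart the master to clear
# the cached "previously failed" flag, then retry the create.
./bin/hbase-daemon.sh restart master
echo "create 't3', { NAME => 'cf1', COMPRESSION => 'SNAPPY' }" | ./bin/hbase shell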
- Compression test result:
When I run the compression test, it succeeds:
[kalidasyan@TestCluster-hbase-linux-host]/solono/env/TestClusterHBase% ./bin/solono-hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://TestCluster-hadoop-nn2.aka.iad.TestCluster.com:9000/user/kalidasyan/hbase/impressions/00/part-m-00074.gz snappy
/solono/env/TestClusterHBase/bin/hbase-config.sh: line 43: cd: ../../../package/local_1/Linux-2.6c2.5-x86_64/Hbase/Hbase-521.0-0/bin: No such file or directory
2015-10-02 21:36:12,266 INFO [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/local/solono/package/local_1/Linux-2.6c2.5-x86_64/Hadoop/Hadoop-2140.0-0/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/local/solono/package/local_1/Linux-2.6c2.5-x86_64/Slf4j-log4j12/Slf4j-log4j12-37.0-0/lib/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2015-10-02 21:36:13,043 INFO [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2015-10-02 21:36:13,044 INFO [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
2015-10-02 21:36:13,236 INFO [main] compress.CodecPool: Got brand-new compressor [.snappy]
2015-10-02 21:36:13,242 INFO [main] compress.CodecPool: Got brand-new compressor [.snappy]
2015-10-02 21:36:13,457 INFO [main] compress.CodecPool: Got brand-new decompressor [.snappy]
SUCCESS
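Note that this successful run happened in a fresh client-side JVM started by the wrapper script, so it does not prove that the HMaster's JVM can load Snappy. A sketch of repeating the probe with the master's exact native-library path, run on the master host (HBASE_LIBRARY_PATH is honored by stock HBase launch scripts; whether this wrapper forwards it is an assumption):

# Sketch: probe with the same native path the master uses; the file:// URI
# keeps HDFS out of the picture.
HBASE_LIBRARY_PATH=/solono/env/TestClusterHBase/lib/native/Linux-amd64-64 \
  ./bin/solono-hbase org.apache.hadoop.hbase.util.CompressionTest \
  file:///tmp/snappy-probe snappy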
- Native library check result:
When I run the Hadoop native library check for HBase:
[kalidasyan@TestCluster-hbase-linux-host]/solono/env/TestClusterHBase% ./bin/solono-hbase --config ./var/hbase-config org.apache.hadoop.util.NativeLibraryChecker
/solono/env/TestClusterHBase/bin/hbase-config.sh: line 43: cd: ../../../package/local_1/Linux-2.6c2.5-x86_64/Hbase/Hbase-521.0-0/bin: No such file or directory
2015-10-04 23:44:09,747 WARN [main] bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
2015-10-04 23:44:09,750 INFO [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /local/solono/package/local_1/Linux-2.6c2.5-x86_64/Hadoop/Hadoop-2140.0-0/lib/libhadoop.so
zlib: true /lib64/libz.so.1
snappy: true /solono/env/TestClusterHBase/lib/libsnappy.so.1
lz4: true revision:99
bzip2: false
openssl: true /solono/env/TestClusterHBase/lib/libcrypto.so
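Like CompressionTest, NativeLibraryChecker ran in its own short-lived JVM, so these results describe the wrapper's environment rather than the master's. To see what the live HMaster JVM was actually started with, one can read its command line from /proc (stock HBase scripts tag the master with -Dproc_master; under this wrapper that marker is an assumption):

# Sketch: inspect the running HMaster process instead of a fresh JVM.
MASTER_PID=$(pgrep -f 'proc_master' | head -1)
tr '\0' '\n' < /proc/$MASTER_PID/cmdline | grep 'java.library.path'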
I have set the following hbase-site.xml property, and both the HMaster and the RegionServers start and work fine:
<property>
<name>hbase.regionserver.codecs</name>
<value>snappy</value>
</property>
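Since hbase.regionserver.codecs makes a RegionServer refuse to start when a listed codec cannot load, the fact that the RegionServers run means Snappy loads fine on their side; the failure is specific to the master. The original, un-cached error should therefore be in the master's log from startup time. A sketch of digging it out (the log directory is taken from the process arguments shown below; the exact file name for the master daemon may differ):

# Sketch: locate the first, real CompressionTest failure in the master log.
grep -n -B2 -A12 'CompressionTest' \
  /solono/env/TestClusterHBase/var/output/logs/hbase.log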
- The HBase shell process on any of the HBase hosts:
kalidasyan 6942 0.7 0.6 10373900 775808 pts/0 Sl+ 21:32 0:15 /solono/env/TestClusterHBase/jdk/bin/java -Dproc_shell -XX:OnOutOfMemoryError=kill -9 %p -Xmx8192m -Dclient.encoding.override=UTF-8 -Dfile.encoding=UTF-8 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Dsun.net.inetaddr.ttl=600 -Dsun.net.inetaddr.negative.ttl=300 -Dsolono.appgroup=productAds -Dorg.mortbay.util.FileResource.checkAliases=true -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/solono/env/TestClusterHBase/var/output/logsheapdump.hprof -XX:+UseConcMarkSweepGC -Dhbase.ruby.sources=/solono/env/TestClusterHBase/lib/ruby -Dhbase.log.dir=/solono/env/TestClusterHBase/var/output/logs -Dhbase.log.file=hbase.log -Dhbase.home.dir=/solono/env/TestClusterHBase -Dhbase.id.str= -Dhbase.root.logger=INFO,console -Djava.library.path=/solono/env/TestClusterHBase/lib/native/Linux-amd64-64 -Dhbase.security.logger=INFO,NullAppender org.jruby.Main -X+O /solono/env/TestClusterHBase/bin/hirb.rb
The process arguments show the property
"java.library.path=/solono/env/TestClusterHBase/lib/native/Linux-amd64-64"
while "ls -l /solono/env/TestClusterHBase/lib/native/Linux-amd64-64" shows:
lrwxrwxrwx 1 root root 92 Oct 2 21:26 libhadoopsnappy.la -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libhadoopsnappy.la
lrwxrwxrwx 1 root root 92 Oct 2 21:26 libhadoopsnappy.so -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libhadoopsnappy.so
lrwxrwxrwx 1 root root 94 Oct 2 21:26 libhadoopsnappy.so.0 -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libhadoopsnappy.so.0
lrwxrwxrwx 1 root root 98 Oct 2 21:26 libhadoopsnappy.so.0.0.1 -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libhadoopsnappy.so.0.0.1
lrwxrwxrwx 1 root root 86 Oct 2 21:26 libhadoop.so -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libhadoop.so
lrwxrwxrwx 1 root root 92 Oct 2 21:26 libhadoop.so.1.0.0 -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libhadoop.so.1.0.0
lrwxrwxrwx 1 root root 86 Oct 2 21:26 libsnappy.la -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libsnappy.la
lrwxrwxrwx 1 root root 86 Oct 2 21:26 libsnappy.so -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libsnappy.so
lrwxrwxrwx 1 root root 88 Oct 2 21:26 libsnappy.so.1 -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libsnappy.so.1
lrwxrwxrwx 1 root root 92 Oct 2 21:26 libsnappy.so.1.1.4 -> /solono/_env/TestClusterHBase-swit1na.7444503.167194907.744537033/lib/libsnappy.so.1.1.4
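Everything under java.library.path is a symlink into /solono/_env/..., so a dangling link or an unreadable target would make the JVM's library load fail even though the directory listing looks healthy. A quick sketch to rule that out:

# Sketch: "broken symbolic link" in this output means the target is gone.
cd /solono/env/TestClusterHBase/lib/native/Linux-amd64-64
file libsnappy.so libhadoopsnappy.so libhadoop.so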
Can anyone help with this issue? Any suggestion or clue would be greatly appreciated!
Thanks in advance!
Snappy does not work on the RHEL OS here because it depends on another library (Clib) that is not present on RHEL; it works on more recent Linux distributions.
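One way to verify this dependency claim on the affected host is to list libsnappy's runtime dependencies and the installed glibc version; any unresolved entry would confirm a missing system library. A sketch, using the library path already shown in the question:

# Sketch: a "not found" line below is a missing transitive dependency.
ldd /solono/env/TestClusterHBase/lib/libsnappy.so.1
ldd --version | head -1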