CompressionTest failing

CompressionTest failing

dwijesinghe
Hi,

I am trying to install snappy compression for HBase. I believe I have installed the library and wanted to check by using the CompressionTest utility. I am issuing the following command:

bin/hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://C-Master:/usr/local/hadoop/hbase snappy

However, when I run the command I get the following output:

13/11/27 14:29:20 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 0 time(s).
13/11/27 14:29:21 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 1 time(s).
13/11/27 14:29:22 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 2 time(s).
13/11/27 14:29:23 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 3 time(s).
13/11/27 14:29:24 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 4 time(s).
13/11/27 14:29:25 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 5 time(s).
13/11/27 14:29:26 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 6 time(s).
13/11/27 14:29:27 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 7 time(s).
13/11/27 14:29:28 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 8 time(s).
13/11/27 14:29:29 INFO ipc.Client: Retrying connect to server: C-Master/10.171.18.245:8020. Already tried 9 time(s).
Exception in thread "main" java.net.ConnectException: Call to C-Master/10.171.18.245:8020 failed on connection exception: java.net.ConnectException: Connection refused
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1099)
        at org.apache.hadoop.ipc.Client.call(Client.java:1075)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
        at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:136)
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:692)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
        at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1206)
        at org.apache.hadoop.ipc.Client.call(Client.java:1050)
        ... 14 more

My HBase cluster is running without issue, so I am puzzled as to why the test utility is having difficulty connecting to the master node, particularly since I was running the test on the master node itself. Have I missed something or used improper syntax?
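For a "Connection refused" symptom like this, a few quick local checks can narrow things down. The sketch below is illustrative only: the config path is assumed from the `/usr/local/hadoop` in the command above, and it relies on standard Linux tools being present.

```shell
# Hypothetical diagnostics for "Connection refused" on C-Master:8020.
# That error usually means nothing is listening on that port: either the
# NameNode process is down, or it is bound to a different port.
conf=/usr/local/hadoop/conf/core-site.xml

# Which NameNode URI (host and port) is actually configured?
if [ -f "$conf" ]; then
  grep -A1 'fs.default.name' "$conf"
fi

# Is anything listening on 8020 right now? (0 means nothing is.)
listening=$(netstat -tln 2>/dev/null | grep -c ':8020') || true
echo "sockets listening on 8020: $listening"
```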

Any help would be greatly appreciated.

Thank you.

Re: CompressionTest failing

Ted Yu-3
bq. hdfs://C-Master*:*/

Did you actually type the second colon?
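Illustrative aside: that extra colon leaves the URI's port component empty, so the HDFS client falls back to its default port (hence the 8020 in the log). A minimal sketch of how such a URI splits apart, using only shell parameter expansion (nothing HBase-specific):

```shell
# Illustrative only: splitting "hdfs://C-Master:/usr/local/hadoop/hbase"
# the way a URI parser would. The port between ':' and '/' is empty.
uri="hdfs://C-Master:/usr/local/hadoop/hbase"
rest="${uri#hdfs://}"       # C-Master:/usr/local/hadoop/hbase
authority="${rest%%/*}"     # C-Master:   (host plus a trailing colon)
path="/${rest#*/}"          # /usr/local/hadoop/hbase
host="${authority%%:*}"     # C-Master
port="${authority#*:}"      # empty -> client falls back to its default port
echo "host=$host port=[$port] path=$path"
```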


On Wed, Nov 27, 2013 at 6:58 AM, dwijesinghe <[hidden email]> wrote:

Re: CompressionTest failing

dwijesinghe
I tried both with and without (was unsure of the syntax for specifying the hbase path). Same result either way.

Re: CompressionTest failing

Ted Yu-3
The log snippet shows 8020.
Is that the correct port for your NameNode?

BTW '/usr/local/hadoop' seems to indicate a local path.


On Wed, Nov 27, 2013 at 7:15 AM, dwijesinghe <[hidden email]> wrote:

Re: CompressionTest failing

dwijesinghe
Thank you for your reply.

That port is open for access per my cluster's security rules. Is there any further configuration I should do for that port, or should it be using another port?

Also, I was a little confused by the installation instructions. They prefix the path with hdfs:// but say to specify the path to the hadoop/hbase installation, which would be a local path from within the HBase cluster. What path should I be supplying?


Re: CompressionTest failing

Ted Yu-3
Here is the example given in CompressionTest's usage message:

      "For example:\n" +

      "  hbase " + CompressionTest.class + " file:///tmp/testfile gz\n");

CompressionTest would write to this file. This is to verify that
compression works on each node.

You should specify "file:" as scheme.
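A minimal sketch of the corresponding invocation for snappy. The scratch-file name and the relative bin/ path are assumptions; the command is built into a variable here rather than executed, since it needs a node with HBase installed:

```shell
# Sketch only: a file:// target points CompressionTest at the local
# filesystem, so no NameNode connection is needed. The tool creates the
# target file itself.
target="file:///tmp/snappy-testfile"   # any writable scratch path
codec="snappy"
cmd="bin/hbase org.apache.hadoop.hbase.util.CompressionTest $target $codec"
echo "$cmd"   # run this from the HBase install directory on each node
```

If the native snappy library is visible to HBase on that node, the run should end with a SUCCESS message.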


Cheers


On Wed, Nov 27, 2013 at 7:58 AM, dwijesinghe <[hidden email]> wrote:

Re: CompressionTest failing

dwijesinghe
Ted,

Thank you so much for your help. I was able to successfully test my snappy installation by following that example.


Re: CompressionTest failing

Ted Yu-3
You're welcome.

Actually, the ref guide mentions this tool:
http://hbase.apache.org/book.html#compression.test


On Wed, Nov 27, 2013 at 8:52 AM, dwijesinghe <[hidden email]> wrote: