Need help writing data to HBase using an Oozie job


Need help writing data to HBase using an Oozie job

Jeetendra Gangele
Hi All,
I have a job that brings data from MySQL into HBase using Sqoop. The job
runs fine when I run the command from the shell, but it fails when run
through Oozie. It looks like Oozie is not able to read the HBase master
configuration. Any help?

>>> Invoking Sqoop command line now >>>

3168 [main] WARN  org.apache.sqoop.tool.SqoopTool  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
3189 [main] INFO  org.apache.sqoop.Sqoop  - Running Sqoop version: 1.4.6-cdh5.9.1
3263 [main] WARN  org.apache.sqoop.ConnFactory  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
3328 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Preparing to use a MySQL streaming resultset.
3328 [main] INFO  org.apache.sqoop.tool.CodeGenTool  - Beginning code generation
3580 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `tblBDMaster` AS t LIMIT 1
3597 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `tblBDMaster` AS t LIMIT 1
3601 [main] INFO  org.apache.sqoop.orm.CompilationManager  - HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.9.1-1.cdh5.9.1.p0.4/lib/hadoop-mapreduce
5180 [main] INFO  org.apache.sqoop.orm.CompilationManager  - Writing jar file: /tmp/sqoop-yarn/compile/847aa7cceba7c11787ce68af176ac1dc/tblBDMaster.jar
5245 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `tblBDMaster` AS t LIMIT 1
5247 [main] INFO  org.apache.sqoop.tool.ImportTool  - Incremental import based on column `BDMaster_dtmLastModified`
5247 [main] INFO  org.apache.sqoop.tool.ImportTool  - Upper bound value: '2017-05-31 18:10:06.0'
5247 [main] WARN  org.apache.sqoop.manager.MySQLManager  - It looks like you are importing from mysql.
5248 [main] WARN  org.apache.sqoop.manager.MySQLManager  - This transfer can be faster! Use the --direct
5248 [main] WARN  org.apache.sqoop.manager.MySQLManager  - option to exercise a MySQL-specific fast path.
5248 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Setting zero DATETIME behavior to convertToNull (mysql)
5346 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Beginning import of tblBDMaster
5388 [main] WARN  org.apache.sqoop.mapreduce.JobBase  - SQOOP_HOME is unset. May not be able to find all job dependencies.
Heart beat
54328 [main] ERROR org.apache.sqoop.tool.ImportTool  - Encountered IOException running import job: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Wed May 31 18:10:55 IST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68404: row 'transaction:tblBDMaster,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hadoop-node-3.sit.n3b.bookmyshow.org,60020,1496231117907, seqNum=0

    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:286)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:231)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:61)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
    at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
    at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:410)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:420)
    at org.apache.sqoop.mapreduce.HBaseImportJob.jobSetup(HBaseImportJob.java:217)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:271)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:193)
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:176)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:56)
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:231)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1714)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68404: row 'transaction:tblBDMaster,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=hadoop-node-3.sit.n3b.bookmyshow.org,60020,1496231117907, seqNum=0
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
    at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:329)
    at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:402)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:203)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:64)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:381)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:355)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
    ... 4 more
Caused by: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:240)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094)
    at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:394)
    ... 10 more
Caused by: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:225)
    ... 13 more
Caused by: java.lang.ClassNotFoundException: com.yammer.metrics.core.Gauge
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)


Regards
Jeetendra

Re: Need help writing data to HBase using an Oozie job

Ted Yu-3
bq. java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge

Looks like the yammer metrics jar (metrics-core, which provides com.yammer.metrics.core.Gauge) was not on the classpath.

Please check.
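
[Editor's note: since the job works from the shell but not under Oozie, the difference is usually the classpath of the Oozie launcher, which is built from the Oozie sharelib rather than from the local Sqoop/HBase installation. A common fix in this situation is to pull the HBase sharelib into the Sqoop action's classpath. The fragment below is a sketch, assuming a standard Oozie sharelib layout; the property names are standard Oozie, but verify the sharelib contents on your cluster.]

    # job.properties (sketch -- verify against your cluster's sharelib)
    oozie.use.system.libpath=true
    # pull the hbase sharelib (which carries metrics-core) into the
    # sqoop action's classpath, in addition to the sqoop sharelib
    oozie.action.sharelib.for.sqoop=sqoop,hbase

Alternatively, copying the missing metrics-core jar into the workflow's lib/ directory on HDFS achieves the same effect for a single workflow.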

On Sun, Jun 4, 2017 at 11:43 PM, Jeetendra Gangele <[hidden email]>
wrote:

> Hi All,
> I have a job with bring the data from MYSQL to Hbase using Sqoop. This job
> is running fine when I run command using shell,But its giving me error when
> run through oozie looks like oozie is not able to read the Hbase master
> configuration, any help?
>
> [quoted Sqoop log and stack trace trimmed]