Need some help? LongWritable cast to Text Error


William Clay Moody
Newbie Question:

I am trying to write a MapReduce job that reads a text file from HDFS and loads it into an HBase table. I am getting the following run-time error:

08/06/05 17:24:08 INFO mapred.JobClient: Task Id : task_200806041619_0009_m_000001_0, Status : FAILED
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text
        at edu.ncsu.csc.osr.anansi.BuildTableMR$MyMapper.map(BuildTableMR.java:22)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:208)
        at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2084)

Line 22 of BuildTableMR.java is:

public static class MyMapper extends MapReduceBase implements Mapper<Text, Text, Text, Text> {

and the map function is as follows:

public void map(Text key, Text value, OutputCollector<Text, Text> output, Reporter reporter) throws IOException {

LongWritable is not used anywhere in the program. I am not sure what is going on, since the Mapper interface allows any WritableComparable key and Writable value, and Text qualifies as both. I have seen the same use of Mapper in other sample programs.

The entire code is pasted below. Thanks in advance, Clay

       
public class BuildTableMR extends Configured{

        public static class MyMapper extends MapReduceBase implements Mapper<Text, Text, Text, Text> {
                private Text src = new Text();
                private Text dest = new Text();

                public void map(Text key, Text value, OutputCollector<Text, Text> output, Reporter reporter) throws IOException {
                        String line = value.toString();
                        if (line.length() == 0) {
                                return;
                        }
                        String [] splits = value.toString().split(" ");
                        src.set(splits[0]);
                        dest.set(splits[1]);
                        output.collect(src, dest);
                }
        }

        private static class MyReducer extends TableReduce<Text, Text> {

                private static final Text fam = new Text("out_edge:");
                private final static IntWritable one = new IntWritable(1);
                private MapWritable map = new MapWritable();
                public void reduce(Text key, Iterator<Text> values,
                                OutputCollector<Text, MapWritable> output, Reporter reporter) throws IOException {
                        Text qual = values.next();
                        map.clear();
                        Text col = new Text (fam.toString() + qual.toString());
                        map.put(col, new ImmutableBytesWritable(qual.getBytes()));

                        output.collect(new Text(key.toString()), map);
                }
        }

        public static void main(String[] args) throws IOException {

                String filename = args[0];
                int mapTasks = 2;
                int reduceTasks = 2;
                JobConf conf = new JobConf(BuildTableMR.class);
                conf.setJobName("BuildTable");
                TableReduce.initJob("anansi", MyReducer.class, conf);
                conf.setNumMapTasks(mapTasks);
                conf.setNumReduceTasks(reduceTasks);
                conf.setInputPath(new Path(filename));
                conf.setMapperClass(MyMapper.class);
                conf.setCombinerClass(MyReducer.class);
                conf.setReducerClass(MyReducer.class);
                JobClient.runJob(conf);
        }
}


William Clay Moody
[hidden email]




Re: Need some help? LongWritable cast to Text Error

Todd Lipcon
Hi William,

With TextInputFormat (the default input format), the keys handed to your mapper are LongWritables holding the byte offset of each line within the file, and the values are Text holding the line itself. Your mapper therefore needs to be declared Mapper<LongWritable, Text, Text, Text>; declaring the key as Text is what triggers the ClassCastException.
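
Here is a minimal sketch of the corrected mapper, assuming your input lines stay space-separated "src dest" pairs and that you keep the org.apache.hadoop.mapred API from your post. The only real change is the key type; the byte-offset key is simply ignored:

        // requires an extra import: org.apache.hadoop.io.LongWritable
        public static class MyMapper extends MapReduceBase
                        implements Mapper<LongWritable, Text, Text, Text> {
                private final Text src = new Text();
                private final Text dest = new Text();

                public void map(LongWritable key, Text value,
                                OutputCollector<Text, Text> output,
                                Reporter reporter) throws IOException {
                        // key is the byte offset of this line in the file; not needed here
                        String line = value.toString();
                        if (line.length() == 0) {
                                return;
                        }
                        // split "src dest" on the space and emit the edge
                        String[] splits = line.split(" ");
                        src.set(splits[0]);
                        dest.set(splits[1]);
                        output.collect(src, dest);
                }
        }

If you would rather keep Text keys in the mapper, KeyValueTextInputFormat splits each line on the first tab into a Text key and a Text value, but since your edges are space-separated the signature change above is probably the simpler route.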

Hope that helps
-Todd
