2016-05-11

While using Mahout and Hadoop to compute recommendations, I ran into a problem: wrong value class: org.apache.mahout.math.VarLongWritable is not class org.apache.mahout.math.VectorWritable

The error message is:

Error: java.io.IOException: wrong value class: org.apache.mahout.math.VarLongWritable is not class org.apache.mahout.math.VectorWritable 
    at org.apache.hadoop.io.SequenceFile$Writer.append(SequenceFile.java:1378) 
    at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat$1.write(SequenceFileOutputFormat.java:83) 
    at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:558) 
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89) 
    at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:105) 
    at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:150) 
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171) 
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627) 
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389) 
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at javax.security.auth.Subject.doAs(Subject.java:415) 
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158) 

And the main function is:

job.setInputFormatClass(TextInputFormat.class);

job.setMapperClass(FilesToItemPrefsMapper.class);
job.setMapOutputKeyClass(VarLongWritable.class);
job.setMapOutputValueClass(VarLongWritable.class);

job.setReducerClass(FileToUserVectorReducer.class);
job.setOutputKeyClass(VarLongWritable.class);
job.setOutputValueClass(VectorWritable.class);

job.setOutputFormatClass(SequenceFileOutputFormat.class);
SequenceFileOutputFormat.setOutputCompressionType(job, CompressionType.NONE);

The mapper is:

public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    String line = value.toString();
    Matcher m = NUMBERS.matcher(line);
    m.find();
    VarLongWritable userID = new VarLongWritable(Long.parseLong(m.group()));
    VarLongWritable itemID = new VarLongWritable();
    while (m.find()) {
        itemID.set(Long.parseLong(m.group()));
        context.write(userID, itemID);
    }
}

And the reducer is:

public class FileToUserVectorReducer
        extends Reducer<VarLongWritable, VarLongWritable, VarLongWritable, VectorWritable> {
    public void reducer(VarLongWritable userID, Iterable<VarLongWritable> itemPrefs, Context context)
            throws IOException, InterruptedException {
        Vector userVector = new RandomAccessSparseVector(Integer.MAX_VALUE, 100);
        for (VarLongWritable itemPref : itemPrefs) {
            userVector.set((int) itemPref.get(), 1.0f);
        }
        context.write(userID, new VectorWritable(userVector));
    }
}

I thought the reducer's value class, VectorWritable, was set by job.setOutputValueClass(VectorWritable.class). If so, why does it throw this error?

Answer


The problem lies in the reducer: reducer(...) should be reduce(...). Because the misspelled method does not override Reducer.reduce(), Hadoop falls back to the base class's default reduce(), which passes each VarLongWritable input value through to the output unchanged; those values do not match the declared output value class VectorWritable, which triggers the IOException. The corrected reducer:

public class FileToUserVectorReducer
        extends Reducer<VarLongWritable, VarLongWritable, VarLongWritable, VectorWritable> {
    @Override
    public void reduce(VarLongWritable userID, Iterable<VarLongWritable> itemPrefs, Context context)
            throws IOException, InterruptedException {
        Vector userVector = new RandomAccessSparseVector(Integer.MAX_VALUE, 100);
        for (VarLongWritable itemPref : itemPrefs) {
            userVector.set((int) itemPref.get(), 1.0f);
        }
        context.write(userID, new VectorWritable(userVector));
    }
}

@Override is very helpful here: had I used @Override, the compiler would have flagged the misspelled method at compile time. I initially thought it was unnecessary, but this experience shows its value.
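The pitfall can be reproduced without Hadoop. The classes below (Base, TypoSub, FixedSub are hypothetical names for illustration) show why a misspelled method silently declares a new method instead of overriding, while @Override turns the same typo into a compile error:

```java
// Sketch of the @Override pitfall, assuming no Hadoop dependencies.
class Base {
    public String reduce() { return "Base.reduce"; }
}

class TypoSub extends Base {
    // Typo: "reducer" declares a NEW method; reduce() is NOT overridden.
    // Adding @Override here would make this a compile-time error.
    public String reducer() { return "TypoSub.reducer"; }
}

class FixedSub extends Base {
    @Override  // the compiler verifies this really overrides a superclass method
    public String reduce() { return "FixedSub.reduce"; }
}

public class OverrideDemo {
    public static void main(String[] args) {
        Base typo = new TypoSub();
        Base fixed = new FixedSub();
        // The typo'd subclass still runs the base implementation --
        // exactly how Hadoop ended up calling the default Reducer.reduce().
        System.out.println(typo.reduce());   // prints "Base.reduce"
        System.out.println(fixed.reduce());  // prints "FixedSub.reduce"
    }
}
```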