Spark combineByKey Java lambda expression

I want to use lambda functions to compute the average per key of a JavaPairRDD&lt;Integer, Double&gt; pairs. To that end I wrote the following code:

java.util.function.Function<Double, Tuple2<Double, Integer>> createAcc =
    x -> new Tuple2<Double, Integer>(x, 1);

BiFunction<Tuple2<Double, Integer>, Double, Tuple2<Double, Integer>> addAndCount =
    (Tuple2<Double, Integer> x, Double y) -> { return new Tuple2(x._1() + y, x._2() + 1); };

BiFunction<Tuple2<Double, Integer>, Tuple2<Double, Integer>, Tuple2<Double, Integer>> combine =
    (Tuple2<Double, Integer> x, Tuple2<Double, Integer> y) -> { return new Tuple2(x._1() + y._1(), x._2() + y._2()); };

JavaPairRDD<Integer, Tuple2<Double, Integer>> avgCounts =
    pairs.combineByKey(createAcc, addAndCount, combine);

However, Eclipse displays this error:

The method combineByKey(Function<Double,C>, Function2<C,Double,C>, Function2<C,C,C>) in the type JavaPairRDD<Integer,Double> is not applicable for the arguments (Function<Double,Tuple2<Double,Integer>>, 
BiFunction<Tuple2<Double,Integer>,Double,Tuple2<Double,Integer>>, BiFunction<Tuple2<Double,Integer>,Tuple2<Double,Integer>,Tuple2<Double,Integer>>) 

Try replacing java.util.function.BiFunction with org.apache.spark.api.java.function.Function2 –

Thank you! That solved the problem. – Wassim

Answer

The combineByKey method expects Spark's own functional interfaces: org.apache.spark.api.java.function.Function instead of java.util.function.Function for the combiner, and org.apache.spark.api.java.function.Function2 instead of java.util.function.BiFunction for the two merge functions. So you would write:

org.apache.spark.api.java.function.Function<Double, Tuple2<Double, Integer>> createAcc =
    x -> new Tuple2<>(x, 1);

Function2<Tuple2<Double, Integer>, Double, Tuple2<Double, Integer>> addAndCount =
    (x, y) -> new Tuple2<>(x._1() + y, x._2() + 1);

Function2<Tuple2<Double, Integer>, Tuple2<Double, Integer>, Tuple2<Double, Integer>> combine =
    (x, y) -> new Tuple2<>(x._1() + y._1(), x._2() + y._2());

JavaPairRDD<Integer, Tuple2<Double, Integer>> avgCounts =
    pairs.combineByKey(createAcc, addAndCount, combine);
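
To get from the (sum, count) accumulators to the actual per-key averages, one more mapValues step is needed. This is a minimal sketch on top of the answer's avgCounts; the variable name averages is illustrative, not part of the original answer:

// Hypothetical follow-up step: divide each key's running sum by its count.
// avgCounts holds (sum, count) tuples per key, as built above.
JavaPairRDD<Integer, Double> averages =
    avgCounts.mapValues(sumCount -> sumCount._1() / sumCount._2());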

Please also update the first one, it should be org.apache.spark.api.java.function.Function – Wassim

Yes, java.util.function is wrong; it needs to be updated –
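
For reference, here is a self-contained sketch of the whole per-key average computation with the corrected imports. The class name, local master setting, sample data, and the final mapValues step are illustrative assumptions, not part of the original thread:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;

import scala.Tuple2;

public class AverageByKey {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("AverageByKey").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Illustrative sample data: (key, value) pairs.
        JavaPairRDD<Integer, Double> pairs = sc.parallelizePairs(Arrays.asList(
            new Tuple2<>(1, 2.0), new Tuple2<>(1, 4.0), new Tuple2<>(2, 6.0)));

        // The accumulator is a (sum, count) tuple; note the Spark functional interfaces.
        Function<Double, Tuple2<Double, Integer>> createAcc =
            x -> new Tuple2<>(x, 1);
        Function2<Tuple2<Double, Integer>, Double, Tuple2<Double, Integer>> addAndCount =
            (acc, y) -> new Tuple2<>(acc._1() + y, acc._2() + 1);
        Function2<Tuple2<Double, Integer>, Tuple2<Double, Integer>, Tuple2<Double, Integer>> combine =
            (a, b) -> new Tuple2<>(a._1() + b._1(), a._2() + b._2());

        JavaPairRDD<Integer, Tuple2<Double, Integer>> sumCounts =
            pairs.combineByKey(createAcc, addAndCount, combine);

        // Divide each sum by its count to obtain the average per key.
        JavaPairRDD<Integer, Double> averages =
            sumCounts.mapValues(sumCount -> sumCount._1() / sumCount._2());

        for (Tuple2<Integer, Double> t : averages.collect()) {
            System.out.println(t._1() + " -> " + t._2());
        }

        sc.stop();
    }
}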