Project setup: SBT cannot import the Kafka encoder/decoder classes
- 1 producer - serialises objects to bytes and sends them to Kafka
- 1 Spark consumer - should use the DefaultDecoder in package kafka.serializer to consume the bytes
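For context, this is the shape of the producer/consumer contract: the producer turns an object into an Array[Byte] payload, and DefaultDecoder on the consumer side just hands those bytes back unchanged. A minimal, Kafka-free sketch using plain Java serialization (the names SerDe/toBytes/fromBytes are illustrative, not from the project):

```scala
import java.io._

// Illustrative helpers: object -> Array[Byte] on the producer side,
// Array[Byte] -> object on the consumer side (after DefaultDecoder
// has passed the raw bytes through untouched).
object SerDe {
  def toBytes(obj: Serializable): Array[Byte] = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    try oos.writeObject(obj) finally oos.close()
    bos.toByteArray
  }

  def fromBytes[T](bytes: Array[Byte]): T = {
    val ois = new ObjectInputStream(new ByteArrayInputStream(bytes))
    try ois.readObject().asInstanceOf[T] finally ois.close()
  }
}

object Demo extends App {
  val bytes = SerDe.toBytes("hello")          // producer side
  val back  = SerDe.fromBytes[String](bytes)  // consumer side
  assert(back == "hello")
  println(back) // prints "hello"
}
```

(The real project uses Avro rather than Java serialization, but the byte-array round trip through Kafka is the same.)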
Problem:
- SBT imports the correct libraries (kafka-clients + kafka_2.10), but cannot find the classes in the kafka_2.10 jar.
- It appears to be looking under the wrong path (org.apache.spark.streaming.kafka instead of org.apache.kafka).
Error message:
[error] object serializer is not a member of package org.apache.spark.streaming.kafka
[error] import kafka.serializer.DefaultDecoder
sbt dependency tree:
[info] +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
[info] | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S] <-- **DefaultDecoder is in here,
but SBT can't find it (kafka.serializer.DefaultDecoder)**
[info] | | +-org.apache.kafka:kafka-clients:0.8.2.1
build.sbt:
lazy val commonSettings = Seq(
  organization := "org.RssReaderDemo",
  version := "0.1.0",
  scalaVersion := "2.10.6"
)
resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"
val spark = "org.apache.spark" % "spark-core_2.10" % "1.6.1"
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"
// Needed to be able to parse the generated avro JSON schema
val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"
val avro = "org.apache.avro" % "avro" % "1.8.0"
lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    libraryDependencies += spark,
    libraryDependencies += sparkStreaming,
    libraryDependencies += sparkStreamKafka,
    libraryDependencies += jacksonMapperAsl,
    libraryDependencies += scalactic,
    libraryDependencies += scalatest,
    libraryDependencies += avro
  )
- Code that triggers the error in SBT: import kafka.serializer.DefaultDecoder – mds91
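The lookup path in the error is what Scala's relative imports produce: once a wildcard import of org.apache.spark.streaming._ is in scope, the bare name kafka resolves to Spark's kafka subpackage, so kafka.serializer is searched under org.apache.spark.streaming.kafka. A minimal, Spark-free sketch of that mechanism (both packages below are stand-ins defined inside the snippet, so it compiles without any jars):

```scala
// Stand-ins: a top-level `kafka.serializer` package (where DefaultDecoder
// really lives) and Spark's `org.apache.spark.streaming.kafka` package.
package kafka.serializer { class DefaultDecoder }
package org.apache.spark.streaming.kafka { object KafkaUtils }

package demo {
  import org.apache.spark.streaming._ // `kafka` now names Spark's subpackage

  // import kafka.serializer.DefaultDecoder
  //   -> "object serializer is not a member of package
  //       org.apache.spark.streaming.kafka"

  // `_root_` forces resolution from the top-level package instead:
  import _root_.kafka.serializer.DefaultDecoder

  object Demo extends App {
    println((new DefaultDecoder).getClass.getName) // prints "kafka.serializer.DefaultDecoder"
  }
}
```

Reordering the imports (the kafka.serializer import before any Spark wildcard import) avoids the clash as well, but the `_root_` prefix makes the intent explicit.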