
Spark + Cassandra Integration

I am running Cassandra 3.7 and Spark 1.6.2 on Hadoop 2.7.2. I am trying to integrate Cassandra with Spark, following the instructions at http://www.planetcassandra.org/blog/kindling-an-introduction-to-spark-with-cassandra/.

Following those instructions, I cloned the spark-cassandra-connector project:

git clone https://github.com/datastax/spark-cassandra-connector 

Once the project was cloned, I changed my working directory to spark-cassandra-connector:

cd spark-cassandra-connector/ 

In the spark-cassandra-connector directory, I ran the following command to build the connector:

./sbt/sbt assembly 

However, the build fails with the following errors:

[error] 4 not found 
[error] /.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar 
[error] /.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar 
[error] /.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar 
[error] /.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar 

The full output is as follows:

Attempting to fetch sbt 
Launching sbt from sbt/sbt-launch-0.13.9.jar 
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=350m; support was removed in 8.0 
[info] Loading project definition from /usr/local/spark-cassandra-connector/project 
[info] Updating {file:/usr/local/spark-cassandra-connector/project/}spark-cassandra-connector-build... 
[info] Resolving org.eclipse.jgit#org.eclipse.jgit.archive;3.7.0.201502260915-r ... 
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-c5d1b95fdcc1e1007740ffbecf4eb07 ... 
[info] Resolving org.fusesource.jansi#jansi;1.4 ... 
[info] Done updating. 
[warn] There may be incompatibilities among your library dependencies. 
[warn] Here are some of the libraries that were evicted: 
[warn] * net.virtual-void:sbt-dependency-graph:0.7.4 -> 0.8.2 
[warn] Run 'evicted' to see detailed eviction warnings 
[info] Compiling 6 Scala sources to /usr/local/spark-cassandra-connector/project/target/scala-2.10/sbt-0.13/classes... 
[warn] there were 4 feature warning(s); re-run with -feature for details 
[warn] one warning found 
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases 
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots 

    Scala: 2.10.6 [To build against Scala 2.11 use '-Dscala-2.11=true'] 
    Scala Binary: 2.10 
    Java: target=1.7 user=1.8.0_77 
    Cassandra version for testing: 3.0.2 [can be overridden by specifying '-Dtest.cassandra.version=<version>'] 

[info] Set current project to root (in build file:/usr/local/spark-cassandra-connector/) 
[warn] Credentials file /.ivy2/.credentials does not exist 
[info] Formatting 46 Scala sources {file:/usr/local/spark-cassandra-connector/}spark-cassandra-connector(test) ... 
[info] Formatting 158 Scala sources {file:/usr/local/spark-cassandra-connector/}spark-cassandra-connector(compile) ... 
[warn] Credentials file /.ivy2/.credentials does not exist 
[warn] Credentials file /.ivy2/.credentials does not exist 
[info] Updating com.datastax.spark:spark-cassandra-connector_2.10:1.6.0-20-g4edacfb 
[info] Reformatted 44 Scala sources {file:/usr/local/spark-cassandra-connector/}spark-cassandra-connector(test). 
[info] Resolved com.datastax.spark:spark-cassandra-connector_2.10:1.6.0-20-g4edacfb dependencies 
[info] Fetching artifacts of com.datastax.spark:spark-cassandra-connector_2.10:1.6.0-20-g4edacfb 
[info] Reformatted 143 Scala sources {file:/usr/local/spark-cassandra-connector/}spark-cassandra-connector(compile). 
[info] Fetched artifacts of com.datastax.spark:spark-cassandra-connector_2.10:1.6.0-20-g4edacfb 
[error] 4 not found 
[error] /.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar 
[error] /.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar 
[error] /.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar 
[error] /.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar 
java.lang.Exception: Encountered 4 errors (see above messages) 
    at coursier.Tasks$$anonfun$updateTask$1.coursier$Tasks$$anonfun$$report$1(Tasks.scala:710) 
    at coursier.Tasks$$anonfun$updateTask$1$$anonfun$apply$49.apply(Tasks.scala:748) 
    at coursier.Tasks$$anonfun$updateTask$1$$anonfun$apply$49.apply(Tasks.scala:748) 
    at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189) 
    at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91) 
    at coursier.Tasks$$anonfun$updateTask$1.apply(Tasks.scala:741) 
    at coursier.Tasks$$anonfun$updateTask$1.apply(Tasks.scala:517) 
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47) 
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40) 
    at sbt.std.Transform$$anon$4.work(System.scala:63) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226) 
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17) 
    at sbt.Execute.work(Execute.scala:235) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226) 
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159) 
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at java.lang.Thread.run(Thread.java:745) 
[error] (spark-cassandra-connector/*:update) java.lang.Exception: Encountered 4 errors (see above messages) 
[error] Total time: 3 s, completed 19-Jul-2016 5:03:12 AM 

What am I doing wrong? Could someone help?

Answer


This happens because your .m2 directory may contain different (stale) versions of these libraries, so they are not downloaded again. Just delete those libraries from .m2 and retry.
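A minimal sketch of that fix, assuming the four jars named in the [error] lines above are the stale cache entries (the exact paths are taken from the log; a more drastic option is simply deleting everything under ~/.m2):

```shell
#!/bin/sh
# Remove only the four cached Maven artifacts the build log reports
# as "not found", instead of wiping the whole ~/.m2 repository.
for artifact in \
    com/google/code/findbugs/jsr305/1.3.9 \
    commons-cli/commons-cli/1.2 \
    commons-httpclient/commons-httpclient/3.1 \
    log4j/log4j/1.2.17
do
    # rm -rf is a no-op if the directory is already absent
    rm -rf "$HOME/.m2/repository/$artifact"
done
```

After clearing the cached entries, re-running `./sbt/sbt assembly` should fetch fresh copies of the jars.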


This fixed the error; I just deleted everything in ~/.m2. Thanks! – phcaze
