
NoHostAvailableException when running Spark with DSE

I am using DataStax Enterprise (DSE) 5.1 for Cassandra on my local machine. I started Cassandra with

dse cassandra -k 

Cassandra started fine. Next I tried to start the Spark shell with

dse spark 

but it gives me the following error:

2017-08-21 12:11:25 [main] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to start or submit Spark application because of com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) - see details in the log file(s): /home/rsahukar/.spark-shell.log 
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) 
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:75) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:28) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:28) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:236) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:59) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:42) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.dse.DefaultDseSession.execute(DefaultDseSession.java:232) ~[dse-java-driver-core-1.2.2.jar:na] 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131] 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131] 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131] 
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131] 
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.sun.proxy.$Proxy6.execute(Unknown Source) ~[na:na] 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131] 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131] 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131] 
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131] 
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.sun.proxy.$Proxy7.execute(Unknown Source) ~[na:na] 
    at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:42) ~[dse-core-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:54) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2] 
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:112) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:145) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:44) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2] 
    at scala.util.Try$.apply(Try.scala:192) ~[scala-library-2.11.11.jar:na] 
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26) ~[dse-spark-5.1.2.jar:5.1.2] 
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25) ~[dse-spark-5.1.2.jar:5.1.2] 
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:152) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:151) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:79) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:68) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106) ~[dse-spark-5.1.2.jar:5.1.2] 
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.2.jar:5.1.2] 
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) 
    at com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:204) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler.access$1000(RequestHandler.java:40) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.findNextHostAndQuery(RequestHandler.java:268) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler.startNewExecution(RequestHandler.java:108) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.RequestHandler.sendRequest(RequestHandler.java:88) ~[dse-java-driver-core-1.2.2.jar:na] 
    at com.datastax.driver.core.SessionManager.executeAsync(SessionManager.java:124) ~[dse-java-driver-core-1.2.2.jar:na] 
    ... 43 common frames omitted 
2017-08-21 12:11:25 [Thread-1] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to cancel delegation token 

Below is the dsetool ring output:

$ dsetool ring 
Address   DC     Rack   Workload    Graph Status State Load    Owns     Token          Health [0,1] 
127.0.0.1  Analytics   rack1  Analytics(SM)  no  Up  Normal 189.19 KiB  ?     5643405743002698980       0.50   

Can anyone help me?


Check whether Cassandra has actually started, using 'nodetool status' or 'dsetool ring' –


It has started. I have added the dsetool ring output to the question – rahul

Answer


I finally found my mistake. I was running Cassandra in local mode, and this was my Spark conf file (spark-defaults.conf) before the change:

.... 
spark.cassandra.connection.local_dc  localhost 
spark.cassandra.connection.host   localhost 
.... 

Note the spark.cassandra.connection.local_dc value. Since I was running in local mode, I assumed its value should also be localhost. But it has to be the DC name, i.e. what dsetool ring reports.
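
As a quick cross-check (a sketch, assuming cqlsh is available on the node), the DC name can also be read from Cassandra's system.local table; on this setup it should print Analytics:

$ cqlsh -e "SELECT data_center FROM system.local;" 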

Below is my dsetool ring output:

$ dsetool ring 
Address   DC     Rack   Workload    Graph Status State Load    Owns     Token          Health [0,1] 
127.0.0.1  Analytics   rack1  Analytics(SM)  no  Up  Normal 189.19 KiB  ?     5643405743002698980       0.50   

As we can see above, the DC value is Analytics, so I had to put that same value into the Spark conf file. Below is the config after the change:

spark.cassandra.connection.local_dc  Analytics 
spark.cassandra.connection.host   localhost 
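
For a one-off test (an assumption here: dse spark forwards standard Spark options such as --conf, the way spark-submit does), the same setting could also be overridden per invocation instead of editing spark-defaults.conf:

# hypothetical one-off override; leaves spark-defaults.conf untouched 
$ dse spark --conf spark.cassandra.connection.local_dc=Analytics 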