
I am trying to get spark-shell to work with YARN (i.e. set up/run Spark (spark-shell) in yarn-client mode), but when I run the command below:

spark-shell \ 
    --master yarn \ 
    --deploy-mode client \ 
    --driver-memory 1g \ 
    --executor-memory 1g \ 
    --executor-cores 1 

The stack trace I get looks as if the context does not have the Spark libraries in it when running in client mode:

17/02/07 01:52:41 ERROR spark.SparkContext: Error initializing SparkContext. 
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master. 
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85) 
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:509) 
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860) 
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) 
    at $line3.$read$$iw$$iw.<init>(<console>:15) 
    at $line3.$read$$iw.<init>(<console>:42) 
    at $line3.$read.<init>(<console>:44) 
    at $line3.$read$.<init>(<console>:48) 
    at $line3.$read$.<clinit>(<console>) 
    at $line3.$eval$.$print$lzycompute(<console>:7) 
    at $line3.$eval$.$print(<console>:6) 
    at $line3.$eval.$print(<console>) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) 
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637) 
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) 
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
    at org.apache.spark.repl.Main$.doMain(Main.scala:68) 
    at org.apache.spark.repl.Main$.main(Main.scala:51) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
17/02/07 01:52:42 ERROR client.TransportClient: Failed to send RPC 7605279734171920371 to /172.17.0.2:35136: java.nio.channels.ClosedChannelException 
java.nio.channels.ClosedChannelException 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source) 
17/02/07 01:52:42 ERROR cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map()) to AM was unsuccessful 
java.io.IOException: Failed to send RPC 7605279734171920371 to /172.17.0.2:35136: java.nio.channels.ClosedChannelException 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:249) 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:233) 
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:514) 
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:488) 
    at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34) 
    at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:438) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140) 
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.nio.channels.ClosedChannelException 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source) 
17/02/07 01:52:42 ERROR util.Utils: Uncaught exception in thread main 
org.apache.spark.SparkException: Exception thrown in awaitResult 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167) 
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.requestTotalExecutors(CoarseGrainedSchedulerBackend.scala:512) 
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend.stop(YarnSchedulerBackend.scala:93) 
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:151) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:467) 
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1588) 
    at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1826) 
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1283) 
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1825) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:587) 
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860) 
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) 
    at $line3.$read$$iw$$iw.<init>(<console>:15) 
    at $line3.$read$$iw.<init>(<console>:42) 
    at $line3.$read.<init>(<console>:44) 
    at $line3.$read$.<init>(<console>:48) 
    at $line3.$read$.<clinit>(<console>) 
    at $line3.$eval$.$print$lzycompute(<console>:7) 
    at $line3.$eval$.$print(<console>:6) 
    at $line3.$eval.$print(<console>) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786) 
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637) 
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31) 
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19) 
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569) 
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565) 
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807) 
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681) 
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37) 
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214) 
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37) 
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909) 
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97) 
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909) 
    at org.apache.spark.repl.Main$.doMain(Main.scala:68) 
    at org.apache.spark.repl.Main$.main(Main.scala:51) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
Caused by: java.io.IOException: Failed to send RPC 7605279734171920371 to /172.17.0.2:35136: java.nio.channels.ClosedChannelException 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:249) 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:233) 
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:514) 
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:488) 
    at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34) 
    at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:438) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140) 
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.nio.channels.ClosedChannelException 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source) 
17/02/07 01:52:42 WARN metrics.MetricsSystem: Stopping a MetricsSystem that is not running 
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master. 
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85) 
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62) 
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156) 
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:509) 
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868) 
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860) 
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95) 
    ... 47 elided 
<console>:14: error: not found: value spark 
     import spark.implicits._ 
      ^
<console>:14: error: not found: value spark 
     import spark.sql 
      ^

To me this looks more like a YARN problem. I tried adding $SPARK_HOME/jars to the yarn.application.classpath configuration inside yarn-site.xml, but that does not seem to work either.
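
Roughly, what I added looked like the following (a sketch only: the Hadoop entries are the stock defaults, and /usr/local/spark/jars stands in for my actual $SPARK_HOME/jars path):

    <property>
        <name>yarn.application.classpath</name>
        <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/share/hadoop/common/*,$HADOOP_COMMON_HOME/share/hadoop/common/lib/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,$HADOOP_YARN_HOME/share/hadoop/yarn/*,$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*,/usr/local/spark/jars/*</value>
    </property>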

How can I get this working with spark-shell and YARN? I have tried to find an official guide on dealing with this problem, but could not find one.


Did you mean '--master yarn --deploy-mode client'? –


yes, I tried that as well with no luck; I will update the question –

Answer

0

It is actually the other way around from how I described it: the SparkContext could not be created because of the connection failure caused by the YARN + Java 8 issue (https://issues.apache.org/jira/browse/YARN-4714).

The change is to add the following section to yarn-site.xml:

<property> 
    <name>yarn.nodemanager.pmem-check-enabled</name> 
    <value>false</value> 
</property> 

<property> 
    <name>yarn.nodemanager.vmem-check-enabled</name> 
    <value>false</value> 
</property> 
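
These are NodeManager settings, so yarn-site.xml has to be updated on the NodeManager hosts and the NodeManagers restarted before the change takes effect; on a plain Apache Hadoop installation that is roughly:

    # restart YARN so the NodeManagers pick up the new yarn-site.xml
    $HADOOP_HOME/sbin/stop-yarn.sh
    $HADOOP_HOME/sbin/start-yarn.sh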

Did that solve your problem? I have exactly the same issue, but this change did not help me. – Alessandro


This change solved my problem, but I am not entirely sure why. Please follow the JIRA ticket above to see whether a fix there might help. –


Is it necessary to restart YARN or any other service after changing yarn-site.xml? – xhudik

1

I fixed this "ERROR spark.SparkContext: Error initializing SparkContext."

My cluster runs Spark 1.6.2 and Hadoop 2.6. I had the same problem when running spark-shell --master yarn-client.

  1. Copy spark-assembly-1.6.2-hadoop2.6.0.jar from the local filesystem to HDFS, e.g. to hdfs://master:9000/spark/spark-assembly-1.6.2-hadoop2.6.0.jar

  2. In spark-defaults.conf, add the parameter spark.yarn.jars hdfs://master:9000/spark/spark-assembly-1.6.2-hadoop2.6.0.jar

That fixed the problem.
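
Spelled out as shell commands, the two steps look roughly like this (a sketch only: the HDFS URI hdfs://master:9000 and the local location of the assembly jar are specific to my cluster and will differ elsewhere):

    # 1. copy the Spark assembly jar from the local filesystem to HDFS
    hdfs dfs -mkdir -p hdfs://master:9000/spark
    hdfs dfs -put /usr/local/spark/lib/spark-assembly-1.6.2-hadoop2.6.0.jar hdfs://master:9000/spark/

    # 2. add the parameter to $SPARK_HOME/conf/spark-defaults.conf
    echo "spark.yarn.jars hdfs://master:9000/spark/spark-assembly-1.6.2-hadoop2.6.0.jar" >> $SPARK_HOME/conf/spark-defaults.conf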