
I am trying to build Spark so that I can run the example programs, but it does not seem to work. This is what happens when I try to run a sample program in Spark:

[email protected]:/usr/local/spark$ ./bin/run-example SparkPi 10 
    Failed to find Spark examples assembly in /usr/local/spark/lib or /usr/local/spark/examples/target 
    You need to build Spark before running this program 
    [email protected]:/usr/local/spark$ sudo build/mvn -e -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package 
    Using `mvn` from path: /usr/bin/mvn 
    Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0 
    [INFO] Error stacktraces are turned on. 
    [INFO] Scanning for projects... 
    [INFO] 

[INFO] Reactor Build Order: 
[INFO] 
[INFO] Spark Project Parent POM 
[INFO] Spark Project Test Tags 
[INFO] Spark Project Launcher 
[INFO] Spark Project Networking 
[INFO] Spark Project Shuffle Streaming Service 
[INFO] Spark Project Unsafe 
[INFO] Spark Project Core 
[INFO] Spark Project Bagel 
[INFO] Spark Project GraphX 
[INFO] Spark Project Streaming 
[INFO] Spark Project Catalyst 
[INFO] Spark Project SQL 
[INFO] Spark Project ML Library 
[INFO] Spark Project Tools 
[INFO] Spark Project Hive 
[INFO] Spark Project Docker Integration Tests 
[INFO] Spark Project REPL 
[INFO] Spark Project YARN Shuffle Service 
[INFO] Spark Project YARN 
[INFO] Spark Project Assembly 
[INFO] Spark Project External Twitter 
[INFO] Spark Project External Flume Sink 
[INFO] Spark Project External Flume 
[INFO] Spark Project External Flume Assembly 
[INFO] Spark Project External MQTT 
[INFO] Spark Project External MQTT Assembly 
[INFO] Spark Project External ZeroMQ 
[INFO] Spark Project External Kafka 
[INFO] Spark Project Examples 
[INFO] Spark Project External Kafka Assembly 
[INFO]                   
[INFO] ------------------------------------------------------------------------ 
[INFO] Building Spark Project Parent POM 1.6.1 
[INFO] ------------------------------------------------------------------------ 
[INFO] 
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 --- 
[INFO] Deleting /usr/local/spark/target 
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-parent_2.10 --- 
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-parent_2.10 --- 
[INFO] Add Source directory: /usr/local/spark/src/main/scala 
[INFO] Add Test Source directory: /usr/local/spark/src/test/scala 
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-parent_2.10 --- 
[INFO] Dependencies classpath: 
/root/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar 
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-parent_2.10 --- 
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-parent_2.10 --- 
[INFO] No sources to compile 
[INFO] 
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-parent_2.10 --- 
[INFO] Executing tasks 

main: 
    [mkdir] Created dir: /usr/local/spark/target/tmp 
[INFO] Executed tasks 
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-parent_2.10 --- 
[INFO] No sources to compile 
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default) @ spark-parent_2.10 --- 
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-parent_2.10 --- 
[INFO] Tests are skipped. 
[INFO] 
[INFO] --- maven-jar-plugin:2.6:test-jar (prepare-test-jar) @ spark-parent_2.10 --- 
[INFO] Building jar: /usr/local/spark/target/spark-parent_2.10-1.6.1-tests.jar 
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ spark-parent_2.10 --- 
[INFO] 
[INFO] --- maven-shade-plugin:2.4.1:shade (default) @ spark-parent_2.10 --- 
[INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar. 
[INFO] Replacing original artifact with shaded artifact. 
[INFO] 
[INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @ spark-parent_2.10 --- 
[INFO] 
[INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @ spark-parent_2.10 --- 
[INFO]                   
[INFO] ------------------------------------------------------------------------ 
[INFO] Building Spark Project Test Tags 1.6.1 
[INFO] ------------------------------------------------------------------------ 
[INFO] 
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-test-tags_2.10 --- 
[INFO] Deleting /usr/local/spark/tags/target 
[INFO] 
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-test-tags_2.10 --- 
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-test-tags_2.10 --- 
[INFO] Add Source directory: /usr/local/spark/tags/src/main/scala 
[INFO] Add Test Source directory: /usr/local/spark/tags/src/test/scala 
[INFO] 
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-test-tags_2.10 --- 
[INFO] Dependencies classpath: 
/root/.m2/repository/org/scala-lang/scala-reflect/2.10.5/scala-reflect-2.10.5.jar:/root/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/root/.m2/repository/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.jar:/root/.m2/repository/org/scalatest/scalatest_2.10/2.2.1/scalatest_2.10-2.2.1.jar 
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-test-tags_2.10 --- 
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-test-tags_2.10 --- 
[INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /usr/local/spark/tags/src/main/resources 
[INFO] Copying 3 resources 
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-test-tags_2.10 --- 
[INFO] Using zinc server for incremental compilation 
[error] Required file not found: scala-compiler-2.10.5.jar 
[error] See zinc -help for information about locating necessary files 
[INFO] ------------------------------------------------------------------------ 
[INFO] Reactor Summary: 
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [ 3.566 s] 
[INFO] Spark Project Test Tags ............................ FAILURE [ 0.466 s] 
[INFO] Spark Project Launcher ............................. SKIPPED 
[INFO] Spark Project Networking ........................... SKIPPED 
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED 
[INFO] Spark Project Unsafe ............................... SKIPPED 
[INFO] Spark Project Core ................................. SKIPPED 
[INFO] Spark Project Bagel ................................ SKIPPED 
[INFO] Spark Project GraphX ............................... SKIPPED 
[INFO] Spark Project Streaming ............................ SKIPPED 
[INFO] Spark Project Catalyst ............................. SKIPPED 
[INFO] Spark Project SQL .................................. SKIPPED 
[INFO] Spark Project ML Library ........................... SKIPPED 
[INFO] Spark Project Tools ................................ SKIPPED 
[INFO] Spark Project Hive ................................. SKIPPED 
[INFO] Spark Project Docker Integration Tests ............. SKIPPED 
[INFO] Spark Project REPL ................................. SKIPPED 
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED 
[INFO] Spark Project YARN ................................. SKIPPED 
[INFO] Spark Project Assembly ............................. SKIPPED 
[INFO] Spark Project External Twitter ..................... SKIPPED 
[INFO] Spark Project External Flume Sink .................. SKIPPED 
[INFO] Spark Project External Flume ....................... SKIPPED 
[INFO] Spark Project External Flume Assembly .............. SKIPPED 
[INFO] Spark Project External MQTT ........................ SKIPPED 
[INFO] Spark Project External MQTT Assembly ............... SKIPPED 
[INFO] Spark Project External ZeroMQ ...................... SKIPPED 
[INFO] Spark Project External Kafka ....................... SKIPPED 
[INFO] Spark Project Examples ............................. SKIPPED 
[INFO] Spark Project External Kafka Assembly .............. SKIPPED 
[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 5.128 s 
[INFO] Finished at: 2016-04-02T22:46:45+05:30 
[INFO] Final Memory: 38M/223M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-test-tags_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1] 
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-test-tags_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:224) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) 
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116) 
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80) 
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51) 
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128) 
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307) 
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193) 
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106) 
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862) 
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286) 
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:197) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356) 
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. 
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:145) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208) 
    ... 20 more 
Caused by: Compile failed via zinc server 
    at sbt_inc.SbtIncrementalCompiler.zincCompile(SbtIncrementalCompiler.java:136) 
    at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:86) 
    at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303) 
    at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119) 
    at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99) 
    at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482) 
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134) 
    ... 21 more 
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging. 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles: 
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException 
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command 
[ERROR] mvn <goals> -rf :spark-test-tags_2.10 

I have the latest Maven:

[email protected]:/usr/local$ mvn -version 
Apache Maven 3.3.3 
Maven home: /usr/share/maven 
Java version: 1.8.0_77, vendor: Oracle Corporation 
Java home: /home/ankit/Downloads/jdk1.8.0_77/jre 
Default locale: en_IN, platform encoding: UTF-8 

These are the paths set in my bashrc file:

export JAVA_HOME=/home/ankit/Downloads/jdk1.8.0_77 

export HADOOP_HOME=/home/ankit/Downloads/hadoop 
export HADOOP_MAPRED_HOME=$HADOOP_HOME 
export HADOOP_COMMON_HOME=$HADOOP_HOME 
export HADOOP_HDFS_HOME=$HADOOP_HOME 
export YARN_HOME=$HADOOP_HOME 
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native 
#export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin 
export HADOOP_INSTALL=$HADOOP_HOME 
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native" 
export CASSANDRA_HOME =$CASSANDRA_HOME:/home/hduser_/cassandra 
#export PATH = $PATH:$CASSANDRA_HOME/bin 
export SCALA_HOME = $SCALA_HOME:/usr/local/scala 
export PATH = $SCALA_HOME/bin:$PATH 

I am new to SOF; could someone please advise?


It says "Required file not found: scala-compiler-2.10.5.jar". Are you sure mvn was able to download it? Could you check that? – evgenii


@fathersson How do I check that? –


You could look in your local Maven repo. It usually lives under .m2 in your home directory; just use the find utility to search for it. – evgenii
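
For example, a quick check along these lines (assuming the default local repository under ~/.m2, which matches the /root/.m2/repository paths printed in the build output above) would show whether the compiler jar was actually downloaded:

    # look for the Scala compiler jar in the local Maven repository
    find ~/.m2/repository -name "scala-compiler-2.10.5.jar"
    # no output means the jar is missing, i.e. the download failed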

Answer


With Maven 3, if a download failed and you have since fixed the cause (for example by uploading the JAR to a repository), the failure is cached. To force a refresh, add -U to the command line. Try the build again and let me know how it goes.

Since your build has already failed once, you need to force the refresh with Maven 3. The command should be (note the -U option):

mvn -U -DskipTests clean package 
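
If you keep the same profiles and Hadoop version as your original command above, the refreshed build would presumably be the same flags with -U added:

sudo build/mvn -U -e -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package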