2017-12-26

Spark2 data load issue: HiveSessionState

I am running into the following problem when reading data with Spark 2 in cluster mode: "java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':". After a lot of googling I am still completely clueless about this issue. Please help.

The code I ran:

val spark = SparkSession.builder.getOrCreate()

val lines: Dataset[String] = spark.read.textFile("/data/sample/abc.csv")

The exception is thrown by the line above.

Full stack trace of the exception:

ERROR yarn.ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState': 
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState': 
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981) 
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110) 
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109) 
    at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:549) 
    at org.apache.spark.sql.SparkSession.read(SparkSession.scala:605) 
    at com.abcd.Learning$.main(Learning.scala:26) 
    at com.abcd.Learning.main(Learning.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:646) 
Caused by: java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978) 
    ... 11 more 
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog': 
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169) 
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86) 
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101) 
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101) 
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100) 
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157) 
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32) 
    ... 16 more 
Caused by: java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166) 
    ... 24 more 
Caused by: java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264) 
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:353) 
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:257) 
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66) 
    ... 29 more 
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:548) 
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188) 
    ... 37 more 
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199) 
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2685) 
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2705) 
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:97) 
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2748) 
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2730) 
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:385) 
    at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:356) 
    at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:666) 
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:593) 
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:526) 
    ... 38 more 
Caused by: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found 
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105) 
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197) 
    ... 48 more 
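
The innermost Caused by is the real problem: Hadoop resolves the filesystem implementation class configured for it (typically via an fs.*.impl entry in core-site.xml) by name through reflection, and here that name is a Pepperdata wrapper class that is not on the application's classpath. A minimal sketch of just that failure mechanism, in plain Scala with no Spark or Hadoop required (the class name is copied from the trace above):

```scala
// Hadoop's Configuration.getClassByName ultimately does a reflective
// lookup like this one. If the configured class is not on the classpath,
// it fails with the same ClassNotFoundException seen in the trace.
object MissingClassDemo {
  def main(args: Array[String]): Unit = {
    val configured = "com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper"
    try {
      Class.forName(configured)
      println("class found on classpath")
    } catch {
      case e: ClassNotFoundException =>
        println(s"ClassNotFoundException: ${e.getMessage}")
    }
  }
}

MissingClassDemo.main(Array.empty)
```

So the fix is either to put the jar containing that class on the driver/executor classpath, or to remove the Pepperdata override from the cluster's Hadoop configuration.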

Have you found a solution to the above problem? – yAsH

Answer


A solution similar to the one given here worked for me.

I did the following:

  • zipped the Spark jars directory here: /usr/local/Cellar/apache-spark/2.1.0/libexec/jars, and named it spark-jars.zip
  • copied spark-jars.zip to HDFS: $ hdfs dfs -copyFromLocal /usr/local/Cellar/apache-spark/2.1.0/libexec/spark-jars.zip hdfs:/user/<username>/
  • passed the spark-jars.zip location in the configuration when running the Spark job: $ HADOOP_CONF_DIR=/Users/<username>/hadoop_conf spark-submit --conf spark.yarn.archive=hdfs:/user/<username>/spark-jars.zip --conf spark.dynamicAllocation.enabled=true --conf spark.shuffle.service.enabled=true --class "com.<whatever>.<package>" --master yarn --deploy-mode cluster --queue online1 --driver-memory 3G --executor-memory 3G ./build/libs/<main class>.jar
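
The three steps above can be sketched as one script. This is only an outline under the assumptions of this answer (Homebrew Spark 2.1.0 install path, YARN cluster mode); the `<username>`, `<whatever>`, and `<main class>` placeholders must be filled in for your environment:

```shell
# 1. Zip the jars shipped with the local Spark install and name it spark-jars.zip.
cd /usr/local/Cellar/apache-spark/2.1.0/libexec
zip -r spark-jars.zip jars

# 2. Copy the archive to HDFS so YARN containers can fetch it.
hdfs dfs -copyFromLocal spark-jars.zip hdfs:/user/<username>/

# 3. Point spark.yarn.archive at the HDFS copy when submitting the job,
#    so executors use these jars instead of whatever the cluster provides.
HADOOP_CONF_DIR=/Users/<username>/hadoop_conf spark-submit \
  --conf spark.yarn.archive=hdfs:/user/<username>/spark-jars.zip \
  --master yarn --deploy-mode cluster \
  --class "com.<whatever>.<package>" \
  ./build/libs/<main class>.jar
```

Setting spark.yarn.archive this way makes the job self-contained with respect to Spark's own jars, which sidesteps classpath surprises injected by cluster-side configuration.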

Thanks for your answer. But in my case the problem was different: some configuration had been tampered with, which corrupted the whole environment. It has now been resolved. –
